Thursday, March 22, 2012

ASP Memory Crash

Hi all,
I am running Reporting Services on a dedicated server with 2 GB of memory.
Whenever a report is run that contains over 90,000 records, the server
crashes and restarts. We have looked at the reports and are able to reduce
the criteria range to make smaller reports, but that affects the role the
system was designed to play in my organisation. If anyone has come across
these memory crashes and found ways of resolving them, or tried approaches
that did not work (including increasing the memory), I would greatly
appreciate a reply. We are willing and able to throw another 2 GB at the
problem, but are concerned this may only be a short-term solution and that
the problem will recur when the report size reaches 180,000 records.
Any assistance you may be able to provide would be greatly appreciated.
|||If you mean that the result set has 90,000 or 180,000 records, then you
have the wrong product. RS is not designed to generate reports that run to
1,500+ pages. Rendering is done in RAM, so there is a direct correlation
between the number of records and the amount of RAM used. If the rendering
output is Excel or PDF, the amount of RAM consumed is even higher. If the
destination is another program, there are better ways to do this. I know
that sometimes people want to get a large number of rows into Excel for
further analysis, but it would be better to use DTS and get the data out to
CSV for them. It is a much, much faster process.
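DTS would be the packaged way to do that; as a rough sketch of the same idea,
bcp can dump a query straight to CSV, here wrapped in xp_cmdshell (the server
name, table, and output path below are made up, and this assumes xp_cmdshell
is available):

    -- One-off extract to CSV, bypassing report rendering entirely.
    -- bcp -c writes character data, -t, sets a comma delimiter,
    -- -T uses a trusted (Windows) connection.
    EXEC master..xp_cmdshell
        'bcp "SELECT OrderID, OrderDate, Amount FROM SalesDB.dbo.Orders" queryout C:\exports\orders.csv -c -t, -S MYSERVER -T';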
If the output is not that many records, but you bring over that many because
you are using filters, then try to move away from filters. A filter brings
over all the data and then filters it; use a query parameter instead so the
restriction happens at the source.
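For instance, rather than a dataset that pulls everything and lets a report
filter discard most of it, push the restriction into the WHERE clause (the
table and column names here are made up):

    -- Filter approach: every row crosses the wire, then RS throws most away
    SELECT OrderID, Region, Amount FROM dbo.Orders

    -- Parameter approach: only the requested rows ever leave SQL Server;
    -- @Region is mapped to a report parameter in the dataset definition
    SELECT OrderID, Region, Amount
    FROM dbo.Orders
    WHERE Region = @Region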
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in message
news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> Hi all,
> I am running reporting services on a dedicated server with 2GB of memory,
> when ever a report is run that contains over 90,000 records the server
> crashes and restarts. We have looked at the reports, and are to reduce the
> criteria range to make smaller reports, but it then affects role the
system
> was designed to play in my organisation. If anyone has come across these
> memory crashes and come across any ways of resolving them or ways that
> haven't resolved them, including an increase in the memory size, it would
be
> greatly appreciated if you could reply. We are willing and able to throw
> another 2GB at the problem, but are concerned this may only be a short
term
> solution and the problem will reoccur when the reporting size reaches
180,000
> records.
> Any assistance you may be able to provide would be greatly appreciated|||You might try CSV and see if it takes up less memory than Excel.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
news:u5P0mDYUFHA.2124@.TK2MSFTNGP14.phx.gbl...
> If you mean that the result set has 90,000 records or 180,000 records then
> you have the wrong product. RS is not designed to generate reports that
are
> 1500+ pages. Rendering is done in RAM so there is a direct correlation to
> between number of records and the amount of RAM used. If the rendering
> output is Excel or PDF then the amount of RAM consumed is even more. If
the
> destination is another program then there are better ways to do this. I
know
> that sometimes people are wanting to get a large amount of rows into Excel
> for further analysis but it would be better to be using DTS and getting
the
> data out in CSV for them. Much much faster process.
> If the output is not that many records but you have that many because you
> are using filters then try to move away from filters. Filters brings over
> all the data and then filters it. Use a query parameter instead.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in
message
> news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> > Hi all,
> >
> > I am running reporting services on a dedicated server with 2GB of
memory,
> > when ever a report is run that contains over 90,000 records the server
> > crashes and restarts. We have looked at the reports, and are to reduce
the
> > criteria range to make smaller reports, but it then affects role the
> system
> > was designed to play in my organisation. If anyone has come across
these
> > memory crashes and come across any ways of resolving them or ways that
> > haven't resolved them, including an increase in the memory size, it
would
> be
> > greatly appreciated if you could reply. We are willing and able to
throw
> > another 2GB at the problem, but are concerned this may only be a short
> term
> > solution and the problem will reoccur when the reporting size reaches
> 180,000
> > records.
> >
> > Any assistance you may be able to provide would be greatly appreciated
>|||Bruce,
Thanks for your responses.
Do you know if there is a way to detect that a report is going to generate
over 200 pages and return a message to the user along the lines of 'This
report may crash the server, are you sure you want to continue?'? From your
responses, and others I have read, memory and RS limitations are the only
issues I am encountering, not programming issues, so if we confirm with
users before they run large reports it could solve our problem, unless they
select 'Yes' of course.
Any further assistance you may be able to provide would be greatly
appreciated.
"Bruce L-C [MVP]" wrote:
> You might try CSV and see if it takes up less memory than Excel.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
> news:u5P0mDYUFHA.2124@.TK2MSFTNGP14.phx.gbl...
> > If you mean that the result set has 90,000 records or 180,000 records then
> > you have the wrong product. RS is not designed to generate reports that
> are
> > 1500+ pages. Rendering is done in RAM so there is a direct correlation to
> > between number of records and the amount of RAM used. If the rendering
> > output is Excel or PDF then the amount of RAM consumed is even more. If
> the
> > destination is another program then there are better ways to do this. I
> know
> > that sometimes people are wanting to get a large amount of rows into Excel
> > for further analysis but it would be better to be using DTS and getting
> the
> > data out in CSV for them. Much much faster process.
> >
> > If the output is not that many records but you have that many because you
> > are using filters then try to move away from filters. Filters brings over
> > all the data and then filters it. Use a query parameter instead.
> >
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in
> message
> > news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> > > Hi all,
> > >
> > > I am running reporting services on a dedicated server with 2GB of
> memory,
> > > when ever a report is run that contains over 90,000 records the server
> > > crashes and restarts. We have looked at the reports, and are to reduce
> the
> > > criteria range to make smaller reports, but it then affects role the
> > system
> > > was designed to play in my organisation. If anyone has come across
> these
> > > memory crashes and come across any ways of resolving them or ways that
> > > haven't resolved them, including an increase in the memory size, it
> would
> > be
> > > greatly appreciated if you could reply. We are willing and able to
> throw
> > > another 2GB at the problem, but are concerned this may only be a short
> > term
> > > solution and the problem will reoccur when the reporting size reaches
> > 180,000
> > > records.
> > >
> > > Any assistance you may be able to provide would be greatly appreciated
> >
> >
>
>|||What you could do is have an intermediary report that does a count and then
provides the appropriate message and link. You can hide the real report from
the user in list view so they have to go through this. Also you could use
jump to url and render to Excel.
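The gatekeeper dataset could be as small as a single count, something like
this (the table, columns, and the rows-per-page estimate are all made up):

    -- Dataset for the intermediary report: how many rows would the real
    -- report pull for these parameters?
    SELECT COUNT(*) AS RowTotal
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @StartDate AND @EndDate

    -- In the report, show the warning (and hide the link to the real report)
    -- when RowTotal implies 200+ pages, e.g. roughly RowTotal >= 12000 if the
    -- layout fits about 60 rows per page.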
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in message
news:29D8180F-C3A4-414B-BE75-1FE9E908AA0F@.microsoft.com...
> Bruce,
> Thanks for your responses.
> Do you know if there is away to detect if the report is going to generate
> over 200 pages and return a message to the user to say something like
'This
> report may crash the server, are you sure you want to continue' ? From
your
> responses, and others I have read about memory, and RS limitations are the
> only issue I am encountering, not programming issues, so if we confirm
with
> users before they run large reports it could solve our problems, unless
they
> select 'Yes' of course.
> Any further assistance you may be able to provide would be greatly
> appreciated.
> "Bruce L-C [MVP]" wrote:
> > You might try CSV and see if it takes up less memory than Excel.
> >
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
> > news:u5P0mDYUFHA.2124@.TK2MSFTNGP14.phx.gbl...
> > > If you mean that the result set has 90,000 records or 180,000 records
then
> > > you have the wrong product. RS is not designed to generate reports
that
> > are
> > > 1500+ pages. Rendering is done in RAM so there is a direct correlation
to
> > > between number of records and the amount of RAM used. If the rendering
> > > output is Excel or PDF then the amount of RAM consumed is even more.
If
> > the
> > > destination is another program then there are better ways to do this.
I
> > know
> > > that sometimes people are wanting to get a large amount of rows into
Excel
> > > for further analysis but it would be better to be using DTS and
getting
> > the
> > > data out in CSV for them. Much much faster process.
> > >
> > > If the output is not that many records but you have that many because
you
> > > are using filters then try to move away from filters. Filters brings
over
> > > all the data and then filters it. Use a query parameter instead.
> > >
> > >
> > > --
> > > Bruce Loehle-Conger
> > > MVP SQL Server Reporting Services
> > >
> > > "William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in
> > message
> > > news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> > > > Hi all,
> > > >
> > > > I am running reporting services on a dedicated server with 2GB of
> > memory,
> > > > when ever a report is run that contains over 90,000 records the
server
> > > > crashes and restarts. We have looked at the reports, and are to
reduce
> > the
> > > > criteria range to make smaller reports, but it then affects role the
> > > system
> > > > was designed to play in my organisation. If anyone has come across
> > these
> > > > memory crashes and come across any ways of resolving them or ways
that
> > > > haven't resolved them, including an increase in the memory size, it
> > would
> > > be
> > > > greatly appreciated if you could reply. We are willing and able to
> > throw
> > > > another 2GB at the problem, but are concerned this may only be a
short
> > > term
> > > > solution and the problem will reoccur when the reporting size
reaches
> > > 180,000
> > > > records.
> > > >
> > > > Any assistance you may be able to provide would be greatly
appreciated
> > >
> > >
> >
> >
> >|||Thanks for the feedback Bruce I will give it a go !|||Hi There,
We are having a problem listing subscriptions in Report Manager and are
getting an "OutOfMemory" exception.
Our application has an event-based subscription management system, and
every time an event fires on the application side, it creates a one-off
subscription on Reporting Services. So over time the number of subscriptions
on the server has grown, and one particular report now has about 8,000+
subscriptions.
So when we try to manage the subscriptions (to delete the irrelevant ones)
for that particular report in the RS Report Manager, the system throws an
OutOfMemory exception. I think Report Manager internally calls the
"ListSubscriptions" method (as explained in KB 840709) and cannot cope with
that many.
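For reference, the scale of the problem shows up if you query the
ReportServer catalog directly; this is unsupported, and the schema below is
assumed from SSRS 2000/2005, but it avoids going through Report Manager at
all:

    -- Unsupported peek at the ReportServer database: subscriptions per
    -- report, to find the items Report Manager cannot list without running
    -- out of memory
    SELECT c.[Path], COUNT(*) AS SubscriptionCount
    FROM dbo.Subscriptions AS s
    JOIN dbo.[Catalog] AS c ON c.ItemID = s.Report_OID
    GROUP BY c.[Path]
    ORDER BY SubscriptionCount DESC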
I also looked at Knowledge Base article 840709 and increased the
"MemoryLimit" setting in RSReportServer.config, but it made no difference.
By the way, our server has 2 GB of RAM and we use custom authentication on
Reporting Services.
I posted this question here because I thought the problem is similar to
what you were discussing (a memory-management issue).
Any suggestions are appreciated.
Regards
Raj Chidipudi
"William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in message
news:29D8180F-C3A4-414B-BE75-1FE9E908AA0F@.microsoft.com...
> Bruce,
> Thanks for your responses.
> Do you know if there is away to detect if the report is going to generate
> over 200 pages and return a message to the user to say something like
'This
> report may crash the server, are you sure you want to continue' ? From
your
> responses, and others I have read about memory, and RS limitations are the
> only issue I am encountering, not programming issues, so if we confirm
with
> users before they run large reports it could solve our problems, unless
they
> select 'Yes' of course.
> Any further assistance you may be able to provide would be greatly
> appreciated.
> "Bruce L-C [MVP]" wrote:
> > You might try CSV and see if it takes up less memory than Excel.
> >
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
> > news:u5P0mDYUFHA.2124@.TK2MSFTNGP14.phx.gbl...
> > > If you mean that the result set has 90,000 records or 180,000 records
then
> > > you have the wrong product. RS is not designed to generate reports
that
> > are
> > > 1500+ pages. Rendering is done in RAM so there is a direct correlation
to
> > > between number of records and the amount of RAM used. If the rendering
> > > output is Excel or PDF then the amount of RAM consumed is even more.
If
> > the
> > > destination is another program then there are better ways to do this.
I
> > know
> > > that sometimes people are wanting to get a large amount of rows into
Excel
> > > for further analysis but it would be better to be using DTS and
getting
> > the
> > > data out in CSV for them. Much much faster process.
> > >
> > > If the output is not that many records but you have that many because
you
> > > are using filters then try to move away from filters. Filters brings
over
> > > all the data and then filters it. Use a query parameter instead.
> > >
> > >
> > > --
> > > Bruce Loehle-Conger
> > > MVP SQL Server Reporting Services
> > >
> > > "William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in
> > message
> > > news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> > > > Hi all,
> > > >
> > > > I am running reporting services on a dedicated server with 2GB of
> > memory,
> > > > when ever a report is run that contains over 90,000 records the
server
> > > > crashes and restarts. We have looked at the reports, and are to
reduce
> > the
> > > > criteria range to make smaller reports, but it then affects role the
> > > system
> > > > was designed to play in my organisation. If anyone has come across
> > these
> > > > memory crashes and come across any ways of resolving them or ways
that
> > > > haven't resolved them, including an increase in the memory size, it
> > would
> > > be
> > > > greatly appreciated if you could reply. We are willing and able to
> > throw
> > > > another 2GB at the problem, but are concerned this may only be a
short
> > > term
> > > > solution and the problem will reoccur when the reporting size
reaches
> > > 180,000
> > > > records.
> > > >
> > > > Any assistance you may be able to provide would be greatly
appreciated
> > >
> > >
> >
> >
> >|||I have exact same problem. My report is about 28000 rows and it takes about
15 minutes to render and end user gets frustrated and he tries to "End Task"
the browser - every thing hangs up. I can't put drill down, etc., so that
is not my option. Is there any way I can stream data, instead of wait to
retreive all rows from SQL? - just like SQL Query analyzer window - it starts
producing results as soon as you execute the query. This is really a big
issue in our organization. Because of this, people started to hate RS. I
need some kind of solution asap. Please help.
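Would requesting the CSV rendering through URL access, along these lines, be
a reasonable workaround, so the browser just downloads the file instead of
waiting on the viewer? (The server name and report path here are made up.)

    http://myserver/ReportServer?/Sales/LargeOrderReport&rs:Command=Render&rs:Format=CSV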
"Bruce L-C [MVP]" wrote:
> You might try CSV and see if it takes up less memory than Excel.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "Bruce L-C [MVP]" <bruce_lcNOSPAM@.hotmail.com> wrote in message
> news:u5P0mDYUFHA.2124@.TK2MSFTNGP14.phx.gbl...
> > If you mean that the result set has 90,000 records or 180,000 records then
> > you have the wrong product. RS is not designed to generate reports that
> are
> > 1500+ pages. Rendering is done in RAM so there is a direct correlation to
> > between number of records and the amount of RAM used. If the rendering
> > output is Excel or PDF then the amount of RAM consumed is even more. If
> the
> > destination is another program then there are better ways to do this. I
> know
> > that sometimes people are wanting to get a large amount of rows into Excel
> > for further analysis but it would be better to be using DTS and getting
> the
> > data out in CSV for them. Much much faster process.
> >
> > If the output is not that many records but you have that many because you
> > are using filters then try to move away from filters. Filters brings over
> > all the data and then filters it. Use a query parameter instead.
> >
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "William Foster" <WilliamFoster@.discussions.microsoft.com> wrote in
> message
> > news:57A0B71F-F342-45E4-B6A7-885577B71E96@.microsoft.com...
> > > Hi all,
> > >
> > > I am running reporting services on a dedicated server with 2GB of
> memory,
> > > when ever a report is run that contains over 90,000 records the server
> > > crashes and restarts. We have looked at the reports, and are to reduce
> the
> > > criteria range to make smaller reports, but it then affects role the
> > system
> > > was designed to play in my organisation. If anyone has come across
> these
> > > memory crashes and come across any ways of resolving them or ways that
> > > haven't resolved them, including an increase in the memory size, it
> would
> > be
> > > greatly appreciated if you could reply. We are willing and able to
> throw
> > > another 2GB at the problem, but are concerned this may only be a short
> > term
> > > solution and the problem will reoccur when the reporting size reaches
> > 180,000
> > > records.
> > >
> > > Any assistance you may be able to provide would be greatly appreciated
> >
> >
>
>
