
I have a criteria page in my ASP.NET application. When the user clicks the report button, the results are first bound to a DataGrid on a new page, and then that page is exported to an Excel file by changing the response content type.

That normally works, but when a large amount of data is returned, a System.OutOfMemoryException is thrown.

Does anyone know a way to fix this problem, or another useful technique for doing this?

2 Answers


This is probably happening because your application is trying to build the entire Excel spreadsheet in memory and deliver it to the user only once it is complete. For large datasets, that can easily exhaust the available memory, which is most likely also slowing the application down dramatically for everyone else.

Instead, you can try streaming the data to the user. This keeps memory usage roughly constant, regardless of how large your dataset is. You can get the bytes of the Excel spreadsheet, or simply convert your data to CSV, then set the HTTP response content type and stream it to the user.

Here is an example:

byte[] reportDoc = GetExportExcel();
context.Response.ContentType = "application/vnd.ms-excel";
//set the content disposition header to force download
context.Response.AddHeader("Content-Disposition", "attachment;filename=" + "Export.xls");
//write the file content byte array
context.Response.BinaryWrite(reportDoc);

There is a detailed tutorial at http://bytes.com/topic/asp-net/answers/326796-example-streaming-excel-browser-download
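Note that the snippet above still builds the whole file in memory before writing it. A sketch of the row-by-row CSV approach mentioned earlier, which avoids that, might look like this (the connection string, query, and column names are hypothetical placeholders, assuming the data comes from SQL Server via a SqlDataReader):

```csharp
using System.Data.SqlClient;
using System.Web;

public static void ExportCsv(HttpContext context)
{
    context.Response.ContentType = "text/csv";
    context.Response.AddHeader("Content-Disposition", "attachment;filename=Export.csv");
    // Disable buffering so each row is sent as it is written,
    // instead of accumulating the whole file in memory.
    context.Response.BufferOutput = false;

    using (var conn = new SqlConnection("<your connection string>"))
    using (var cmd = new SqlCommand("SELECT Id, Name, Amount FROM Report", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            context.Response.Write("Id,Name,Amount\r\n");
            while (reader.Read())
            {
                // Quote the text field and escape embedded quotes
                // so commas in the data don't break the CSV.
                context.Response.Write(string.Format("{0},\"{1}\",{2}\r\n",
                    reader.GetInt32(0),
                    reader.GetString(1).Replace("\"", "\"\""),
                    reader.GetDecimal(2)));
            }
        }
    }
    context.Response.Flush();
}
```

Excel opens CSV files directly, so for a plain tabular export this is often enough, and memory use stays constant no matter how many rows the query returns.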



Excel also has a maximum number of rows, which I believe is around 65k. That may not be the direct cause of your problem, but it could still come up if you're getting an error while working with large exports.

1 Comment

The limit is per sheet, and is only an issue with the old XLS filetype.
