
My requirement is to upload a 4 GB file from a .NET application and download the same file from the .NET application. We were able to upload and download a 4 GB file using the .NET application.

To upload the 4 GB file we split it into pieces, and while downloading we merge the pieces back together using FileStream objects in C#.

Now I am able to save the file to the client machine when I click the Download button in my .NET application. However, while saving, the FileStream objects write the data byte by byte to the user's machine, which takes a very long time. Even a 100 MB file takes approximately 2 hours to save, so a 4 GB file would take far longer.

What is the best way to improve the speed? Are there other FileStream options available? Please help me download the 4 GB file from a network share using a .NET application. If you know of any other way to upload and download a 4 GB file in .NET, that would be good too.

I am unable to use the usual ASP.NET download code for a 4 GB file, so we followed the splitting approach. Please help me improve the speed of the code below. I am using an ASP.NET 3.5 application.

My code while uploading:

    // Split the source file into x pieces of eachSize bytes each
    FileStream foption = new FileStream(strFileName, FileMode.Open);
    len = foption.Length;
    eachSize = (int)Math.Ceiling((double)len / x);
    foption.Close();

    FileStream inFile = new FileStream(strFileName, FileMode.OpenOrCreate, FileAccess.Read);
    for (int i = 0; i < x; i++)
    {
        FileStream outFile = new FileStream(strDir + "\\" + i + ".zip", FileMode.OpenOrCreate, FileAccess.Write);
        int data = 0;
        byte[] buffer = new byte[eachSize];
        if ((data = inFile.Read(buffer, 0, eachSize)) > 0)
        {
            outFile.Write(buffer, 0, data);
        }
        outFile.Close();
    }

My code while downloading:

    // Merge pieces 0..9 back into output.zip on the client's share
    FileStream outFile = new FileStream("\\\\" + clientIPAddress + "\\upload\\output.zip", FileMode.OpenOrCreate, FileAccess.Write);
    for (int i = 0; i < 10; i++)
    {
        int data = 0;
        byte[] buffer = new byte[4096];
        FileStream inFile = new FileStream(strMediaPath + "\\" + i + ".zip", FileMode.OpenOrCreate, FileAccess.Read);
        while ((data = inFile.Read(buffer, 0, 4096)) > 0)
        {
            outFile.Write(buffer, 0, data);
        }
        inFile.Close();
    }
    outFile.Close();

Thanks, Edwin

1 Answer


I don't see the point of splitting the file in the first place; all you're doing is guaranteeing another slow disk copy operation at the end of the transfer to concatenate the pieces.

Not splitting the file will require that i in the first code block be a long instead of an int; 32-bit integers tap out at 2 GB since they are signed.

Finally, your buffer should be a LOT larger. 4 KB is what my first TRS-80 computer had in it; that's a tiny chunk of data to be reading and writing inside a loop. Try something more substantial, like 1 MB.
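As a rough, untested sketch of what that could look like (the paths below are placeholders for your own local file and the UNC path on the client's share), a single-pass copy with a 1 MB buffer removes both the split/merge step and the 4 KB chunking:

    using System;
    using System.IO;

    class SingleFileCopy
    {
        // Copies one large file in 1 MB chunks -- no splitting and no merge step.
        // The paths below are placeholders; substitute your own source file and
        // the UNC path on the client's share.
        static void Main()
        {
            CopyLargeFile(@"C:\upload\source.zip", @"\\clientMachine\upload\output.zip");
        }

        static void CopyLargeFile(string sourcePath, string destinationPath)
        {
            const int bufferSize = 1024 * 1024; // 1 MB instead of 4 KB
            byte[] buffer = new byte[bufferSize];

            using (FileStream inFile = new FileStream(sourcePath, FileMode.Open, FileAccess.Read))
            using (FileStream outFile = new FileStream(destinationPath, FileMode.Create, FileAccess.Write))
            {
                int bytesRead;
                // Read and write in large chunks; the stream tracks the file
                // position internally, so no 32-bit counter overflows at 2 GB.
                while ((bytesRead = inFile.Read(buffer, 0, buffer.Length)) > 0)
                {
                    outFile.Write(buffer, 0, bytesRead);
                }
            }
        }
    }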


1 Comment

Both catches are excellent; I ran some quick-and-dirty tests on a hot cache on my machine and found 128 KB was slightly fastest, but it was only 1.4 times faster than 4 KB. Cold cache will be different, but that's beyond my level of interest at the moment. :)
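For anyone who wants to repeat that experiment, a timing loop along these lines (the file names are placeholders, and hot versus cold cache will skew the numbers) is enough for a rough comparison:

    using System;
    using System.Diagnostics;
    using System.IO;

    class BufferSizeTest
    {
        // Quick-and-dirty comparison of copy times for a few buffer sizes.
        // "source.bin" and "copy.bin" are placeholder file names; results depend
        // heavily on whether the source is already in the OS cache.
        static void Main()
        {
            int[] bufferSizes = { 4 * 1024, 128 * 1024, 1024 * 1024 };

            foreach (int size in bufferSizes)
            {
                byte[] buffer = new byte[size];
                Stopwatch sw = Stopwatch.StartNew();

                using (FileStream inFile = new FileStream("source.bin", FileMode.Open, FileAccess.Read))
                using (FileStream outFile = new FileStream("copy.bin", FileMode.Create, FileAccess.Write))
                {
                    int bytesRead;
                    while ((bytesRead = inFile.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        outFile.Write(buffer, 0, bytesRead);
                    }
                }

                sw.Stop();
                Console.WriteLine("{0,8} byte buffer: {1} ms", size, sw.ElapsedMilliseconds);
            }
        }
    }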
