
I've been trying to track the progress of a file upload but keep ending up at dead ends (uploading from a C# application, not a webpage).

I tried using the WebClient as such:

using System;
using System.Net;
using System.Threading;

class Program
{
    static volatile bool busy = true;

    static void Main(string[] args)
    {
        WebClient client = new WebClient();
        // Add some custom header information
        client.Credentials = new NetworkCredential("username", "password");
        client.UploadProgressChanged += client_UploadProgressChanged;
        client.UploadFileCompleted += client_UploadFileCompleted;
        client.UploadFileAsync(new Uri("http://uploaduri/"), "filename");

        while (busy)
        {
            Thread.Sleep(100);
        }

        Console.WriteLine("Done: press enter to exit");
        Console.ReadLine();
    }

    static void client_UploadFileCompleted(object sender, UploadFileCompletedEventArgs e)
    {
        busy = false;
    }

    static void client_UploadProgressChanged(object sender, UploadProgressChangedEventArgs e)
    {
        Console.WriteLine("Completed {0} of {1} bytes", e.BytesSent, e.TotalBytesToSend);
    }
}

The file does upload and progress is printed out, but the reported progress runs far ahead of the actual upload: with a large file the progress reaches the maximum within a few seconds while the actual upload takes a few minutes (it is not just waiting on a response; the data has not yet arrived at the server).

So I tried using HttpWebRequest to stream the data instead (I know this is not the exact equivalent of a file upload, as it does not produce multipart/form-data content, but it does serve to illustrate my problem). I set AllowWriteStreamBuffering = false and set the ContentLength, as suggested by this question/answer:

using System;
using System.IO;
using System.Net;

class Program
{
    static void Main(string[] args)
    {
        FileInfo fileInfo = new FileInfo(args[0]);
        HttpWebRequest client = (HttpWebRequest)WebRequest.Create(new Uri("http://uploadUri/"));
        // Add some custom header info
        client.Credentials = new NetworkCredential("username", "password");
        client.AllowWriteStreamBuffering = false;
        client.ContentLength = fileInfo.Length;
        client.Method = "POST";

        long fileSize = fileInfo.Length;
        using (FileStream stream = fileInfo.OpenRead())
        {
            using (Stream uploadStream = client.GetRequestStream())
            {
                long totalWritten = 0;
                byte[] buffer = new byte[3000];
                int bytesRead = 0;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    uploadStream.Write(buffer, 0, bytesRead);
                    uploadStream.Flush();
                    Console.WriteLine("{0} of {1} written", totalWritten += bytesRead, fileSize);
                }
            }
        }

        Console.WriteLine("Done: press enter to exit");
        Console.ReadLine();
    }
}

The request does not start until the entire file has been written to the stream, and full progress has already been reported by the time it starts (I'm using Fiddler to verify this). I also tried setting SendChunked to true (both with and without setting the ContentLength). It seems the data still gets cached locally before being sent over the network.
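For reference, this is roughly what I mean by the chunked variant; the write loop is unchanged from the code above and the URI is a placeholder:

// Chunked variant of the request setup above (sketch only; the write loop is unchanged).
HttpWebRequest chunkedClient = (HttpWebRequest)WebRequest.Create(new Uri("http://uploadUri/"));
chunkedClient.Credentials = new NetworkCredential("username", "password");
chunkedClient.Method = "POST";
chunkedClient.AllowWriteStreamBuffering = false;
chunkedClient.SendChunked = true; // sends Transfer-Encoding: chunked, so ContentLength is left unset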

Is there something wrong with one of these approaches, or is there perhaps another way I can track the progress of file uploads from a Windows application?

  • I'm in the EXACT same boat as you. I learned that you can set client.AllowReadStreamBuffering = false; and it won't buffer the bytes right away. It's better, but still not totally ideal. Commented Apr 21, 2010 at 15:33
  • In my case I can set AllowReadStreamBuffering = false and the request will not start before all the data is 'cached' locally. Commented Apr 22, 2010 at 5:21

2 Answers


Updated:

This console app works for me as expected:

static ManualResetEvent done = new ManualResetEvent(false);

static void Main(string[] args)
{
    WebClient client = new WebClient();
    client.UploadProgressChanged += new UploadProgressChangedEventHandler(client_UploadProgressChanged);
    client.UploadFileCompleted += new UploadFileCompletedEventHandler(client_UploadFileCompleted);
    client.UploadFileAsync(new Uri("http://localhost/upload"), "C:\\test.zip");
    done.WaitOne();
    Console.WriteLine("Done");
}

static void client_UploadFileCompleted(object sender, UploadFileCompletedEventArgs e)
{
    done.Set();
}

static void client_UploadProgressChanged(object sender, UploadProgressChangedEventArgs e)
{
    Console.Write("\rUploading: {0}% {1} of {2}", e.ProgressPercentage, e.BytesSent, e.TotalBytesToSend);
}

3 Comments

Send a large file and verify the results with Fiddler. I do get progress updates (just like in my example), but the progress completes much faster than the upload of the file does. My issue is not that I don't receive progress updates, but that the progress is incorrect.
My file was about 25 MB. Did you verify that the file is uploaded correctly? If you give it a non-existent URL it will report some progress and fail around 50% (no file uploaded). However, if you use UploadFile it will throw an exception.
The file is uploaded and exists server-side. No exception is thrown. Tested it on several different machine setups.

I believe your request is going over the network. I've discovered that Fiddler 2.3.4.4 does not show partial requests, but MS Network Monitor can show the individual packets (though not on the localhost loopback, so the server and client need to be on different machines if you wish to verify).

I'm running into the same hidden buffering issue here and believe that one of the WCF service settings on the server is not set up correctly for streaming. I'm curious what type of web service you are implementing, which bindings, etc. Essentially the server buffers the entire message and then hands it off for processing, which is why one might see a large delay after the last byte is sent from the client.
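If it is WCF, the setting I have in mind is the binding's TransferMode; here's a minimal sketch of what I mean (the class name and size limit are illustrative, not your configuration):

using System.ServiceModel; // WebHttpBinding and TransferMode (WebHttpBinding lives in System.ServiceModel.Web.dll)

class StreamedBindingSketch
{
    // Sketch only: the default TransferMode.Buffered makes the host collect the
    // entire request before dispatching it, which matches the delay described above.
    static WebHttpBinding CreateStreamedBinding()
    {
        WebHttpBinding binding = new WebHttpBinding();
        binding.TransferMode = TransferMode.Streamed;           // hand the body to the service as it arrives
        binding.MaxReceivedMessageSize = 1024L * 1024L * 1024L; // allow large uploads (1 GB here; pick your own limit)
        return binding;
    }
}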

For a case I was looking at, where the web service was a WCF REST service, the file was being buffered at the following location before being passed as a stream argument to the web service method:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files\root\86e02ad6\c1702d08\uploads*.post 
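The operation in that case had roughly the following shape (the contract and names below are illustrative, not the actual service); with a buffered binding the whole request body lands in that temp file before the method ever runs:

using System.IO;
using System.ServiceModel;
using System.ServiceModel.Web; // WebInvoke

// Illustrative contract only: with a buffered binding the request body is spooled
// to the *.post temp file above before Upload is invoked.
[ServiceContract]
public interface IUploadService
{
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "upload")]
    void Upload(Stream fileContents);
}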

