Is there any chance that multiple BackgroundWorkers perform better than Tasks for 5-second operations? I remember reading in a book that a Task is designed for short-running processes.
The reason I ask is this:
I have an operation that takes 5 seconds to complete, and there are 4000 such operations to run. At first I did:
```csharp
for (int i = 0; i < 4000; i++)
{
    Task.Factory.StartNew(action);
}
```

This performed poorly: after the first minute, only 3-4 tasks were completed, and the console application had 35 threads. Maybe this was naive, but I thought the thread pool would handle this kind of situation (put all the actions in a queue, and when a thread is free, take an action and execute it).
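For illustration, here is a minimal sketch (my own, not from the original post) of one way to run many synchronous work items while capping concurrency at the number of cores, using `Parallel.For` with `MaxDegreeOfParallelism`. The 5-second work is simulated with a 5 ms sleep so the sketch finishes quickly:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class ThrottledDemo
{
    static void Main()
    {
        int completed = 0;

        // Never run more than ProcessorCount work items at once.
        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = Environment.ProcessorCount
        };

        Parallel.For(0, 4000, options, i =>
        {
            Thread.Sleep(5); // stand-in for the real 5-second work item
            Interlocked.Increment(ref completed);
        });

        Console.WriteLine(completed); // prints 4000
    }
}
```

This avoids both extremes: no unbounded thread growth, and no idle cores while work remains queued.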
The second step was to manually create Environment.ProcessorCount BackgroundWorkers, and place all the actions in a ConcurrentQueue. So the code looks something like this:
```csharp
var workers = new List<BackgroundWorker>();
// initialize workers
workers.ForEach((bk) =>
{
    bk.DoWork += (s, e) =>
    {
        while (toDoActions.Count > 0)
        {
            Action a;
            if (toDoActions.TryDequeue(out a))
            {
                a();
            }
        }
    };
    bk.RunWorkerAsync();
});
```

This performed way better. It performed much better than the Tasks even when I had 30 BackgroundWorkers (about as many workers as there were threads in the first case).
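The worker initialization is elided above ("// initialize workers"), so here is a self-contained sketch of the same pattern with hypothetical initialization filled in; a `CountdownEvent` is added so the demo can wait for all workers to finish (an assumption on my part, not in the original code):

```csharp
using System;
using System.Collections.Concurrent;
using System.ComponentModel;
using System.Threading;

class WorkerDemo
{
    static void Main()
    {
        var toDoActions = new ConcurrentQueue<Action>();
        int completed = 0;
        for (int i = 0; i < 100; i++)
        {
            toDoActions.Enqueue(() => Interlocked.Increment(ref completed));
        }

        using (var done = new CountdownEvent(Environment.ProcessorCount))
        {
            for (int i = 0; i < Environment.ProcessorCount; i++)
            {
                var bk = new BackgroundWorker();
                bk.DoWork += (s, e) =>
                {
                    // Drain the shared queue until it is empty.
                    Action a;
                    while (toDoActions.TryDequeue(out a))
                    {
                        a();
                    }
                };
                bk.RunWorkerCompleted += (s, e) => done.Signal();
                bk.RunWorkerAsync();
            }
            done.Wait();
        }

        Console.WriteLine(completed); // prints 100
    }
}
```

Note that all items are enqueued before the workers start; if producers were still adding while workers ran, the `while (TryDequeue)` loop could exit early on a momentarily empty queue.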
Later edit:
I start the Tasks like this:
```csharp
public static Task IndexFile(string file)
{
    Action<object> indexAction = new Action<object>((f) =>
    {
        Index((string)f);
    });
    return Task.Factory.StartNew(indexAction, file);
}
```

And the Index method is this one:
```csharp
private static void Index(string file)
{
    AudioDetectionServiceReference.AudioDetectionServiceClient client =
        new AudioDetectionServiceReference.AudioDetectionServiceClient();
    client.IndexCompleted += (s, e) =>
    {
        if (e.Error != null)
        {
            if (FileError != null)
            {
                FileError(client, new FileIndexErrorEventArgs((string)e.UserState, e.Error));
            }
        }
        else
        {
            if (FileIndexed != null)
            {
                FileIndexed(client, new FileIndexedEventArgs((string)e.UserState));
            }
        }
    };
    using (IAudio proxy = new BassProxy())
    {
        List<int> max = new List<int>();
        if (proxy.ReadFFTData(file, out max))
        {
            while (max.Count > 0 && max.First() == 0)
            {
                max.RemoveAt(0);
            }
            while (max.Count > 0 && max.Last() == 0)
            {
                max.RemoveAt(max.Count - 1);
            }
            client.IndexAsync(max.ToArray(), file, file);
        }
        else
        {
            throw new CouldNotIndexException(file, "The audio proxy did not return any data for this file.");
        }
    }
}
```

This method reads some data from an mp3 file, using the Bass.net library. That data is then sent to a WCF service, using the async method. The IndexFile(string file) method, which creates the tasks, is called 4000 times in a for loop. Those two events, FileIndexed and FileError, are not handled, so they are never raised.
(Comments: Why not use a `BlockingCollection` rather than a `ConcurrentQueue`? It uses a `ConcurrentQueue` internally, and it will make the code a bit cleaner and easier to use. Also, have you tried `Parallel.Invoke` with an array of actions instead of `Task`s?)
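The `BlockingCollection` suggestion could be sketched like this (a hypothetical rework, not the poster's code): a fixed number of long-running consumer tasks drain the collection via `GetConsumingEnumerable`, which blocks until work arrives and terminates cleanly after `CompleteAdding()` is called:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class BlockingCollectionDemo
{
    static void Main()
    {
        var work = new BlockingCollection<Action>();
        int completed = 0;

        // One long-running consumer per core; LongRunning hints the scheduler
        // to use a dedicated thread rather than a pool thread.
        var consumers = new Task[Environment.ProcessorCount];
        for (int i = 0; i < consumers.Length; i++)
        {
            consumers[i] = Task.Factory.StartNew(() =>
            {
                foreach (var action in work.GetConsumingEnumerable())
                {
                    action();
                }
            }, TaskCreationOptions.LongRunning);
        }

        // Producers can add at any time, even while consumers are running.
        for (int i = 0; i < 100; i++)
        {
            work.Add(() => Interlocked.Increment(ref completed));
        }
        work.CompleteAdding();

        Task.WaitAll(consumers);
        Console.WriteLine(completed); // prints 100
    }
}
```

Unlike the `while (Count > 0)` loop over a bare `ConcurrentQueue`, this pattern does not race with producers: consumers simply block until more work arrives or the collection is marked complete.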