Jon Skeet
You've definitely got a race condition, without any more locking than has been described. If two packets come in in quick succession, you may see that the flag is false twice and start two different threads. You could avoid this by making the reading thread set the flag to false when it's starting the thread - so the flag means "a thread is running or starting".
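To make the fix concrete, here is a hedged sketch of that flag change. The names (`queue`, `queueLock`, `processingThreadRunning`, `ProcessQueue`) are illustrative, not taken from the question; the key point is that the reading thread claims the flag under the same lock that guards the queue, *before* the worker starts, so a second packet can't start a second thread:

```csharp
// Reading thread, on receipt of a packet. The flag now means
// "a processing thread is running or starting", so it is set to
// true here, inside the lock, before the worker is launched.
lock (queueLock)
{
    queue.Enqueue(packet);
    if (!processingThreadRunning)
    {
        processingThreadRunning = true;   // claim before starting
        new Thread(ProcessQueue).Start();
    }
}
```

The worker then sets the flag back to false (again under the lock) only once it has drained the queue and is about to exit.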

It's not clear why you need to keep starting threads anyway - why not just have one processing thread which blocks waiting for new packets when the queue is empty? If you're using .NET 4 this is made really easy using BlockingCollection<T>.
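As a minimal sketch of that single-consumer approach, assuming a `Packet` type from the question: the reading thread just calls `queue.Add(packet)`, and one long-lived thread blocks in `GetConsumingEnumerable` whenever the queue is empty, so no flag and no repeated thread-starting are needed.

```csharp
using System.Collections.Concurrent;
using System.Threading;

private readonly BlockingCollection<Packet> queue =
    new BlockingCollection<Packet>();

private void StartProcessing()
{
    new Thread(() =>
    {
        // Blocks while the queue is empty; the loop ends
        // when CompleteAdding() is called on the collection.
        foreach (Packet packet in queue.GetConsumingEnumerable())
        {
            ProcessPacket(packet);
        }
    }) { IsBackground = true }.Start();
}
```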

EDIT: If you don't need the packets to be processed in a particular order, you could just start a new thread pool work item for each packet:

```csharp
ThreadPool.QueueUserWorkItem(ProcessPacket, packet);
...
private void ProcessPacket(object state)
{
    Packet packet = (Packet) state;
    ...
}
```
