
I have a class which reads data from a port through a TcpClient. I've created 2 threads.

  1. Packet Reading Thread
  2. Packet Processing Thread

The first thread continuously reads bytes, and once it has a complete packet it adds that packet to a Queue and starts the second thread, which then reads packets from that Queue and processes them.

The packet processing must be done sequentially. Therefore, each time the reading thread adds a packet to the Queue, it checks whether the processing thread is running; if not, it starts that thread again. This is because we don't want multiple packet processing threads running simultaneously.

Now my question relates to that checking part. What I've done is add a simple bool variable to my class. When the packet processing thread starts, it sets that bool to true, and when it reaches the end, it sets the bool back to false. The packet reading thread checks this bool variable; if it is true, it knows that the processing thread is running.

Is this the correct way? Or is this approach prone to a race condition?

Thanks.

Update (a little detail):

 delegate void PacketProcessDelegate();
 PacketProcessDelegate PacketProcess = new PacketProcessDelegate(this.PacketProcessingThread);

 void PacketReadingThread()
 {
     Packet = GetPacket(); // blocks until a packet is received
     Queue.Synchronized(this.Queue).Enqueue(Packet);
     if (!this.IsProcessing)
         this.PacketProcess.BeginInvoke(null, null);
 }

 void PacketProcessingThread()
 {
     this.IsProcessing = true;
     while (true)
     {
         Queue syncQueue = Queue.Synchronized(this.Queue);
         if (syncQueue.Count > 0)
         {
             // packet extraction and processing code here
         }
         else
             break;
     }
     this.IsProcessing = false;
 }

Goals:
Packet Processing thread either
*) should terminate and be restarted, but the ThreadPool should be utilized in order to prevent creation of a new thread each time. That's why I've used BeginInvoke.
OR
*) it should somehow block until another packet is received.

  • Any reason you can't simply use Thread.IsAlive property? Commented Aug 10, 2011 at 6:04
  • Is there something that requires processing thread to be ended after processing one packet? If no, you can just run the processing thread continually, and if queue is empty it should wait. Commented Aug 10, 2011 at 6:07
  • @drharris: Actually a new thread gets invoked using the BeginInvoke() method. It's not actually a Thread variable. Commented Aug 10, 2011 at 6:09
  • BeginInvoke() does not start a new thread. It invokes an action on the dispatcher or UI thread. Commented Aug 10, 2011 at 6:13
  • Actually, forgot about Delegate.BeginInvoke() which does start a ThreadPool thread. Still, probably not the best practice if you need to track thread lifetime. Better to use a Thread in that case, and consider using BlockingCollection<T> as both Jon and myself recommended. Commented Aug 10, 2011 at 6:20

4 Answers


You've definitely got a race condition, without any more locking than has been described. If two packets come in in quick succession, you may see that the flag is false twice and start two different threads. You could avoid this by making the reading thread set the flag to true when it's starting the thread - so the flag means "a thread is running or starting".
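For illustration, here is a minimal sketch of that fix, where the check and the set happen atomically under a lock (the class and member names are hypothetical, not from the question):

 using System;
 using System.Threading;

 class PacketPump
 {
     private readonly object _sync = new object();
     private bool _isProcessing;

     // Returns true only for the caller that actually needs to start
     // a processing thread. Because the check and the set happen under
     // one lock, two readers can never both see "not running".
     public bool TryClaimProcessing()
     {
         lock (_sync)
         {
             if (_isProcessing) return false;
             _isProcessing = true;   // means "running or starting"
             return true;
         }
     }

     // Called by the processing thread when it finishes draining the queue.
     public void ReleaseProcessing()
     {
         lock (_sync) { _isProcessing = false; }
     }
 }

The reading thread would call TryClaimProcessing() after enqueuing, and only start the processing thread when it returns true.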

It's not clear why you need to keep starting threads anyway - why not just have one processing thread which blocks waiting for new packets when the queue is empty? If you're using .NET 4 this is made really easy using BlockingCollection<T>.
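As a rough sketch of that single-consumer design (Packet is a stand-in type, and this assumes .NET 4's BlockingCollection<T>):

 using System;
 using System.Collections.Concurrent;
 using System.Threading;

 // Hypothetical Packet type, standing in for the asker's packets.
 class Packet { public int Id; }

 static class PacketPipeline
 {
     // Enqueues n packets from the calling thread, processes them on
     // one long-lived consumer thread, and returns how many were processed.
     public static int Run(int n)
     {
         var queue = new BlockingCollection<Packet>();
         int processed = 0;

         // GetConsumingEnumerable blocks while the queue is empty and
         // ends once CompleteAdding has been called and the queue drains.
         var consumer = new Thread(() =>
         {
             foreach (Packet p in queue.GetConsumingEnumerable())
                 Interlocked.Increment(ref processed);
         });
         consumer.Start();

         for (int i = 0; i < n; i++)
             queue.Add(new Packet { Id = i });   // the reading side

         queue.CompleteAdding();   // e.g. on shutdown
         consumer.Join();
         return processed;
     }
 }

No flag and no thread restarting are needed: the single consumer simply blocks whenever the queue is empty.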

EDIT: If you don't need the packets to be processed in a particular order, you could just start a new thread pool work item for each packet:

 ThreadPool.QueueUserWorkItem(ProcessPacket, packet);

 ...

 private void ProcessPacket(object state)
 {
     Packet packet = (Packet) state;
     ...
 }

2 Comments

I haven't yet used BlockingCollection<T>, and I'm trying to make the solution runnable on .NET 2.0. Actually, the packet processing thread processes packets very quickly, so I want this thread either to terminate on finish or to block until new packets are available in the Queue. But I don't know how to block the thread in this way.
@sallushan: Well, you can write your own producer consumer queue reasonably easily. I wrote a simple implementation years ago (before C# 2) at yoda.arachsys.com/csharp/threads/deadlocks.shtml - searching for "producer consumer queue .net" should find you lots of examples. I don't recommend starting a new thread each time. Another option would be to use the thread pool of course.
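For reference, a minimal producer/consumer queue in the style the comment describes, using only .NET 2.0 primitives (Monitor.Wait/Pulse); the class name is illustrative:

 using System;
 using System.Collections.Generic;
 using System.Threading;

 class ProducerConsumerQueue<T>
 {
     private readonly object _lock = new object();
     private readonly Queue<T> _items = new Queue<T>();

     public void Enqueue(T item)
     {
         lock (_lock)
         {
             _items.Enqueue(item);
             Monitor.Pulse(_lock);    // wake one waiting consumer
         }
     }

     public T Dequeue()
     {
         lock (_lock)
         {
             while (_items.Count == 0)
                 Monitor.Wait(_lock); // releases the lock while blocked
             return _items.Dequeue();
         }
     }
 }

The processing thread just loops on Dequeue(): it sleeps while the queue is empty and wakes when the reading thread enqueues a packet, so it never needs to be restarted.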

You may want to look into using a BlockingCollection<T> class for this. Your processing thread will block waiting for the next item to be enqueued.

Some suggested reading: Threading in C#.



Another design is to use a wait handle instead, and wait until it's signaled before processing - that might be better CPU-wise.
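A minimal sketch of that design, assuming an AutoResetEvent signaled by the reading thread (the class and member names are illustrative):

 using System;
 using System.Collections;
 using System.Threading;

 class WaitHandlePump
 {
     private readonly AutoResetEvent _packetReady = new AutoResetEvent(false);
     private readonly Queue _queue = Queue.Synchronized(new Queue());
     private int _processed;

     public int Processed { get { return _processed; } }

     // Called by the reading thread for each complete packet.
     public void OnPacketRead(object packet)
     {
         _queue.Enqueue(packet);
         _packetReady.Set();          // signal the processing thread
     }

     // Runs on one long-lived processing thread; it sleeps here
     // instead of being terminated and restarted between packets.
     public void ProcessingLoop()
     {
         while (true)
         {
             _packetReady.WaitOne();  // blocks with no CPU spin
             while (_queue.Count > 0)
                 Process(_queue.Dequeue());
         }
     }

     private void Process(object packet)
     {
         Interlocked.Increment(ref _processed);
     }
 }

Note that consecutive Set() calls may coalesce into one wake-up, which is why the loop drains the whole queue after each WaitOne().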

1 Comment

I suspect "atomic" doesn't mean what you think it means. In particular, setting a bool variable from one thread doesn't mean it will be seen from another thread unless you set up specific memory fences. See blogs.msdn.com/b/ericlippert/archive/2011/06/16/…

First of all, you should avoid using shared resources - data, properties, etc. that are used on multiple threads.

If you do use them, you should synchronize access to them (see ReaderWriterLockSlim).

There are also extension classes known as the TPL (Task Parallel Library) which make threading easier to use. See for example this link.

If you use the TPL, and therefore the Task class, you can for example use the Wait method to wait until a task has completed:

 Task myTask = Task.Factory.StartNew(() => Foo());
 myTask.Wait();

I'd always recommend using the TPL, or at least the ThreadPool, when doing multithreaded work.

You can also think about using Rx (Reactive Extensions), which provides a unified programming model over threads and LINQ.

