
I am struggling to maintain an extremely large transaction with NHibernate. Say I am saving a large number of entities. If I do not flush every N entities (10,000, say), performance collapses because the NHibernate session becomes overcrowded. If I do flush, I take locks at the DB level, which, combined with the read committed isolation level, affects the running application. Note also that in reality I am importing an entity whose business logic is one of the hearts of the system, and its import touches around 10 tables. That makes a stateless session a bad idea, because cascades would have to be maintained manually.
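For reference, the flush-every-N pattern I use looks roughly like this (the entity type, `sessionFactory`, and `entities` collection are placeholders for my actual domain code):

```csharp
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    const int flushEvery = 10000;
    for (int i = 0; i < entities.Count; i++)
    {
        session.Save(entities[i]);
        if (i > 0 && i % flushEvery == 0)
        {
            session.Flush();  // pushes pending INSERTs to the DB -- this is where row locks appear
            session.Clear();  // detaches entities so the session does not grow unbounded
        }
    }
    tx.Commit();
}
```

Without the periodic Flush/Clear the session slows to a crawl; with it, the locks from the early flushes are held until the transaction commits.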

Moving the BL to stored procedures is a big challenge for two reasons:

  1. there is already complicated OO business logic in the domain classes of the application,
  2. duplicated BL will be introduced.

Ideally, I would like to flush the session to a file and, only once data preparation is complete, execute its contents. Is that possible?

Any other suggestions/best practices are more than welcome.

1 Answer


Your scenario is a typical ORM batch-processing problem. In general, no ORM is meant for this kind of workload. If you want high batch-processing performance (without long-held locks and possible deadlocks), you should not use the ORM to insert thousands of records.

Instead, use native batch inserts, which will always be a lot faster (like SqlBulkCopy for MSSQL).
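A minimal sketch of such a native bulk insert, assuming SQL Server; the table name, columns, and `entities` collection are illustrative placeholders you would map to your actual schema:

```csharp
using System.Data;
using System.Data.SqlClient;

// Stage the rows in an in-memory DataTable matching the target table.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
foreach (var e in entities)
    table.Rows.Add(e.Id, e.Name);

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "dbo.MyEntity";
        bulk.BatchSize = 5000;        // rows sent per round trip
        bulk.WriteToServer(table);    // streams all rows in one bulk operation
    }
}
```

This bypasses NHibernate entirely, which is exactly why it is fast: no change tracking, no per-row INSERT statements.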

Anyway, if you want to use NHibernate for this, try making use of the batch size setting. Call Save or Update on all your objects and call session.Flush only once at the end. This will create all your objects in memory first...

Depending on the batch size, NHibernate will try to group inserts/updates into batches of that size, meaning far fewer round trips to the database and therefore fewer locks, or at least locks held for a shorter time...
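The batch size is set in the NHibernate configuration, for example (the value 100 is just a starting point to tune; ADO.NET batching requires a driver that supports it, such as the SQL Server driver):

```xml
<!-- hibernate.cfg.xml -->
<property name="adonet.batch_size">100</property>
```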

In general, with normal transactions, your operations should only lock the database from the moment your first insert statement is executed on the server. It may work differently with TransactionScope.

Here is some additional reading on improving batch processing:

  - http://fabiomaulo.blogspot.de/2011/03/nhibernate-32-batching-improvement.html
  - NHibernate performance insert
  - http://zvolkov.com/clog/2010/07/16?s=Insert+or+Update+records+in+bulk+with+NHibernate+batching


4 Comments

The problem is that I cannot afford not to flush, and round trips are not the bottleneck. I have to flush, since otherwise performance will be killed, but as soon as I flush I have thousands of locks in the DB.
If you flush after each object is created, round trips are not the performance bottleneck, but you will take locks the moment you insert something, and while the transaction is open those locks are not released. So the more objects you pool before the first "real" insert, the better for your app. The more round trips you have, the longer the whole process takes. As I said, you have two options: either batch-process your data, or use native bulk processing from SQL Server if you use SQL Server.
I do flush every N entities (5,000, say); otherwise there are too many objects attached to the session and it is too slow. The first flush results in SQL being executed. Note also that I already have batching implemented and in place, and it does not help.
OK, if you are doing that already and it still is not working, your best option would be SqlBulkCopy, as I said: msdn.microsoft.com/en-us/library/… It also works, to some extent, with NHibernate transactions: stackoverflow.com/questions/2006024/…
