
I have to insert more than 100k records into a table, and after every row inserted into the first table, a second table must be updated with the insert ID (primary key) from the first table. I tried a bulk insert, but then I would not get all the inserted IDs that need to go into the second table. I am using MySQL 5.5. When I run the following code, I get the following random errors:

  1. Lost connection to MySQL server during query.
  2. The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
  3. Net packets out of order: received[x], expected[y].

How can I insert these records optimally?

Code:

  foreach (var item in transactions)
  {
      long transactionId;
      using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
      {
          cm.CommandType = System.Data.CommandType.Text;
          cm.CommandText = @"INSERT INTO FirstTable SET column1=@column1;";
          cm.Parameters.AddWithValue("@column1", item.column1);
          cm.ExecuteNonQuery();
          // Connector/NET exposes the AUTO_INCREMENT value as LastInsertedId.
          transactionId = cm.LastInsertedId;
      }
      // Renamed the loop variable: the original reused "item", which shadows
      // the outer variable and does not compile.
      foreach (var detail in item.TransactionDetails)
      {
          using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
          {
              cm.CommandType = System.Data.CommandType.Text;
              cm.CommandText = @"INSERT INTO SecondTable SET column1=@column1, column2=@column2;";
              cm.Parameters.AddWithValue("@column1", detail.column1);
              cm.Parameters.AddWithValue("@column2", detail.column2);
              cm.ExecuteNonQuery();
          }
      }
  }
  • dang, right up my alley. Only problem, you haven't accepted any answers from yesteryear. So why bother? Commented Jul 2, 2016 at 18:50
  • Does MySQL support bulk inserting data via XML? If you have a DataTable, convert it to XML and create a stored procedure that takes the XML as input, with stored-proc code to handle the XML. I do this all the time with SQL Server; not sure if the same works in MySQL. Do a search on "C# Stackoverflow Bulk Insert XML into MySql Database". Commented Jul 2, 2016 at 19:57
  • @MethodMan: Yes, MySQL does support bulk insert, but I need to update the other table after every row inserted into the first table, so I cannot use a bulk insert. There are also a lot of other operations going on around the above code, which are inside a TransactionScope. Still, I will try your suggested method. Commented Jul 3, 2016 at 10:20
  • This can still be done if you were to store the XML in a temp table and add extra SQL to the first stored procedure to update using a CASE WHEN statement, but I'd need to know more about what you are truly trying to update. Commented Jul 3, 2016 at 16:32
  • I was just being a jerk. Never mind me. I have a block of code I can share if pressed, but the details of this question need a little more info, @AlexanderMP. And as MethodMan said, this needs to be done server-side. It would be screaming fast. Commented Jul 3, 2016 at 19:56

1 Answer


The fastest method to import data into MySQL is LOAD DATA INFILE, so I would suggest writing a CSV file and then running the following SQL statement. Please note I don't have a full C# setup and have not tested this from C#, but it is how I back up and restore MySQL when I want it done fast. I'm assuming the following can be set as the commandText and run after the CSV file has been created and written to disk.

  LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE MyTable
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  (Col1, Col2, Col3);

From https://dev.mysql.com/doc/refman/5.7/en/load-data.html: "The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed..."
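For completeness, here is a minimal C# sketch of the CSV-then-LOAD-DATA approach described above. It is untested from C# (as noted, I use this from PHP); it assumes Connector/NET's MySqlCommand, an open MySqlConnection named conn, and that the connection string allows local infile (e.g. "AllowLoadLocalInfile=true"). The file path, table, and column names are placeholders.

```csharp
// Step 1: write the rows to a CSV file matching the LOAD DATA options
// below (fields enclosed by double quotes, separated by commas).
using (var writer = new System.IO.StreamWriter("import.csv"))
{
    foreach (var item in transactions)
    {
        // Double any embedded quotes so the field stays valid
        // for ENCLOSED BY '"'.
        writer.WriteLine("\"{0}\"", item.column1.Replace("\"", "\"\""));
    }
}

// Step 2: load the whole file in a single statement.
using (MySqlCommand cm = conn.CreateCommand())
{
    cm.CommandText =
        @"LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE FirstTable
          FIELDS TERMINATED BY ',' ENCLOSED BY '""'
          LINES TERMINATED BY '\n'
          (column1);";
    cm.ExecuteNonQuery();
}
```

Note that LOAD DATA does not report the individual AUTO_INCREMENT values it generated, so populating the second table from the new IDs would still need a follow-up query (for example, selecting the new rows back by a known column).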


7 Comments

I have a few doubts: 1. Do I need to increase the timeout of the MySQL connection or of my C# TransactionScope? 2. There are a lot of other queries executing in the same C# TransactionScope; is it possible to read and insert within the same C# TransactionScope?
Hi, I've seen that you don't accept answers, so with respect I'll leave my answer "as is" and repeat that it is my honest best recommendation for the most efficient import into MySQL. Be well :)
Sorry, @Barry, I am a newbie to these processes.
In terms of your questions, I'm not sure about either... the timeout would depend on the size of the import and how long it takes... reading and writing within the same transaction would be expected to work; this is just a MySQL command, and MySQL commands are designed to be wrapped in transactions... but I've done neither of the above myself; I use this from PHP, not C#. I just know that this is, by far, the fastest import into MySQL, so I would advise finding a way to work with it.
@Barry, thanks, it worked really well. I am now able to insert more than 100k records in just 1.5 to 2 seconds, where it previously took more than 40-45 seconds.
