
I have a data pipeline which parses, cleans, and creates a data file with a few thousand rows. I need to move this data into MySQL, into different tables. New data comes in every hour and my pipeline generates a new data file. Currently I am inserting/updating the MySQL tables row by row, iterating over the data file.

Is there a more efficient way to insert this data into MySQL?


1 Answer


I'd suggest one of the following approaches:

  1. While parsing, do not insert rows into the table one at a time. Instead, build a bulk query that inserts a batch of rows and execute it every X rows (X depending on your pipeline size), as shown in the first sketch after this list:

    INSERT INTO table (id, x) VALUES (id1, x1), (id2, x2)...

  2. Dump your data into a CSV file and import it using a LOAD DATA INFILE query (see the second sketch below).
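The question doesn't say which language the pipeline is written in, so here is a minimal sketch of the batched-insert approach, assuming Python with the mysql-connector-python driver; the connection details, table name (measurements), and columns (id, x) are placeholders for whatever your pipeline actually produces:

    import mysql.connector

    BATCH_SIZE = 1000  # tune to your row size and max_allowed_packet

    def insert_rows(rows):
        """rows is an iterable of (id, x) tuples produced by the pipeline."""
        conn = mysql.connector.connect(
            host="localhost", user="etl", password="secret", database="pipeline"
        )
        cursor = conn.cursor()
        # If you also need updates, you can append an
        # ON DUPLICATE KEY UPDATE clause to this statement.
        sql = "INSERT INTO measurements (id, x) VALUES (%s, %s)"

        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) >= BATCH_SIZE:
                # the whole batch is sent in one statement / round trip
                cursor.executemany(sql, batch)
                batch.clear()
        if batch:
            cursor.executemany(sql, batch)

        conn.commit()
        cursor.close()
        conn.close()

With a plain INSERT ... VALUES statement, executemany is rewritten by the connector into a single multi-row INSERT per batch, so each batch costs one round trip instead of BATCH_SIZE separate statements.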
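And a sketch of the CSV + LOAD DATA INFILE approach under the same assumptions (the file path is also a placeholder; the LOCAL keyword requires local_infile to be enabled on the server and allow_local_infile=True on the client):

    import csv
    import mysql.connector

    def load_csv(rows, csv_path="/tmp/batch.csv"):
        # 1. Dump the parsed rows to a CSV file.
        with open(csv_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerows(rows)  # each row is e.g. (id, x)

        # 2. Bulk-load the file in a single statement.
        conn = mysql.connector.connect(
            host="localhost", user="etl", password="secret",
            database="pipeline", allow_local_infile=True,
        )
        cursor = conn.cursor()
        cursor.execute(
            f"LOAD DATA LOCAL INFILE '{csv_path}' INTO TABLE measurements "
            "FIELDS TERMINATED BY ',' "
            "LINES TERMINATED BY '\\n' "
            "(id, x)"
        )
        conn.commit()
        cursor.close()
        conn.close()

Note that LOAD DATA INFILE only does plain inserts (or REPLACE/IGNORE), so if you need the insert-or-update behaviour from the question, a common pattern is to load into a staging table first and then merge with INSERT ... SELECT ... ON DUPLICATE KEY UPDATE.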
