
I'm supposed to do stress testing on my application, for which I need 60 million records in a table. My current approach: I initialize an instance of the Connection class, insert the data in a for loop via a PreparedStatement on that Connection, and after every 500 rows I commit the connection and reset it.

With this approach I'm only able to insert roughly 150 records per minute. Is there a more optimized way of doing this, so that the execution time is as short as possible?
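For reference, the row-by-row approach described above might look like the following sketch. The connection URL, credentials, and the `test_data` table with its columns are hypothetical stand-ins, not details from the question:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RowByRowInsert {

    // Pure helper: should we commit after inserting row number `count`?
    static boolean commitPoint(int count, int interval) {
        return count % interval == 0;
    }

    public static void main(String[] args) throws Exception {
        // URL, user, password, table and column names are hypothetical.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//localhost:1521/XEPDB1", "scott", "tiger")) {
            conn.setAutoCommit(false);
            String sql = "INSERT INTO test_data (id, payload) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (int i = 1; i <= 60_000_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.executeUpdate();      // one network round trip per row
                    if (commitPoint(i, 500)) {
                        conn.commit();       // commit every 500 rows
                    }
                }
                conn.commit();               // commit any remaining rows
            }
        }
    }
}
```

The per-row `executeUpdate()` round trip is the main bottleneck here; the frequent commits add further overhead on Oracle.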

  • Investigate batch inserts, or load the data using the Oracle utilities instead of your code. Commented Nov 11, 2016 at 20:04
  • Committing in a loop is not going to improve performance in Oracle - the opposite is the case. Only commit once at the end of your transaction. Commented Nov 11, 2016 at 21:58
  • But 150 rows per minute is way too slow. I can get about 1 million per minute with a local server (narrow table with 6 columns). Commented Nov 11, 2016 at 22:08
  • Did you try a bulk insert operation? It will insert very quickly compared with a batch insert. Commented Nov 14, 2016 at 12:01

1 Answer


Use JDBC batch inserts:

PreparedStatement ps = conn.prepareStatement(sql);
int count = 0;
for (/* loop construct here */) {
    count++;
    ps.setString(1, someString);
    // ... and so forth
    ps.addBatch();
    if (count % 500 == 0) {
        ps.executeBatch();
    }
}
ps.executeBatch();
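Combining this batching pattern with the comment advice to disable auto-commit and commit once at the end, a self-contained sketch could look like the following. The connection URL, credentials, and the `test_data` table are hypothetical placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsert {
    static final int BATCH_SIZE = 500;

    // Pure helper: does row number `count` complete a full batch?
    static boolean flushPoint(int count) {
        return count % BATCH_SIZE == 0;
    }

    public static void main(String[] args) throws Exception {
        // URL, user, password, table and column names are hypothetical.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//localhost:1521/XEPDB1", "scott", "tiger")) {
            conn.setAutoCommit(false);       // commit once at the end, per the comments
            String sql = "INSERT INTO test_data (id, payload) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                int count = 0;
                for (int i = 1; i <= 60_000_000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();           // queue locally; no round trip yet
                    if (flushPoint(++count)) {
                        ps.executeBatch();   // send 500 rows in one round trip
                    }
                }
                ps.executeBatch();           // flush the final partial batch
                conn.commit();
            }
        }
    }
}
```

Batching replaces one network round trip per row with one per 500 rows, which is where most of the speedup comes from; the single commit avoids repeated log-flush waits in Oracle.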

2 Comments

Hi! Do I need to call ps.clearBatch(); after the execute?
@MarianoL No, you shouldn't need to do that. executeBatch should flush the last batch. stackoverflow.com/questions/58522625/…
