I'm using a PreparedStatement in a simple java console application to load Huge amount of data out of an InputStream.
This is the code:
public void readStopTimes(CSVReader reader) throws IOException, SQLException { String insertSql = "INSERT INTO stop_times VALUES (null, ?, ?, ?, ?, ?)"; PreparedStatement statement = db.prepareStatement(insertSql); String [] nextLine; long i = 0; Chronometer chronometer = new Chronometer(); while ((nextLine = reader.readNext()) != null) { if(i++ != 0) { statement.setString(1, nextLine[0]); if(nextLine[1].isEmpty()) statement.setNull(2, Types.TIME); else statement.setTime(2, Time.valueOf(nextLine[1])); if(nextLine[2].isEmpty()) statement.setNull(3, Types.TIME); else statement.setTime(3, Time.valueOf(nextLine[2])); statement.setString(4, nextLine[3]); statement.setInt(5, Integer.parseInt(nextLine[4])); statement.addBatch(); } if(i++ % 1000 == 0) { statement.executeBatch(); } if(chronometer.count() > 5000) { chronometer.restart(); log.debug("Analyzed {} rows", i); } } statement.executeBatch(); db.commit(); } Every 1000 insertions i'm executing the batch, every 5 seconds i'm printing a log.
From the logs it's evident that this algorithm runs extremely fast at the beginning, counting a total of more than 4 million rows in the first 25 seconds, then it slow down, at the point that in 5 seconds only 2 rows gets added to the batch.
I need to insert more than 5 million rows, do you have a faster alternative?
statement = db.prepareStatement(insertSql);afterstatement.executeBatch();?