I found this question while trying to upload a large CSV to my remote database. Based on what I experienced, the two are not the same. The first method, with batching, is the safer solution, but I also found it to be much slower.
I recognize that this is not the recommended approach for anything that accepts user input, but for my use case it was the first tractable solution I found.
My use case was this: I had a CSV with 21 million rows that I wanted to upload to my database, and I found that using prepared statement batches was much slower. As far as I can tell, this is because each insert statement is added to the batch separately, so a batch of 1000 still executes 1000 individual INSERT statements. Whatever the case, this was taking about 30 seconds per batch with batch sizes of 1000. At that rate, all 21 million rows would have taken on the order of a week. Therefore, I deliberately did something UNSAFE that worked much faster.
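For context, the slow batched version looked roughly like this. This is only a sketch: writeWithBatches and getConnection are placeholder names, and I've left out the header-row skip and null handling to keep the comparison short. readCSV is the function shown further down.

//A minimal sketch of the addBatch()/executeBatch() approach that was slow for me.
static void writeWithBatches(String filename) throws SQLException {
    try (Connection con = getConnection(); //However you obtain your JDBC connection.
         PreparedStatement stmt = con.prepareStatement(
                 "INSERT IGNORE INTO food_nutrient(id, fdc_id, nutrient_id, amount) VALUES (?, ?, ?, ?)")) {
        int count = 0;
        for (String[] row : readCSV(filename)) {
            stmt.setInt(1, Integer.parseInt(row[0]));
            stmt.setInt(2, Integer.parseInt(row[1]));
            stmt.setInt(3, Integer.parseInt(row[2]));
            stmt.setDouble(4, Double.parseDouble(row[3]));
            stmt.addBatch();
            if (++count % 1000 == 0) {
                stmt.executeBatch(); //Still one INSERT per row on the server side.
            }
        }
        stmt.executeBatch(); //Flush the final partial batch.
    }
}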
When I did it this way, each batch of 10,000 rows took about 0.25 seconds. That is roughly 1000x faster than the other way, so I thought I would share it for people who are looking for the same answer I was.
For reference, the CSV file that I was using was downloaded from https://fdc.nal.usda.gov/download-datasets.html.
The readCSV function is from Jeronimo Backes's answer to Slow CSV row parsing and splitting, and uses the univocity-parsers library: https://mvnrepository.com/artifact/com.univocity/univocity-parsers
Again, don't do something like this anywhere you are worried about injection attacks; I used it purely as a fast way of reading a CSV and putting it into a MySQL database. If you need placeholders, a safer multi-row variant is sketched just below.
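This sketch keeps JDBC placeholders while still sending one multi-row INSERT per chunk: it prepares a statement with one (?,?,?,?) group per row and binds the values. It is an illustration, not the code I actually ran, and insertChunkSafely and chunk are made-up names.

//A hedged sketch: bind values through placeholders instead of concatenating them.
//Assumes java.sql.* and java.util.* imports; chunk holds already-parsed rows.
static void insertChunkSafely(Connection con, List<Object[]> chunk) throws SQLException {
    String placeholders = String.join(",", Collections.nCopies(chunk.size(), "(?,?,?,?)"));
    String sql = "INSERT IGNORE INTO food_nutrient(id, fdc_id, nutrient_id, amount) VALUES " + placeholders;
    try (PreparedStatement stmt = con.prepareStatement(sql)) {
        int i = 1;
        for (Object[] row : chunk) {
            stmt.setInt(i++, (Integer) row[0]);   //id
            stmt.setInt(i++, (Integer) row[1]);   //fdc_id
            stmt.setInt(i++, (Integer) row[2]);   //nutrient_id
            if (row[3] != null) {
                stmt.setDouble(i++, (Double) row[3]); //amount
            } else {
                stmt.setNull(i++, Types.DOUBLE);
            }
        }
        stmt.executeUpdate();
    }
}

As far as I know, MySQL caps a prepared statement at 65,535 placeholders, so with four columns per row you would keep chunks at roughly 16,000 rows or fewer. With that caveat out of the way, here is the unsafe version I actually used: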
import java.io.File;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import com.univocity.parsers.common.IterableResult;
import com.univocity.parsers.common.ParsingContext;
import com.univocity.parsers.csv.CsvParser;
import com.univocity.parsers.csv.CsvParserSettings;
import org.springframework.dao.DataAccessException; //My mySqlDatabase helper is Spring-based.

public static void writeFileToCSV(String filename) {
    PreparedStatement stmt = null;
    Connection con = null;
    int insertCount = 0;
    int batchSize = 10000;
    int numBatchesSent = 0;
    IterableResult<String[], ParsingContext> rows = readCSV(filename);
    try {
        con = mySqlDatabase.getConnection();
        //Leaving autocommit on did not significantly change the speed here,
        //because there is no longer one query per row.
        con.setAutoCommit(true);
        String sqlStart = "INSERT IGNORE INTO food_nutrient(id, fdc_id, nutrient_id, amount) VALUES ";
        String sqlEnd = " ;";
        StringBuilder sqlBuilder = new StringBuilder();
        StringBuilder valuesBuilder = new StringBuilder();
        int lineNum = 0;
        //This is my manual parsing of the csv columns. There may be a slicker way to do this.
        for (String[] nextLine : rows) {
            if (lineNum == 0) {
                //This skips the header row of the csv.
                lineNum++;
                continue;
            }
            Integer id = Integer.parseInt(nextLine[0]);
            Integer fdc_id = parseInteger(nextLine[1]);
            Integer nutrient_id = parseInteger(nextLine[2]);
            Double amount = parseDouble(nextLine[3]);
            if (valuesBuilder.length() > 0) {
                valuesBuilder.append(',');
            }
            valuesBuilder.append('(');
            valuesBuilder.append(id);
            valuesBuilder.append(',');
            if (fdc_id != null) {
                valuesBuilder.append(fdc_id);
                valuesBuilder.append(',');
            } else {
                valuesBuilder.append("NULL,");
            }
            if (nutrient_id != null) {
                valuesBuilder.append(nutrient_id);
                valuesBuilder.append(',');
            } else {
                valuesBuilder.append("NULL,");
            }
            if (amount != null) {
                valuesBuilder.append(amount);
                valuesBuilder.append(')');
            } else {
                valuesBuilder.append("NULL)");
            }
            if (++insertCount % batchSize == 0) {
                sqlBuilder.append(sqlStart);
                sqlBuilder.append(valuesBuilder);
                sqlBuilder.append(sqlEnd);
                stmt = con.prepareStatement(sqlBuilder.toString());
                sqlBuilder = new StringBuilder();
                valuesBuilder = new StringBuilder();
                stmt.executeUpdate();
                stmt.close(); //Close each statement so they don't pile up.
                numBatchesSent++;
                System.out.println("Sent batch " + numBatchesSent + " with " + batchSize + " new rows.");
            } //: send the batch
        } //: for each row in the csv
        //Send the values that were in the last, partial batch (if any).
        if (valuesBuilder.length() > 0) {
            sqlBuilder.append(sqlStart);
            sqlBuilder.append(valuesBuilder);
            sqlBuilder.append(sqlEnd);
            stmt = con.prepareStatement(sqlBuilder.toString());
            stmt.executeUpdate();
        }
    } catch (SQLException ex) {
        System.out.println(ex.getMessage());
        ex.printStackTrace();
    } catch (DataAccessException ex) {
        ex.printStackTrace();
    } finally {
        try {
            if (stmt != null) {
                stmt.close();
            }
            if (con != null) {
                con.close();
            }
            rows.getContext().stop();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}

public static IterableResult<String[], ParsingContext> readCSV(String filePath) {
    File file = new File(filePath);
    //Configure the parser here. By default all values are trimmed.
    CsvParserSettings parserSettings = new CsvParserSettings();
    //Create the parser.
    CsvParser parser = new CsvParser(parserSettings);
    //Create an iterable over the rows. This will not load everything into memory.
    return parser.iterate(file);
}

//These just take care of the NumberFormatExceptions I had been getting on empty fields.
private static Integer parseInteger(String str) {
    try {
        return Integer.parseInt(str);
    } catch (NumberFormatException ex) {
        return null; //There was probably a null/empty value.
    }
}

private static Double parseDouble(String str) {
    try {
        return Double.parseDouble(str);
    } catch (NumberFormatException ex) {
        System.out.println("There was probably a null value");
        return null;
    }
}
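For completeness, calling it is just this (the filename here is hypothetical; point it at wherever you extracted the USDA download):

public static void main(String[] args) {
    writeFileToCSV("food_nutrient.csv"); //Hypothetical path to the extracted csv.
}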