
How can I execute the following SQL in a scalable way using JdbcTemplate running on MySQL? In this case, scalable means:

  1. Only one SQL statement is executed on the server
  2. It works for any number of rows.

Here's the statement:

INSERT INTO myTable (foo, bar) VALUES ("asdf", "asdf"), ("qwer", "qwer") 

Assume that I have a list of POJOs with foo and bar fields. I realize that I could just iterate over the list and execute:

jdbcTemplate.update("INSERT INTO myTable(foo, bar) VALUES (?, ?)", paramMap) 

but that doesn't accomplish the first criterion.

I believe I could also execute:

jdbcTemplate.batchUpdate("INSERT INTO myTable(foo, bar) VALUES (?, ?)", paramMapArray) 

but from what I can tell, that will just compile the SQL once and execute it multiple times, failing the first criterion again.

The final possibility, which seems to pass both criteria, would be to simply build the SQL myself with a StringBuffer, but I'd like to avoid that.
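For reference, a minimal sketch of that hand-built approach, kept as a prepared statement so the values are still bound as parameters (Pojo with getFoo()/getBar() is hypothetical, and jdbcTemplate is assumed to be an injected JdbcTemplate):

public void insertAll(List<Pojo> pojos) {
    if (pojos.isEmpty()) {
        return; // nothing to insert
    }
    // Build one multi-row INSERT: "... VALUES (?, ?), (?, ?), ..."
    StringBuilder sql = new StringBuilder("INSERT INTO myTable (foo, bar) VALUES ");
    List<Object> args = new ArrayList<Object>();
    for (int i = 0; i < pojos.size(); i++) {
        sql.append(i == 0 ? "(?, ?)" : ", (?, ?)");
        args.add(pojos.get(i).getFoo());
        args.add(pojos.get(i).getBar());
    }
    // Single statement sent to the server, all values bound as parameters
    jdbcTemplate.update(sql.toString(), args.toArray());
}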

  • Can we do the same using just JDBC? Commented Jul 2, 2010 at 12:54
  • This has nothing to do with JdbcTemplate, or even JDBC. You can't do this in SQL, period (or standard SQL, anyway), so you certainly can't do it in JdbcTemplate. Commented Jul 2, 2010 at 12:55
  • @skaffman: I've updated my question to say that I'm using MySQL. Maybe it's a MySQL-only feature, but it's described at dev.mysql.com/doc/refman/5.1/en/insert.html about a quarter of the way down: "INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas. Example: " Commented Jul 2, 2010 at 13:01
  • @Teja: Yes, it would be possible to do in pure JDBC, but that's not the question. I've updated the question with a third possibility which would be to build the SQL all by hand. Commented Jul 2, 2010 at 13:04
  • If you're using InnoDB then a batchUpdate should only update the index table after the last insert. The only efficiency gain you'd get from using a single statement is that you'd have to send less data to the MySQL server. I doubt you'll be able to do the multiple inserts with a standard JdbcTemplate but you could always extend JdbcTemplate and roll your own batch insert method which built the insert string by hand. Commented Jul 2, 2010 at 13:26

4 Answers


You can use a BatchPreparedStatementSetter, as shown below.

public void insertListOfPojos(final List<MyPojo> myPojoList) {
    String sql = "INSERT INTO MY_TABLE (FIELD_1, FIELD_2, FIELD_3) VALUES (?, ?, ?)";
    getJdbcTemplate().batchUpdate(sql, new BatchPreparedStatementSetter() {

        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            MyPojo myPojo = myPojoList.get(i);
            ps.setString(1, myPojo.getField1());
            ps.setString(2, myPojo.getField2());
            ps.setString(3, myPojo.getField3());
        }

        @Override
        public int getBatchSize() {
            return myPojoList.size();
        }
    });
}

2 Comments

I got "Parameter index out of range (1 > number of parameters, which is 0)" when using :namedParameters instead of ?. Is there an update for using this with NamedParameterJdbcTemplate? (A sketch follows these comments.)
I think this solution doesn't meet the 1st criterion of the question. From the testing I've done, and from the Javadoc for BatchPreparedStatementSetter, this approach simply makes multiple calls with the same SQL string: "sets values on a PreparedStatement provided by the JdbcTemplate class, for each of a number of updates in a batch using the same SQL." This is problematic both for performance and for the atomicity of the data: if one of the INSERT statements fails with an "out of range" exception, we'd want either all of the data to be written or none of it.
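Regarding the named-parameter question in the first comment: a rough sketch, assuming a NamedParameterJdbcTemplate is available and that MyPojo exposes field1/field2/field3 as bean properties. Note that, like the answer above, this still executes the statement once per batch entry, so it doesn't satisfy the single-statement criterion either:

public void insertListOfPojos(List<MyPojo> myPojoList) {
    String sql = "INSERT INTO MY_TABLE (FIELD_1, FIELD_2, FIELD_3) "
            + "VALUES (:field1, :field2, :field3)";
    // createBatch maps each POJO's bean properties onto the named placeholders
    SqlParameterSource[] batch = SqlParameterSourceUtils.createBatch(myPojoList.toArray());
    namedParameterJdbcTemplate.batchUpdate(sql, batch);
}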

It looks to me that the batchUpdate() method of JdbcTemplate could be helpful in this case (copied from http://www.mkyong.com/spring/spring-jdbctemplate-batchupdate-example/):

// insert batch example
public void insertBatch(final List<Customer> customers) {
    String sql = "INSERT INTO CUSTOMER (CUST_ID, NAME, AGE) VALUES (?, ?, ?)";
    getJdbcTemplate().batchUpdate(sql, new BatchPreparedStatementSetter() {

        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            Customer customer = customers.get(i);
            ps.setLong(1, customer.getCustId());
            ps.setString(2, customer.getName());
            ps.setInt(3, customer.getAge());
        }

        @Override
        public int getBatchSize() {
            return customers.size();
        }
    });
}

1 Comment

Thanks a lot, this is exactly what I was looking for!!

Multirow inserts (using "row value constructors") are in fact part of the SQL-92 standard. See http://en.wikipedia.org/wiki/Insert_(SQL)#Multirow_inserts.

Some databases do not support this syntax, but many do. In my experience Derby/Cloudscape, DB2, PostgreSQL and the newer Hypersonic 2.*+ releases do support this.

Your concern about getting this to work as a PreparedStatement is understandable, but I've seen similar cases where Spring JDBC does automatically handle a Collection of items for certain queries (like where in (?)), though I cannot vouch for this case.
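For what it's worth, the collection handling mentioned here is a NamedParameterJdbcTemplate feature for IN clauses rather than for multi-row VALUES lists; a small illustration, with the id column and query made up for the example:

// A List bound to a named parameter is expanded to "IN (?, ?, ?)"
List<Long> ids = Arrays.asList(1L, 2L, 3L);
List<String> foos = namedParameterJdbcTemplate.queryForList(
        "SELECT foo FROM myTable WHERE id IN (:ids)",
        Collections.singletonMap("ids", ids), String.class);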

I did find some possibly helpful information (I can't add a second link to this post; the URL is in the comment below).

I can tell you that it's probably not possible for your second requirement (works for any number of rows) to be met in the strictest sense: every database I've used imposes query length limitations that would come into play.
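If you do go the single-statement route, one way to live with those limits is to cap how many rows go into each generated statement and loop over chunks; a rough sketch, assuming a multi-row insert helper like the one sketched in the question (insertAll() is hypothetical):

// Keeps each generated INSERT under the server's query-size limits
// (e.g. MySQL's max_allowed_packet) by splitting the list into chunks.
private static final int CHUNK_SIZE = 500; // tune for your row width

public void insertInChunks(List<Pojo> pojos) {
    for (int start = 0; start < pojos.size(); start += CHUNK_SIZE) {
        int end = Math.min(start + CHUNK_SIZE, pojos.size());
        insertAll(pojos.subList(start, end)); // one multi-row INSERT per chunk
    }
}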

1 Comment

The URL that SO did not allow me to post above was: fusesource.com/docs/router/2.2/transactions/…

You can also try jdbcInsert.executeBatch(sqlParamSourceArray):

// define parameters
SimpleJdbcInsert jdbcInsert = new SimpleJdbcInsert(jdbcTemplate);
jdbcInsert.withTableName("TABLE_NAME");
SqlParameterSource[] sqlParamSourceArray = new SqlParameterSource[myCollection.size()];
for (int i = 0; i < myCollection.size(); i++) {
    sqlParamSourceArray[i] = new MapSqlParameterSource()
            .addValue("COL1", myCollection.get(i)); // value for COL1
    // ...................... (add the remaining columns the same way)
}
// execute insert
int[] keys = jdbcInsert.executeBatch(sqlParamSourceArray);

1 Comment

This answer is false! jdbcInsert.executeBatch() does NOT return the keys. It returns the array of number of rows affected as returned by the JDBC driver. See the javadocs. docs.spring.io/spring-framework/docs/current/javadoc-api/org/…
