I'm using a chunk step with a reader and a writer. I'm reading data from Hive with a chunk size of 50,000 and inserting into MySQL with the same 50,000 commit interval.
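For reference, the step is wired up roughly like this (the reader bean name `hiveItemReader` is just a placeholder for my actual Hive reader, not the exact code):

    // Sketch of the chunk-oriented step inside the batch @Configuration class;
    // the reader bean name is illustrative.
    @Bean
    public Step identityLoadStep(StepBuilderFactory stepBuilderFactory,
                                 ItemReader<Identity> hiveItemReader,
                                 JdbcBatchItemWriter<Identity> writer) {
        return stepBuilderFactory.get("identityLoadStep")
                .<Identity, Identity>chunk(50000)   // commit interval: 50,000 items per transaction
                .reader(hiveItemReader)
                .writer(writer)
                .build();
    }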
    @Bean
    public JdbcBatchItemWriter<Identity> writer(DataSource mysqlDataSource) {
        return new JdbcBatchItemWriterBuilder<Identity>()
                // maps Identity bean properties to the named parameters in insertSql via reflection
                .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
                .sql(insertSql)
                .dataSource(mysqlDataSource)
                .build();
    }

When I start the data load, the inserts into MySQL commit very slowly: 100,000 records take more than an hour to load, while the same loader with Gemfire loads 5 million records in 30 minutes.
It seems like it inserts rows one by one instead of in a batch, since it loads 1,500 records, then 4,000, and so on. Has anyone faced the same issue?
What is the insertSql that you are using? Regarding "seems like it inserts one by one instead of batch": the JdbcBatchItemWriter does not insert items one by one, it inserts them with a JDBC batch update in a single transaction: github.com/spring-projects/spring-batch/blob/… Similar to the answer by @Binu, try to write a custom ItemSqlParameterSourceProvider that does not use reflection and see if it improves performance.
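For example, here is a minimal sketch of such a provider, assuming the Identity class exposes getId() and getName() getters and that insertSql uses matching named parameters (:id, :name); the field names are purely illustrative:

    import org.springframework.batch.item.database.ItemSqlParameterSourceProvider;
    import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
    import org.springframework.jdbc.core.namedparam.SqlParameterSource;

    // Maps Identity fields to named SQL parameters explicitly, avoiding the
    // per-item reflection done by BeanPropertyItemSqlParameterSourceProvider.
    // The field names (id, name) are assumptions for illustration only.
    public class IdentitySqlParameterSourceProvider
            implements ItemSqlParameterSourceProvider<Identity> {

        @Override
        public SqlParameterSource createSqlParameterSource(Identity item) {
            return new MapSqlParameterSource()
                    .addValue("id", item.getId())
                    .addValue("name", item.getName());
        }
    }

It would then replace the BeanPropertyItemSqlParameterSourceProvider in the writer builder: .itemSqlParameterSourceProvider(new IdentitySqlParameterSourceProvider()).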