Be careful with foreach when batch-inserting thousands of rows in MyBatis

Recently, a long-running Job in our project was found to have excessively high CPU usage. Its batch insert was written with MyBatis's <foreach>:
<insert id="batchInsert" parameterType="java.util.List">
  insert into USER (id, name) values
  <foreach collection="list" item="model" index="index" separator=",">
    (#{model.id}, #{model.name})
  </foreach>
</insert>
The way this pattern speeds up batch inserts is that it turns the traditional:
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2");
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2");
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2");
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2");
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2");
into:
INSERT INTO `table1` (`field1`, `field2`) VALUES ("data1", "data2"),
("data1", "data2"),
("data1", "data2"),
("data1", "data2"),
("data1", "data2");
The MySQL docs (https://dev.mysql.com/doc/refman/5.6/en/insert-optimization.html) mention this trick as well: to optimize insert speed, combine many small operations into one large operation. Ideally, you open a single connection, send the data for many new rows at once, and delay all index updates and consistency checking until the very end.
At first glance this foreach looks fine, but in practice we found that when the table has many columns (20+) and many rows are inserted at once (5000+), the insert takes an extremely long time: 14 minutes in our case, which is unacceptable. This answer (https://stackoverflow.com/questions/19682414/how-can-mysql-insert-millions-records-fast) also notes:
Of course don't combine ALL of them, if the amount is HUGE. Say you have 1000 rows you need to insert, then don't do it one at a time. You shouldn't equally try to have all 1000 rows in a single query. Instead break it into smaller sizes.
It stresses that when inserting many rows, you should not put them all into a single statement. But why not? And why does that one statement take so long? Digging into this answer (https://stackoverflow.com/questions/32649759/using-foreach-to-do-batch-insert-with-mybatis/40608353), I found:
「Insert inside Mybatis foreach is not batch」, this is a single (could become giant) SQL statement and that brings drawbacks:
- some database such as Oracle here does not support.
- in relevant cases: there will be a large number of records to insert and the database configured limit (by default around 2000 parameters per statement) will be hit, and eventually possibly DB stack error if the statement itself become too large.
Iteration over the collection must not be done in the mybatis XML. Just execute a simple Insert statement in a Java Foreach loop. 「The most important thing is the session Executor type」.
SqlSession session = sessionFactory.openSession(ExecutorType.BATCH);
for (Model model : list) {
    session.insert("insertStatement", model);
}
session.flushStatements();
Unlike default ExecutorType.SIMPLE, the statement will be prepared once and executed for each record to insert.
As explained in this post (https://blog.csdn.net/wlwlwlwl015/article/details/50246717), the default executor type is Simple, which creates a new prepared statement, i.e. a new 「PreparedStatement」 object, for every statement it executes. Our project calls this batch-insert method over and over, and because MyBatis cannot cache statements that contain <foreach>, the SQL has to be re-parsed on every call.
Internally, it still generates the same single insert statement with many placeholders as the JDBC code above.
MyBatis has an ability to cache PreparedStatement, but this statement cannot be cached because it contains <foreach /> element and the statement varies depending on the parameters.
As a result, MyBatis has to 1) evaluate the foreach part and 2) parse the statement string to build parameter mapping [1] on every execution of this statement.
And these steps are relatively costly process when the statement string is big and contains many placeholders.
[1] simply put, it is a mapping between placeholders and the parameters.
The article above (http://blog.harawata.net/2016/04/bulk-insert-multi-row-vs-batch-using.html) shows where the time goes: with 5000+ value rows after the foreach, the PreparedStatement is extremely long and contains a huge number of placeholders, and mapping those placeholders to parameters is especially costly. Moreover, according to this benchmark (https://www.red-gate.com/simple-talk/sql/performance/comparing-multiple-rows-insert-vs-single-row-insert-with-three-data-load-methods), parse time grows exponentially with the number of value rows.
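To put numbers on the mapping cost, here is a quick back-of-the-envelope calculation using the figures from our project (20 columns, 5000 rows; the class name is illustrative):

```java
public class PlaceholderCount {
    // Each (#{...}) in the <foreach> expands to one JDBC "?" placeholder,
    // so a single multi-row statement carries rows * cols placeholders,
    // and MyBatis rebuilds the placeholder-to-parameter mapping for all
    // of them on every execution.
    static int placeholders(int rows, int cols) {
        return rows * cols;
    }

    public static void main(String[] args) {
        // 20+ columns and 5000+ rows, as in the problem described above.
        System.out.println(placeholders(5000, 20)); // 100000
    }
}
```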

So if you insist on using foreach for batch inserts, consider reducing the number of value rows per insert statement, ideally hitting the bottom of that benchmark's curve, where insertion is fastest. As a rule of thumb (https://stackoverflow.com/questions/7004390/java-batch-insert-into-mysql-very-slow), 20~50 rows per statement is a reasonable choice, and the time cost is acceptable.
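The chunking above can be sketched as follows; the BatchChunker class and the 50-row chunk size are illustrative choices, not code from the original project:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunker {
    // Split a list into chunks of at most `size` elements, so that each
    // generated INSERT carries a bounded number of value rows.
    static <T> List<List<T>> partition(List<T> list, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < list.size(); i += size) {
            chunks.add(list.subList(i, Math.min(i + size, list.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 5120; i++) rows.add(i);
        // 5120 rows in chunks of 50 -> 102 full chunks plus one of 20,
        // i.e. 103 small INSERTs instead of one giant statement. Each chunk
        // would then be passed to the batchInsert mapper method, e.g.:
        //   for (List<User> chunk : partition(users, 50)) mapper.batchInsert(chunk);
        System.out.println(partition(rows, 50).size()); // 103
    }
}
```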
Now for the key point. Everything above is about squeezing performance out of <foreach> if you insist on it. The MyBatis documentation actually recommends a different approach for batch inserts (see the 「Batch Insert Support」 section of http://www.mybatis.org/mybatis-dynamic-sql/docs/insert.html):
SqlSession session = sqlSessionFactory.openSession(ExecutorType.BATCH);
try {
    SimpleTableMapper mapper = session.getMapper(SimpleTableMapper.class);
    List<SimpleTableRecord> records = getRecordsToInsert(); // not shown
    BatchInsert<SimpleTableRecord> batchInsert = insert(records)
            .into(simpleTable)
            .map(id).toProperty("id")
            .map(firstName).toProperty("firstName")
            .map(lastName).toProperty("lastName")
            .map(birthDate).toProperty("birthDate")
            .map(employed).toProperty("employed")
            .map(occupation).toProperty("occupation")
            .build()
            .render(RenderingStrategy.MYBATIS3);
    batchInsert.insertStatements().stream().forEach(mapper::insert);
    session.commit();
} finally {
    session.close();
}
The basic idea is to set the MyBatis session's executor type to 「Batch」 and then execute the insert statement repeatedly, much like the following JDBC code:
Connection connection = DriverManager.getConnection(
        "jdbc:mysql://127.0.0.1:3306/mydb?useUnicode=true&characterEncoding=UTF-8"
        + "&useServerPrepStmts=false&rewriteBatchedStatements=true",
        "root", "root");
connection.setAutoCommit(false);
PreparedStatement ps = connection.prepareStatement(
        "insert into tb_user (name) values (?)");
for (int i = 0; i < stuNum; i++) { // stuNum: number of rows to insert
    ps.setString(1, name);         // name: the value for this row
    ps.addBatch();
}
ps.executeBatch();
connection.commit();
connection.close();
In our tests, switching to the ExecutorType.BATCH insert style improved performance dramatically: all rows were inserted in under 2 seconds.
To sum up: for batch inserts with MyBatis, prefer the ExecutorType.BATCH approach; if you must use <foreach>, keep each insert statement to roughly 20~50 rows.
