Java JDBC: inserting multiple rows

pkln4tw6 · asked on 2023-05-05 · Java
Follow (0) | Answers (7) | Views (164)

I am currently using a batch:

String query = "INSERT INTO table (id, name, value) VALUES (?, ?, ?)";
PreparedStatement ps = connection.prepareStatement(query);            
for (Record record : records) {
    ps.setInt(1, record.id);
    ps.setString(2, record.name);
    ps.setInt(3, record.value);
    ps.addBatch();
}
ps.executeBatch();

I just want to know whether the code above is equivalent to the code below, and if not, which one is faster.

String query = "INSERT INTO table (id, name, value) VALUES ";
for (Record record : records) {
    query += "(" + record.id + ",'" + record.name + "'," + record.value + "),";
}
query = query.substring(0, query.length() - 1); // strip the trailing comma
PreparedStatement ps = connection.prepareStatement(query);
ps.executeUpdate();

9rygscc1 #1

Turn off auto-commit.

As long as you set auto-commit to false, executeBatch will outperform executeUpdate.

connection.setAutoCommit(false);  
PreparedStatement ps = connection.prepareStatement(query);            
for (Record record : records) {
    // etc.
    ps.addBatch();
}
ps.executeBatch();
connection.commit();
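
One caveat to add to the pattern above: if any statement in the batch fails you will usually want to roll back, and afterwards restore auto-commit. A minimal sketch of that pattern, reusing the connection, query and records from the question:

connection.setAutoCommit(false);
try (PreparedStatement ps = connection.prepareStatement(query)) {
    for (Record record : records) {
        ps.setInt(1, record.id);
        ps.setString(2, record.name);
        ps.setInt(3, record.value);
        ps.addBatch();
    }
    ps.executeBatch();
    connection.commit();               // make the whole batch visible at once
} catch (SQLException e) {
    connection.rollback();             // undo the partially executed batch
    throw e;
} finally {
    connection.setAutoCommit(true);    // restore the default mode
}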

vom3gejh #2

First, with query string concatenation you not only lose the native type mapping of the PreparedStatement setters, you are also vulnerable to malicious input being executed in the database (SQL injection).
Second, PreparedStatements are cached by the database itself, which already gives a very good performance improvement over plain Statements.
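
To make the first point concrete: with concatenation, a value that happens to contain a quote either breaks the statement or lets an attacker change its meaning, while a bound parameter is always sent as data. A small illustration, using the table and columns from the question:

String name = "O'Brien');DROP TABLE table;--";   // hostile, or merely awkward, input

// Concatenation: the quote ends the string literal and the rest is parsed as SQL.
String unsafe = "INSERT INTO table (id, name, value) VALUES (1,'" + name + "',10)";

// Parameter binding: the driver sends the value as data, quotes and all.
PreparedStatement ps = connection.prepareStatement(
        "INSERT INTO table (id, name, value) VALUES (?, ?, ?)");
ps.setInt(1, 1);
ps.setString(2, name);   // stored literally, no SQL is executed from it
ps.setInt(3, 10);
ps.executeUpdate();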


tjjdgumg #3

If the number of items to insert is large, you may run into serious performance problems. It is therefore safer to define a batch size and execute the accumulated statements whenever that size is reached.
Something like the following sample code should work. For the full story on how to use this effectively, see this link.

private static void insertList2DB(List<String> list) {
    final int batchSize = 1000; // batch size is important
    Connection conn = getConnection();
    PreparedStatement ps = null;
    try {
        String sql = "INSERT INTO theTable (aColumn) VALUES (?)";
        ps = conn.prepareStatement(sql);

        int insertCount = 0;
        for (String item : list) {
            ps.setString(1, item);
            ps.addBatch();
            if (++insertCount % batchSize == 0) {
                ps.executeBatch(); // flush a full batch
            }
        }
        ps.executeBatch(); // flush the remaining, partially filled batch

    } catch (SQLException e) {
        e.printStackTrace();
        System.exit(1);
    } finally {
        try {
            if (ps != null) {
                ps.close();
            }
            conn.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
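
Assuming the getConnection() helper used in the snippet, calling it is then simply:

// Inserts three rows into theTable.aColumn, flushing a full batch every 1000 items.
insertList2DB(Arrays.asList("apple", "banana", "cherry"));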

8zzbczxx #4

If the number of records is at most 1000, the following code is better than both of yours:

StringBuilder query = new StringBuilder("INSERT INTO table (id, name, value) VALUES ");

if (records.size() <= 1000) {

    for (int i = 0; i < records.size(); i++)
        query.append("(?, ?, ?), ");

    // drop the trailing ", "
    query = new StringBuilder(query.substring(0, query.length() - 2));

    PreparedStatement ps = connection.prepareStatement(query.toString());

    for (int i = 0; i < records.size(); i++) {
        Record record = records.get(i);
        ps.setInt((i * 3) + 1, record.id);
        ps.setString((i * 3) + 2, record.name);
        ps.setInt((i * 3) + 3, record.value);
    }

    ps.executeUpdate();

}

This way you still use a PreparedStatement, but you build it dynamically with multiple VALUES clauses in a single INSERT query, based on the size of the record list.
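
If the list can be larger than 1000 rows, the same idea still works by splitting it into chunks and building one multi-value INSERT per chunk. A rough sketch, where the chunk size of 1000 is just the limit assumed above:

final int chunkSize = 1000;
for (int start = 0; start < records.size(); start += chunkSize) {
    List<Record> chunk = records.subList(start, Math.min(start + chunkSize, records.size()));

    // Build "INSERT ... VALUES (?, ?, ?), (?, ?, ?), ..." with one group per record.
    StringBuilder sql = new StringBuilder("INSERT INTO table (id, name, value) VALUES ");
    for (int i = 0; i < chunk.size(); i++) {
        sql.append(i == 0 ? "(?, ?, ?)" : ", (?, ?, ?)");
    }

    try (PreparedStatement ps = connection.prepareStatement(sql.toString())) {
        for (int i = 0; i < chunk.size(); i++) {
            Record r = chunk.get(i);
            ps.setInt(i * 3 + 1, r.id);
            ps.setString(i * 3 + 2, r.name);
            ps.setInt(i * 3 + 3, r.value);
        }
        ps.executeUpdate();
    }
}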


e0bqpujr #5

// Uses Spring's JdbcTemplate (GenericDAO is this answer's own helper class).
public void insertBatch(final List<Record> records) {

    String query = "INSERT INTO table (id, name, value) VALUES (?, ?, ?)";

    GenericDAO.getJdbcTemplate().batchUpdate(query, new BatchPreparedStatementSetter() {

        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            Record record = records.get(i);
            ps.setInt(1, record.id);
            ps.setString(2, record.name);
            ps.setInt(3, record.value);
        }

        @Override
        public int getBatchSize() {
            return records.size();
        }
    });
}

7fyelxc5 #6

I found this question while trying to upload a large CSV to my remote database. In my experience the two are not the same. The first approach, using a batch, is the safer solution, but I also found it to be much slower.
I recognize this is not the recommended way to implement anything that accepts user input, but for my use case it was the first tractable solution I found.
My use case was this: I had a CSV with 21 million rows that I wanted to load into my database, and I found that a prepared statement batch was much slower. As far as I can tell, that is because each insert statement is added to the batch individually, so a batch of 1000 executes 1000 insert statements. Whatever the cause, it was taking about 30 seconds per batch for me with a batch size of 1000. With 21 million rows, that would have taken two days. So I deliberately did something unsafe that worked much faster.
Done this way, each batch of 10,000 took about 0.25 seconds. That runs enough (1000x) faster than the other way that I thought I would share it with people looking for the same answer I was.
For reference, the CSV file I used was downloaded from https://fdc.nal.usda.gov/download-datasets.html.
The readCSV function comes from Jeronimo Backes's solution to Slow CSV row parsing and splitting: https://mvnrepository.com/artifact/com.univocity/univocity-parsers

Again, for anyone worried about injection attacks: do not do this for user input. I used it to quickly read a CSV file and load it into a MySQL database.
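
One side note before the code: if the target database is MySQL, much of the batch slowness described above can often be recovered without giving up parameter binding, because MySQL Connector/J can rewrite a batch of single-row INSERTs into multi-value INSERTs when the rewriteBatchedStatements connection property is enabled. The URL below is only an illustration; host, schema and credentials are placeholders:

String url = "jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true";
Connection con = DriverManager.getConnection(url, "user", "password");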

public static void writeFileToCSV(String filename) {
    

        PreparedStatement stmt = null;
        Connection con = null;
        Integer insertCount = 0;
        Integer batchSize = 10000;
        Integer numBatchesSent = 0;
        IterableResult<String[], ParsingContext> rows = readCSV(filename);
        
    
        try {
            con = mySqlDatabase.getConnection();
            con.setAutoCommit(true); //I found that this did not significantly change the speed of the queries because there was not one query per row.
    
            Integer numInBatch = 0;
            ArrayList<Object> values = new ArrayList<>();
            String sqlStart = "INSERT IGNORE INTO food_nutrient(id, fdc_id, nutrient_id, amount) VALUES ";
            String sqlEnd = " ;";

            StringBuilder sqlBuilder = new StringBuilder();
            StringBuilder valuesBuilder = new StringBuilder();
    
            Integer lineNum = 0;
    
//This is my manual parsing of the csv columns. There may be a slicker way to do this
            for (String[] nextLine : rows) {

                if (lineNum == 0) { //This ignores the header row of the csv.
                    lineNum++;
                    continue;
                }
                
                Integer id = Integer.parseInt(nextLine[0]);
                Integer fdc_id = Integer.parseInt(nextLine[1]);
                Integer nutrient_id = Integer.parseInt(nextLine[2]);
                Double amount = parseDouble(nextLine[3]);
                if (valuesBuilder.length() > 0) {
                    valuesBuilder.append(',');
                }
                valuesBuilder.append("(");
                valuesBuilder.append(id);
                valuesBuilder.append(',');
    
                if (fdc_id != null) {
                    valuesBuilder.append(fdc_id);
                    valuesBuilder.append(',');
                } else {
                    valuesBuilder.append("NULL,");
                }
                if (nutrient_id != null) {
                    valuesBuilder.append(nutrient_id);
                    valuesBuilder.append(',');
                } else {
                    valuesBuilder.append("NULL,");
                }
                if (amount != null) {
                    valuesBuilder.append(amount);
                    valuesBuilder.append(')');
                } else {
                    valuesBuilder.append("NULL)");
                }
    
    
    
                if (++insertCount % batchSize == 0) {
                    
                    sqlBuilder.append(sqlStart);
                    sqlBuilder.append(valuesBuilder);
                    sqlBuilder.append(sqlEnd);
                    stmt = con.prepareStatement(sqlBuilder.toString());
                    sqlBuilder = new StringBuilder();
                    valuesBuilder = new StringBuilder();
                    stmt.executeUpdate();
                    numBatchesSent++;
    
                    System.out.println("Sent batch "+ numBatchesSent + " with " + batchSize + " new rows.");
                    
                } //: send the batch
            }//: For each row in the csv
    
            //Send the values that were in the last batch.
            sqlBuilder.append(sqlStart);
            sqlBuilder.append(valuesBuilder);
            sqlBuilder.append(sqlEnd);
            stmt = con.prepareStatement(sqlBuilder.toString());
            sqlBuilder = new StringBuilder();
            valuesBuilder = new StringBuilder();
            stmt.executeUpdate();
            
    
        } catch (SQLException ex) {
            System.out.println(ex.getMessage());
            ex.printStackTrace();
        } catch (DataAccessException ex) {
            ex.printStackTrace();
        }
    
        finally {
            // The last partial batch was already flushed above, so only clean-up remains here.
            try {
                if (stmt != null) {
                    stmt.close();
                }
                if (con != null) {
                    con.close();
                }
                rows.getContext().stop();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    
    }
    

    public static IterableResult<String[], ParsingContext> readCSV(String filePath) {
        File file = new File(filePath);
        //configure the parser here. By default all values are trimmed
        CsvParserSettings parserSettings = new CsvParserSettings();
        //create the parser
        CsvParser parser = new CsvParser(parserSettings);
        //create an iterable over rows. This will not load everything into memory.
        IterableResult<String[], ParsingContext> rows = parser.iterate(file);
        return rows;
    }
    

  //This just takes care of NumberFormatExceptions that I had been getting.
    private static Double parseDouble(String str) {
        try {
            Double value=Double.parseDouble(str);
            return value;
        } catch (NumberFormatException ex) {
            System.out.println("There was probably a null value");
            return null;
        }
    }

a2mppw5e #7

I think this would work:

String query = "INSERT INTO table (id, name, value) VALUES ";
for (Record record : records)
{
query += "(" + record.id + ",'" + record.name + "'," + record.value + "),";
query = query.substring(1, query.length() - 1);
PreparedStatement ps = connection.prepareStatement(query);
ps.executeUpdate();
}

because you have to execute a query for each record you want to insert into the database.
