I'm really confused here.
I'm importing about 2 million rows into my Azure SQL database. Since I use a merge technique to insert only the rows that aren't duplicates, I create a temporary table and load my values into it. My code and scripts are shown below.
public async Task BulkImportWithoutDuplicates(DataTable reader)
{
    var tableName = "##tempImport";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("", connection))
    {
        try
        {
            connection.Open();

            // Create the global temp table on the server
            command.CommandText = Scripts.GetTempTableScript();
            command.ExecuteNonQuery();

            // Bulk insert into the temp table
            using (var b = new SqlBulkCopy(connection.ConnectionString, SqlBulkCopyOptions.TableLock))
            {
                b.BulkCopyTimeout = 0;
                b.BatchSize = reader.Rows.Count;
                b.DestinationTableName = tableName;
                await b.WriteToServerAsync(reader);
            }

            // Merge the temp table into the destination table
            command.CommandText = Scripts.GetMergeScript();
            var rows = command.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            // Handle exception properly
        }
        finally
        {
            connection.Close();
        }
    }
}
public static string GetTempTableScript()
{
    return @"
        IF OBJECT_ID('tempdb.dbo.##tempImport', 'U') IS NOT NULL
        BEGIN
            DROP TABLE ##tempImport;
        END
        CREATE TABLE ##tempImport ( ... all the columns );";
}
public static string GetMergeScript()
{
    return @"MERGE INTO dbo.Data AS target
        USING ##tempImport AS source
        ON (source.TransactionId = target.TransactionId AND source.UserId = target.UserId)
        WHEN NOT MATCHED THEN
            INSERT (Start, Spend, UserId, Product, Shop, ClientId, UploadDataId, UniqueId, TransactionId, q, cq, c2)
            VALUES (source.Start, source.Spend, source.UserId, source.Product, source.Shop,
                    source.ClientId, source.UploadDataId, source.UniqueId, source.TransactionId, source.q, source.c1, source.c2);";
}
I really don't understand why it takes so long to complete. I waited 24 minutes just for the rows to be inserted into the temp table alone.
I was reading this article, and it seems it shouldn't take that long: https://www.adathedev.co.uk/2011/01/sqlbulkcopy-to-sql-server-in-parallel.html?m=1
What am I doing wrong? How can I improve the speed?
I've tried both IDataReader and DataTable, but neither worked for me...
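For reference, this is a minimal sketch of the SqlBulkCopy configuration I would expect to load faster: an IDataReader source with EnableStreaming turned on and a fixed batch size, instead of sending all 2 million rows as one batch. The connection string and class names here are placeholders, not my actual code.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static class BulkLoader
{
    // Placeholder; substitute the real Azure SQL connection string.
    private const string ConnectionString = "Server=...;Database=...;";

    public static async Task LoadTempTableAsync(IDataReader reader)
    {
        using (var bulk = new SqlBulkCopy(ConnectionString, SqlBulkCopyOptions.TableLock))
        {
            bulk.DestinationTableName = "##tempImport";
            bulk.BulkCopyTimeout = 0;     // no timeout for large loads
            bulk.BatchSize = 10000;       // commit in chunks rather than one huge batch
            bulk.EnableStreaming = true;  // stream rows from the reader instead of buffering them all
            await bulk.WriteToServerAsync(reader);
        }
    }
}
```

Would chunked batches plus streaming like this make a real difference on Azure SQL, or is the bottleneck elsewhere?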