I have to insert about 2 million rows from a text file.
And with inserting I have to create some master tables.
What is the best and fastest way to insert such a large set of data into SQL Server?
Solution: I think it's better to read the data from the text file into a DataSet.
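A minimal sketch of that first step, assuming a tab-delimited file; the file path, column names, and types below are placeholders you would adapt to your own data:

```csharp
using System;
using System.Data;
using System.IO;

class TextFileLoader
{
    // Load a delimited text file into a DataTable (sketch; adjust the
    // delimiter and column definitions to match your actual file).
    public static DataTable LoadTextFile(string filePath)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));      // hypothetical columns
        table.Columns.Add("Name", typeof(string));

        // File.ReadLines streams the file line by line instead of
        // reading it all into memory at once.
        foreach (string line in File.ReadLines(filePath))
        {
            string[] fields = line.Split('\t');
            table.Rows.Add(int.Parse(fields[0]), fields[1]);
        }
        return table;
    }
}
```

The resulting DataTable can then be handed directly to SqlBulkCopy, as shown in the answer.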
Try out SqlBulkCopy - Bulk Insert into SQL from C# App
// connect to SQL
using (SqlConnection connection = new SqlConnection(connString))
{
    // make sure to enable triggers
    // more on triggers in next post
    SqlBulkCopy bulkCopy = new SqlBulkCopy
    (
        connection,
        SqlBulkCopyOptions.TableLock |
        SqlBulkCopyOptions.FireTriggers |
        SqlBulkCopyOptions.UseInternalTransaction,
        null
    );

    // set the destination table name
    bulkCopy.DestinationTableName = this.tableName;
    connection.Open();

    // write the data in the "dataTable"
    bulkCopy.WriteToServer(dataTable);
    connection.Close();
}
// reset
this.dataTable.Clear();

or
After doing step 1 at the top,
you can check this article for details: Bulk Insertion of Data Using C# DataTable and SQL Server OpenXML function.
But it's not tested with 2 million records; it should work, but it will consume memory on the machine, since you have to load all 2 million records into memory before inserting them.
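One way to limit that memory use (a sketch of my own, not taken from the article) is to stream the file and flush the DataTable to the server in batches, so only one batch is ever held in memory. The connection string, table name, and column layout here are placeholders:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BatchedBulkInsert
{
    // Insert a large tab-delimited file in batches of ~10,000 rows.
    // "MyTable" and the column definitions are hypothetical.
    public static void Run(string filePath, string connString)
    {
        const int batchSize = 10000;
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));

        using (var connection = new SqlConnection(connString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "MyTable";

                foreach (string line in File.ReadLines(filePath))
                {
                    string[] fields = line.Split('\t');
                    table.Rows.Add(int.Parse(fields[0]), fields[1]);

                    if (table.Rows.Count >= batchSize)
                    {
                        bulkCopy.WriteToServer(table);
                        table.Clear();   // free the batch before reading more
                    }
                }

                if (table.Rows.Count > 0)        // flush the final partial batch
                    bulkCopy.WriteToServer(table);
            }
        }
    }
}
```

With this shape, memory stays roughly constant regardless of how many rows the file contains, at the cost of multiple WriteToServer round trips.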