Oracle's documentation has a sample on how to insert large BLOBs, in Java. Is it possible to do the same in C#?
Accepted answer
An answer using a stream:
```csharp
string path = @"D:\testfile.txt";
byte[] data;
using (FileStream myStream = new FileStream(path, FileMode.Open))
using (BinaryReader binaryReader = new BinaryReader(myStream))
{
    data = binaryReader.ReadBytes((int)myStream.Length); // read the whole file into a byte array
}

string sql = "INSERT INTO testblob (testid, testblob) VALUES (100, :blobtodb)";
OracleCommand cmd = new OracleCommand();
cmd.CommandText = sql;                // set the SQL command
cmd.Connection = con;                 // con is an OracleConnection; create and open it beforehand
OracleParameter param = cmd.Parameters.Add("blobtodb", OracleDbType.Blob); // parameter for the BLOB column
param.Direction = ParameterDirection.Input;
param.Value = data;                   // assign the byte array to the parameter
cmd.ExecuteNonQuery();                // you are done!
```
This should work with every stream ;)
For files of (over) 1 TB:
WARNING: For big files, use the OP's streaming solution, as it is really hard to hold 1 TB in memory ;)
Honestly, I don't know; I can't try it here with such a big file. However, Oracle specifies BLOBs with: Maximum size: (4 GB - 1) * DB_BLOCK_SIZE initialization parameter (8 TB to 128 TB). So from the database side it should work. But you may need to take care of the Connection Timeout and Connection Lifetime settings, as transferring such a big file over the network will take a long time (I guess; I don't know your setup).
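To avoid holding the whole file in memory, one option is to stream the file into a temporary server-side LOB in chunks and bind that as the parameter instead of a byte array. A rough sketch, assuming ODP.NET (Oracle.ManagedDataAccess) with an already-open OracleConnection named `con`; OracleBlob derives from Stream, so Stream.CopyTo can push the file through in fixed-size buffers:

```csharp
using System.IO;
using Oracle.ManagedDataAccess.Client;
using Oracle.ManagedDataAccess.Types;

// Assumes `con` is an open OracleConnection (not shown here).
using (FileStream file = File.OpenRead(@"D:\testfile.txt"))
using (OracleBlob blob = new OracleBlob(con))   // temporary LOB on the server
{
    file.CopyTo(blob, 81920);                   // copy in ~80 KB chunks; file is never fully in memory

    using (OracleCommand cmd = new OracleCommand(
        "INSERT INTO testblob (testid, testblob) VALUES (101, :blobtodb)", con))
    {
        cmd.Parameters.Add("blobtodb", OracleDbType.Blob).Value = blob;
        cmd.ExecuteNonQuery();
    }
}
```

This is only a sketch; the exact behavior (temporary LOB lifetime, transaction handling) depends on your ODP.NET version, so check the OracleBlob documentation before relying on it for multi-terabyte transfers.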