Inserting a large object into PostgreSQL returns a 53200 'out of memory' error

Problem Description


PostgreSQL 9.1, Npgsql 2.0.12

I have binary data I want to store in a PostgreSQL database. Most files load fine; however, a large (664 MB) binary file is causing problems. When trying to load the file into PostgreSQL using large object support through Npgsql, the server returns an 'out of memory' error (SQLSTATE 53200).

I'm currently running this on a workstation with 4 GB of RAM, of which 2 GB is free while PostgreSQL sits idle.

This is the code I am using, adapted from the PG Foundry Npgsql User's Manual:

using (var transaction = connection.BeginTransaction())
{
    try
    {
        // Create a new large object and open it for writing.
        var manager = new NpgsqlTypes.LargeObjectManager(connection);
        var noid = manager.Create(NpgsqlTypes.LargeObjectManager.READWRITE);
        var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);

        // Write the entire buffer in one call -- this is what fails for the 664 MB file.
        lo.Write(BinaryData);
        lo.Close();

        transaction.Commit();
        return noid;
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}

I've tried adjusting PostgreSQL's memory settings from their defaults to all manner of values (a sample postgresql.conf excerpt follows the list):

  • shared_buffers
  • work_mem
  • maintenance_work_mem
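For reference, these parameters live in postgresql.conf (a reload, or for shared_buffers a restart, is needed for changes to take effect). A sample excerpt with purely illustrative values, not a tuning recommendation:

# postgresql.conf -- illustrative values only
shared_buffers = 512MB          # shared page cache
work_mem = 64MB                 # per-operation sort/hash memory
maintenance_work_mem = 256MB    # memory for VACUUM, CREATE INDEX, etc.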

So far I've found PostgreSQL to be a great database system, but this is a show-stopper at present: I can't seem to get a file of this size into the database. I'd rather not have to deal with manually chopping the file into chunks and reassembling it client-side if I can help it.

Please help!?

Solution

The answer appears to be calling the Write() method of the LargeObject class iteratively with chunks of the byte array. I know I said I didn't want to have to deal with chunking the data, but what I really meant was chunking the data into separate LargeObjects. This solution chunks the array, but the data is still stored in the database as one object, meaning I don't have to keep track of file parts, just the one oid.

int i = 0;
do
{
    // Write at most 1000 bytes per call; the final chunk may be shorter.
    var length = 1000;
    if (i + length > BinaryData.Length)
        length = BinaryData.Length - i;

    byte[] chunk = new byte[length];
    Array.Copy(BinaryData, i, chunk, 0, length);
    lo.Write(chunk, 0, length);
    i += length;
} while (i < BinaryData.Length);
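Putting the pieces together, here is a minimal end-to-end sketch of the chunked approach (assuming, as in the question, that connection is an open NpgsqlConnection and BinaryData is the byte[] to store; the 1000-byte chunk size is arbitrary):

// Stores BinaryData as a single large object, writing it in small chunks
// so the whole buffer never has to be pushed to the server in one call.
// Assumes: connection is an open NpgsqlConnection, BinaryData is a byte[].
using (var transaction = connection.BeginTransaction())
{
    try
    {
        var manager = new NpgsqlTypes.LargeObjectManager(connection);
        var noid = manager.Create(NpgsqlTypes.LargeObjectManager.READWRITE);
        var lo = manager.Open(noid, NpgsqlTypes.LargeObjectManager.READWRITE);

        int i = 0;
        do
        {
            var length = 1000;
            if (i + length > BinaryData.Length)
                length = BinaryData.Length - i;

            byte[] chunk = new byte[length];
            Array.Copy(BinaryData, i, chunk, 0, length);
            lo.Write(chunk, 0, length);
            i += length;
        } while (i < BinaryData.Length);

        lo.Close();
        transaction.Commit();
        return noid;  // the oid is all the caller needs to retrieve the file later
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}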
