I use Python to rewrite files of several hundred MB each, and it's very quick. My Mac has 16 GB of 1600 MHz DDR3 memory and a 2.5 GHz Intel Core i7 processor.
But here is the problem: when I want to rewrite another file, suddenly I can't even open a several-hundred-MB file smoothly, and the processing is quite slow.
Is it because I haven't released the memory? Why does my Mac become so slow, even just opening a file, after I rewrite some files?
FYI, I use TextMate to write Python, and I'm really new to Python.
Accepted answer
Python has a built-in garbage collector that automatically frees the memory used by variables you no longer need, so memory itself shouldn't be the problem.
But if you're not closing your files, they remain open, and their buffers and any data you read from them stay in memory (the Python interpreter assumes you're going to use them again).
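As an aside not in the original answer: if you want to see the garbage collector at work, the standard-library `gc` module lets you check whether it's enabled and trigger a collection pass manually. A minimal sketch:

```python
import gc

# Trigger a full collection pass; collect() returns the number
# of unreachable objects it found.
freed = gc.collect()
print(f"collector found {freed} unreachable objects")

# The collector is enabled by default.
print(gc.isenabled())  # True
```

You almost never need to call `gc.collect()` yourself; closing files and dropping references is usually enough.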
First of all, check whether you're closing your files properly:

```python
f = open("file.txt")
a = f.read()
f.close()
```

Or even better:

```python
with open("file.txt") as input_file:
    data = input_file.read()
```

Here the file is closed automatically.
If one of your variables gets very big, you can delete it manually:

```python
del data
```
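Beyond closing files and deleting large variables, the usual way to keep memory flat with multi-hundred-MB files is to stream them in chunks rather than calling `read()` once. A minimal sketch (the file names and `copy_in_chunks` helper are illustrative, not from the original answer):

```python
import os

CHUNK = 1024 * 1024  # process 1 MiB at a time

def copy_in_chunks(src_path, dst_path, chunk_size=CHUNK):
    """Copy src_path to dst_path without loading it all into memory."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:          # empty bytes object => end of file
                break
            dst.write(chunk)

# Runnable demo with a small sample file; in practice src would
# be your multi-hundred-MB file.
with open("big.txt", "wb") as f:
    f.write(b"some sample line\n" * 10_000)

copy_in_chunks("big.txt", "big_copy.txt")
print(os.path.getsize("big_copy.txt") == os.path.getsize("big.txt"))  # True
```

Only one chunk is ever held in memory at a time, so this stays fast no matter how many files you rewrite in a row; for text files, iterating `for line in f:` streams the same way.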