A product that I am working on collects several thousand readings a day and stores them as 64k binary files on an NTFS partition (Windows XP). After a year in production there are over 300,000 files in a single directory, and the number keeps growing. This has made accessing the parent/ancestor directories from Windows Explorer very time consuming.
I have tried turning off the indexing service but that made no difference. I have also contemplated moving the file content into a database/zip files/tarballs but it is beneficial for us to access the files individually; basically, the files are still needed for research purposes and the researchers are not willing to deal with anything else.
Is there a way to optimize NTFS or Windows so that it can work with all these small files?
Solution

NTFS performance severely degrades after 10,000 files in a directory. What you should do is create an additional level in the directory hierarchy, with each subdirectory holding at most 10,000 files.
For what it's worth, this is the approach that the SVN folks took in version 1.5. They used 1,000 files as the default threshold.
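To make the sharding concrete, here is a minimal sketch of one way to implement it. It assumes readings are identified by a sequential integer id and stored with a `.bin` extension; the bucket size, file naming, and `sharded_path`/`store_reading` helpers are illustrative choices, not part of the original product.

```python
import os

SHARD_SIZE = 10_000  # files per subdirectory, matching the threshold above


def sharded_path(root: str, reading_id: int) -> str:
    """Map a sequential reading id to root/<bucket>/<id>.bin,
    so no single directory ever holds more than SHARD_SIZE files."""
    bucket = reading_id // SHARD_SIZE
    return os.path.join(root, f"{bucket:05d}", f"{reading_id:08d}.bin")


def store_reading(root: str, reading_id: int, data: bytes) -> str:
    """Write one reading into its shard, creating the bucket on demand."""
    path = sharded_path(root, reading_id)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path
```

Because the mapping from id to path is deterministic, researchers can still locate and open any file individually; only the top-level directory listing changes, from one flat directory to a set of buckets that each stay comfortably below the threshold.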