Python glob(dir) MemoryError

Published 2024-10-01 22:38:15


I am running into memory problems when processing a directory containing millions of files. Does anyone know how to get around this? Is there a way to limit how many files glob searches, so the work can be done in chunks?

Traceback (most recent call last):
  File "./lb2_lmanager", line 533, in <module>
    main(sys.argv[1:])
  File "./lb2_lmanager", line 318, in main
    matched = match_files(policy.directory, policy.file_patterns)
  File "./lb2_lmanager", line 32, in wrapper
    res = func(*args, **kwargs)
  File "./lb2_lmanager", line 380, in match_files
    listing = glob.glob(directory)
  File "/usr/lib/python2.6/glob.py", line 16, in glob
    return list(iglob(pathname))
  File "/usr/lib/python2.6/glob.py", line 43, in iglob
    yield os.path.join(dirname, name)
  File "/usr/lib/python2.6/posixpath.py", line 70, in join
    path += '/' + b
MemoryError
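The traceback shows that `glob.glob` materializes the entire match list at once (`list(iglob(pathname))`), which is what exhausts memory. One way around this, sketched below under the assumption that a single flat directory is being scanned, is to skip `glob` entirely and combine a lazy directory iterator with `fnmatch`, consuming the results in fixed-size batches. The function names `iter_matching_files` and `chunked` are hypothetical helpers, not part of any library; the example targets Python 3 (`os.scandir`), and on an old interpreter like the 2.6 in the traceback `os.listdir` could be substituted, at the cost of building one list of names.

```python
import fnmatch
import os


def iter_matching_files(directory, patterns):
    """Lazily yield paths in `directory` whose names match any pattern.

    Unlike glob.glob, this never builds a list of all matches, so
    memory use stays flat even with millions of files.
    """
    for entry in os.scandir(directory):  # lazy iterator over dir entries
        if any(fnmatch.fnmatch(entry.name, pat) for pat in patterns):
            yield entry.path


def chunked(iterable, size):
    """Group any iterator into lists of at most `size` items."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:  # emit the final partial batch
        yield chunk
```

With these two pieces, the matching work can be done in bounded chunks, e.g. `for batch in chunked(iter_matching_files(policy.directory, policy.file_patterns), 10000): process(batch)`, where `process` stands in for whatever per-batch handling the program does.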

