MemoryError when pickling data in Python

Problem description:

I am trying to dump a dictionary into pickle format, using the 'dump' command provided in Python. The file size of the dictionary is around 150 MB, but an exception occurs when only 115 MB of the file has been dumped. The exception is:

Traceback (most recent call last): 
  File "C:\Python27\generate_traffic_pattern.py", line 32, in <module> 
    b.dump_data(way_id_data,'way_id_data.pickle') 
  File "C:\Python27\class_dump_load_data.py", line 8, in dump_data 
    pickle.dump(data,saved_file) 
  File "C:\Python27\lib\pickle.py", line 1370, in dump 
    Pickler(file, protocol).dump(obj) 
  File "C:\Python27\lib\pickle.py", line 224, in dump 
    self.save(obj) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 649, in save_dict 
    self._batch_setitems(obj.iteritems()) 
  File "C:\Python27\lib\pickle.py", line 663, in _batch_setitems 
    save(v) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 600, in save_list 
    self._batch_appends(iter(obj)) 
  File "C:\Python27\lib\pickle.py", line 615, in _batch_appends 
    save(x) 
  File "C:\Python27\lib\pickle.py", line 286, in save 
    f(self, obj) # Call unbound method with explicit self 
  File "C:\Python27\lib\pickle.py", line 599, in save_list 
    self.memoize(obj) 
  File "C:\Python27\lib\pickle.py", line 247, in memoize 
    self.memo[id(obj)] = memo_len, obj 
MemoryError

I am really confused, since the same code was working fine earlier.
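
For reference, based on the traceback the failing call presumably looks roughly like this (a minimal reconstruction; apart from pickle.dump, the class and variable names are guesses):

import pickle

class DumpLoadData(object):  # hypothetical name for the class in class_dump_load_data.py
    def dump_data(self, data, filename):
        saved_file = open(filename, 'wb')
        pickle.dump(data, saved_file)  # plain pickle.dump memoizes every object it writes
        saved_file.close()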

Have you tried this?

import cPickle as pickle

# Fast mode skips the pickler's memo (the dict that raised the MemoryError above).
with open("temp.p", "wb") as saved_file:
    p = pickle.Pickler(saved_file)
    p.fast = True
    p.dump(d)  # d is your dictionary
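
The MemoryError in your traceback comes from Pickler.memoize: a normal pickler keeps a memo entry for every object it has already written, so pickling a large structure can need far more memory than the data itself. Setting fast = True disables that memo, but it is only safe when the data contains no shared or self-referential objects (otherwise the pickler can recurse forever). If that is a concern, another workaround is to pickle the dictionary one item at a time and clear the memo as you go. The sketch below assumes the values are independent of each other; dump_in_chunks and load_in_chunks are hypothetical helper names, not part of the pickle API:

import cPickle as pickle

def dump_in_chunks(d, path):
    # Pickle each (key, value) pair as its own record and clear the memo
    # between records so memory use stays roughly per-item, not per-dict.
    with open(path, "wb") as f:
        p = pickle.Pickler(f, 2)  # protocol 2 also gives a smaller binary file
        p.dump(len(d))            # record how many items follow
        for key, value in d.iteritems():
            p.dump((key, value))
            p.clear_memo()

def load_in_chunks(path):
    # Read the item count, then rebuild the dictionary record by record.
    with open(path, "rb") as f:
        up = pickle.Unpickler(f)
        n = up.load()
        return dict(up.load() for _ in xrange(n))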