Sharing a PyTables array across multiple processes

Problem description

I created a PyTables object W_hat that the worker processes are supposed to share and write their results into, instead of returning them.

from multiprocessing import Lock
from multiprocessing import Pool
import numpy as np
import tables as tb


def parallel_l21(labels, X, lam, g, W_hat):
    g_indxs = np.where(labels == g)[0]
    # rfs is defined in code not shown here
    tmp = rfs(X[g_indxs, 1:].T, X[:, :-1].T, gamma=lam, verbose=False).T
    tmp[abs(tmp) <= 1e-6] = 0
    with lock:
        W_hat[:, g_indxs] = np.array(tmp)


def init_child(lock_):
    global lock
    lock = lock_

# Previous code is omitted.
n_ = X_test.shape[0]
tb.file._open_files.close_all()
f = tb.open_file(path_name + 'dot' + sub_num + str(lam) + '.h5', 'w')
filters = tb.Filters(complevel=5, complib='blosc')
W_hat = f.create_carray(f.root, 'data', tb.Float32Atom(), shape=(n_, n_), filters=filters)
W_hats = []
for i in np.unique(labels):
    W_hats.append(W_hat)
lock = Lock()
with Pool(processes=cpu_count, initializer=init_child, initargs=(lock,)) as pool:
    print(pool)
    pool.starmap(parallel_l21, zip(repeat(labels), repeat(X), repeat(lam), np.unique(labels), W_hats))

Now, when execution reaches starmap, the following error is raised:

Traceback (most recent call last):
  File "/Applications/PyCharm CE 2.app/Contents/plugins/python-ce/helpers/pydev/_pydevd_bundle/pydevd_exec2.py", line 3, in Exec
    exec(exp, global_vars, local_vars)
  File "<input>", line 1, in <module>
  File "/usr/local/Cellar/python@3.8/3.8.6_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/pool.py", line 372, in starmap
    return self._map_async(func, iterable, starmapstar, chunksize).get()
  File "/usr/local/Cellar/python@3.8/3.8.6_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/pool.py", line 771, in get
    raise self._value
  File "/usr/local/Cellar/python@3.8/3.8.6_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/pool.py", line 537, in _handle_tasks
    put(task)
  File "/usr/local/Cellar/python@3.8/3.8.6_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/connection.py", line 206, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/usr/local/Cellar/python@3.8/3.8.6_1/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
  File "stringsource", line 2, in tables.hdf5extension.Array.__reduce_cython__
TypeError: self.dims,self.dims_chunk,self.maxdims cannot be converted to a Python object for pickling

Note: I thought this code ran fine on Python 3.6.8, but it turns out it did not.
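The last traceback line points at the underlying cause: before any worker runs, multiprocessing pickles each argument tuple to send it through a pipe, and a PyTables CArray node, being bound to an open HDF5 file, cannot be pickled. The same failure can be reproduced with any object that wraps a live OS handle, for example a plain open file:

```python
# Minimal illustration (not PyTables-specific): multiprocessing must
# pickle every task argument to ship it to a worker process, and
# objects wrapping live OS resources -- an open file here, an open
# HDF5 array node in the question -- refuse to be pickled.
import pickle
import tempfile

with tempfile.TemporaryFile(mode='w') as fh:
    try:
        pickle.dumps(fh)
        print('pickled fine')
    except TypeError as exc:
        print('pickling failed:', exc)
```

This is why the error fires inside starmap's argument handling rather than inside parallel_l21 itself.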

Workaround

No working fix for this program has been posted yet.

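That said, one common way around this pickling limit is to keep the PyTables node out of the worker arguments entirely. The sketch below, which is an illustration under assumptions rather than the original author's code, backs the (n, n) float32 result with multiprocessing.shared_memory, lets each worker fill its own disjoint columns, and leaves the parent process to copy the finished matrix into the CArray afterwards (e.g. W_hat[:] = result). The names compute_shared and _fill_columns are hypothetical, and _fill_columns stands in for the real rfs computation:

```python
# Workaround sketch (assumption: the goal is to fill an (n, n) float32
# matrix from several processes without pickling the PyTables node).
from multiprocessing import Pool
from multiprocessing.shared_memory import SharedMemory

import numpy as np


def _init_child(shm_name, shape):
    # Re-attach to the shared buffer inside each worker process.
    global _shm, _W
    _shm = SharedMemory(name=shm_name)
    _W = np.ndarray(shape, dtype=np.float32, buffer=_shm.buf)


def _fill_columns(cols_and_value):
    # Stand-in for the real per-group computation (rfs in the question).
    # Each task owns a disjoint set of columns, so no lock is needed.
    cols, value = cols_and_value
    _W[:, cols] = value
    return cols


def compute_shared(n, groups):
    """groups: list of (column_indices, fill_value) pairs, one per task."""
    shm = SharedMemory(create=True, size=n * n * 4)  # float32 -> 4 bytes
    W = np.ndarray((n, n), dtype=np.float32, buffer=shm.buf)
    W[:] = 0
    try:
        with Pool(processes=2, initializer=_init_child,
                  initargs=(shm.name, (n, n))) as pool:
            pool.map(_fill_columns, groups)
        return W.copy()  # detach a private copy before freeing the buffer
    finally:
        shm.close()
        shm.unlink()
```

After compute_shared returns, the parent alone touches the HDF5 file, so nothing PyTables-related ever crosses a process boundary. The alternative, having each worker reopen the HDF5 file by path under the lock, can also work, but serializing every open/write/close is easy to get wrong, which is why the shared-memory variant is shown here.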