BrokenPipeError in _send, File "/usr/lib/python3.6/multiprocessing/connection.py", line 368

Problem description

I am building a chatbot with the DistilBertForQuestionAnswering model, but a "Broken pipe" error occurs while a question is being processed:

self._writer.send_bytes(obj)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:   File "/usr/lib/python3.6/multiprocessing/connection.py", line 200, in send_bytes
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:     self._send_bytes(m[offset:offset + size])
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:   File "/usr/lib/python3.6/multiprocessing/connection.py", line 404, in _send_bytes
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:     self._send(header + buf)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:   File "/usr/lib/python3.6/multiprocessing/connection.py", line 368, in _send
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]:     n = write(self._handle, buf)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: BrokenPipeError: [Errno 32] Broken pipe
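For context (my note, not from the original post): `BrokenPipeError: [Errno 32]` means a write into a pipe or socket whose reading end has already gone away, e.g. because a worker process died or was killed. A minimal, model-free sketch that triggers the same error on Unix:

```python
import multiprocessing

def trigger_broken_pipe():
    """Send on a multiprocessing pipe whose read end is already closed.

    This reproduces `BrokenPipeError: [Errno 32] Broken pipe` from the
    traceback above without any model code (Unix only).
    """
    recv_end, send_end = multiprocessing.Pipe(duplex=False)
    recv_end.close()               # the reader goes away, as when a worker dies
    try:
        send_end.send("question")  # Connection.send -> _send_bytes -> _send
    except BrokenPipeError as err:
        return err                 # [Errno 32] Broken pipe
    finally:
        send_end.close()
```

Python ignores SIGPIPE by default, so the failed `write()` surfaces as this exception instead of killing the process.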

I tried multiprocessing; here is my code:

from transformers import pipeline
from functools import lru_cache

import multiprocessing
import codecs

class Model():

    context = "./sw_merge.txt"

    @lru_cache(maxsize=10000)
    def __init__(self):
        print('processing - init in model')
        self.model = pipeline('question-answering')
        with codecs.open(self.context,'rb',errors = 'ignore',encoding='utf-8') as f:
            self.lines = f.read()

    def run_qa(self,qn):
        print('run_qa - on processing')
        ans = self.model(context = self.lines,question = qn)
        return ans


class Conversation():
    #incoming messages - receives an input from the user
    def incoming(self,question):
        usr_qn = []
        usr_qn.append(question)
        return usr_qn

    #model prediction
    def model_ans(self,input_qn):
        y = Model()
        ans = y.run_qa(input_qn)
        ans_text = ans.get("answer")
        print('model_ans - running')
        return ans_text




if __name__ == '__main__':

    p1 = multiprocessing.Process(target=Conversation)
    p2 = multiprocessing.Process(target=Model)

    p1.start()
    p2.start()

    p1.join()
    p2.join()


    # check if processes are alive 
    print("Process p1 is alive: {}".format(p1.is_alive())) 
    print("Process p2 is alive: {}".format(p2.is_alive()))

    # question = ''
    # chat_conv = Conversation()

    # incoming_text = chat_conv.incoming(question)
    # outgoing_text = chat_conv.model_ans(incoming_text)
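A side note on the `__main__` block above (my observation, not part of the post): `multiprocessing.Process(target=Conversation)` only constructs a `Conversation` object inside the child and then exits; no question ever flows through it. A hypothetical queue-based sketch of moving questions through a worker, with a trivial stub standing in for the transformers pipeline:

```python
import multiprocessing

def qa_worker(questions, answers):
    # In the real app this is where pipeline('question-answering') would be
    # built once, before the loop; a trivial stub stands in for it here.
    while True:
        qn = questions.get()
        if qn is None:               # sentinel: shut down cleanly
            break
        answers.put('answer to: ' + qn)

def ask(question):
    # 'fork' is assumed here (Unix only), matching the Ubuntu host in the log.
    ctx = multiprocessing.get_context('fork')
    questions = ctx.Queue()
    answers = ctx.Queue()
    worker = ctx.Process(target=qa_worker, args=(questions, answers))
    worker.start()
    questions.put(question)
    ans = answers.get()
    questions.put(None)              # stop the worker before the parent exits,
    worker.join()                    # so neither side writes into a dead pipe
    return ans
```

Shutting the worker down with a sentinel and `join()` before the parent exits is precisely what prevents either side from writing into a closed pipe.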

My environment:

  • tensorflow == 2.3.0
  • transformers == 3.0.2
  • Python 3.6

The function model_ans takes the question and returns the answer, so the application's execution starts at model_ans and ends inside the Model class with the broken-pipe error.
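One detail worth flagging in the code above (my observation, not from the post): `@lru_cache` on `__init__` never avoids the expensive `pipeline(...)` call, because every `Model()` creates a fresh `self` and therefore a fresh cache key. Caching a factory function does reuse a single instance; a small sketch with a counter standing in for the expensive load:

```python
from functools import lru_cache

LOADS = []                      # counts how often the "expensive" setup runs

class Model:
    def __init__(self):
        LOADS.append(1)         # stands in for pipeline('question-answering')

@lru_cache(maxsize=1)
def get_model():
    """Build the model once and reuse it on every later call."""
    return Model()
```

With this pattern `get_model() is get_model()` is True and the setup runs once, whereas calling `Model()` directly runs the setup on every call despite the decorated `__init__`.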

Solution

No working solution to this problem has been found yet.

