How to fix the BrokenPipeError in _send (File "/usr/lib/python3.6/multiprocessing/connection.py", line 368)?
I am building a chatbot that uses the DistilBertForQuestionAnswering model, but a "broken pipe error" occurs while a question is being processed:
self._writer.send_bytes(obj)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: File "/usr/lib/python3.6/multiprocessing/connection.py", line 200, in send_bytes
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: self._send_bytes(m[offset:offset + size])
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: File "/usr/lib/python3.6/multiprocessing/connection.py", line 404, in _send_bytes
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: self._send(header + buf)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: File "/usr/lib/python3.6/multiprocessing/connection.py", line 368, in _send
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: n = write(self._handle, buf)
Aug 25 15:02:57 ubuntu-s-4vcpu-8gb-fra1-01 gunicorn[8237]: BrokenPipeError: [Errno 32] Broken pipe
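For context on what this traceback means: Connection._send raises BrokenPipeError [Errno 32] when it writes to a pipe whose receiving end has already been closed, for example because the process on the other side exited or was killed. A minimal sketch (not the bot code) that reproduces the same error in isolation:

```python
import multiprocessing

# BrokenPipeError [Errno 32] is raised by Connection.send when the
# receiving end of the pipe has already been closed.
recv_end, send_end = multiprocessing.Pipe(duplex=False)
recv_end.close()  # simulate the reader going away (e.g. a dead worker)
try:
    send_end.send(b"hello")
except BrokenPipeError as e:
    print("caught BrokenPipeError, errno:", e.errno)  # errno 32, as in the log
```

In a gunicorn deployment this commonly happens when a worker is killed (e.g. by a timeout or the OOM killer) while another process is still trying to send data to it.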
I tried multiprocessing; here is my code:
from transformers import pipeline
from functools import lru_cache
import multiprocessing
import codecs

class Model():
    context = "./sw_merge.txt"

    @lru_cache(maxsize=10000)
    def __init__(self):
        print('processing - iniit in model')
        self.model = pipeline('question-answering')
        with codecs.open(self.context, 'rb', errors='ignore', encoding='utf-8') as f:
            self.lines = f.read()

    def run_qa(self, qn):
        print('run_qa - on processing')
        ans = self.model(context=self.lines, question=qn)
        return ans

class Conversation():
    # incoming messages - receives an input from the user
    def incoming(self, question):
        usr_qn = []
        usr_qn.append(question)
        return usr_qn

    # model prediction
    def model_ans(self, input_qn):
        y = Model()
        ans = y.run_qa(input_qn)
        ans_text = ans.get("answer")
        print('model_ans - running')
        return ans_text

if __name__ == '__main__':
    p1 = multiprocessing.Process(target=Conversation)
    p2 = multiprocessing.Process(target=Model)
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    # check if processes are alive
    print("Process p1 is alive: {}".format(p1.is_alive()))
    print("Process p2 is alive: {}".format(p2.is_alive()))
    # question = ''
    # chat_conv = Conversation()
    # incoming_text = chat_conv.incoming(question)
    # outgoing_text = chat_conv.model_ans(incoming_text)
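One detail worth noting about the multiprocessing usage above: passing a class as `target=` only instantiates it in the child process; the resulting object is discarded when the child exits, and no method such as run_qa is ever called on it. A self-contained sketch (with a hypothetical Worker class standing in for Model/Conversation) that demonstrates this:

```python
import multiprocessing

class Worker:
    def __init__(self):
        print("Worker constructed in child process")

# target=Worker means "call Worker() in the child" - the constructor runs,
# the instance is thrown away, and the child exits immediately.
if __name__ == "__main__":
    p = multiprocessing.Process(target=Worker)
    p.start()
    p.join()
    print("child exit code:", p.exitcode)  # 0: only __init__ ran
```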
My environment is:
- tensorflow == 2.3.0
- transformers == 3.0.2
- Python 3.6
The function model_ans takes the question and returns the answer, so execution of the application starts at model_ans and ends inside the Model() class with the pipe error.