Handling asynchronous streaming requests in gRPC Python

Posted 2024-09-20 06:46:06


I'm trying to understand how to handle a gRPC API that uses bidirectional streaming (with the Python API).

Say I have the following simple server definition:

syntax = "proto3";
package simple;

service TestService {
  rpc Translate(stream Msg) returns (stream Msg){}
}

message Msg
{
  string msg = 1;
}

Assume that the messages sent from the client arrive asynchronously (as the user selects UI elements).

The generated Python stub for the client will contain a method Translate that accepts a generator function and returns an iterator.

What I don't understand is how to write the generator function so that it yields messages as the user creates them. Sleeping on a thread while waiting for messages doesn't sound like the best solution.
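For context, one common way to structure such a request generator (this sketch is not part of the original question, and the class and names below are purely illustrative) is to back it with a thread-safe queue.Queue: the UI thread pushes messages into the queue, and the iterator handed to the stub blocks until a message or a close sentinel arrives.

import queue


_CLOSE = object()  # sentinel that tells the generator to end the stream


class RequestQueue(object):
  """Bridges UI events into a gRPC request stream via a blocking queue."""

  def __init__(self):
    self._queue = queue.Queue()

  def send(self, msg):
    # Called from the UI thread whenever the user produces a message.
    self._queue.put(msg)

  def close(self):
    # Ends the request stream, which half-closes the RPC.
    self._queue.put(_CLOSE)

  def __iter__(self):
    # Handed to stub.Translate(); blocks until the next message arrives.
    while True:
      item = self._queue.get()
      if item is _CLOSE:
        return
      yield item

The answer below takes a similar but more elaborate approach that also correlates each response with the request that produced it.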


Tags: method, function, user, server, message, package, python, api, stream
1 Answer

#1 · Posted 2024-09-20 06:46:06

This is a bit clunky right now, but you can accomplish your use case along the following lines:

#!/usr/bin/env python

from __future__ import print_function

import collections
import threading

from concurrent import futures
import grpc

from translate_pb2 import Msg
from translate_pb2_grpc import TestServiceStub
from translate_pb2_grpc import TestServiceServicer
from translate_pb2_grpc import add_TestServiceServicer_to_server


def translate_next(msg):
    return ''.join(reversed(msg))


class Translator(TestServiceServicer):
  def Translate(self, request_iterator, context):
    for req in request_iterator:
      print("Translating message: {}".format(req.msg))
      yield Msg(msg=translate_next(req.msg))

class TranslatorClient(object):
  def __init__(self):
    self._stop_event = threading.Event()
    self._request_condition = threading.Condition()
    self._response_condition = threading.Condition()
    self._requests = collections.deque()
    self._last_request = None
    self._expected_responses = collections.deque()
    self._responses = {}

  def _next(self):
    with self._request_condition:
      while not self._requests and not self._stop_event.is_set():
        self._request_condition.wait()
      if len(self._requests) > 0:
        return self._requests.popleft()
      else:
        raise StopIteration()

  def next(self):
    return self._next()

  def __next__(self):
    return self._next()

  def add_response(self, response):
    with self._response_condition:
      request = self._expected_responses.popleft()
      self._responses[request] = response
      self._response_condition.notify_all()

  def add_request(self, request):
    with self._request_condition:
      self._requests.append(request)
      with self._response_condition:
        self._expected_responses.append(request.msg)
      self._request_condition.notify()

  def close(self):
    self._stop_event.set()
    with self._request_condition:
      self._request_condition.notify()

  def translate(self, to_translate):
    self.add_request(to_translate)
    with self._response_condition:
      while True:
        self._response_condition.wait()
        if to_translate.msg in self._responses:
          return self._responses[to_translate.msg]


def _run_client(address, translator_client):
  # Use the address passed in rather than a hard-coded target.
  with grpc.insecure_channel(address) as channel:
    stub = TestServiceStub(channel)
    responses = stub.Translate(translator_client)
    for resp in responses:
      translator_client.add_response(resp)

def main():
  server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
  add_TestServiceServicer_to_server(Translator(), server)
  server.add_insecure_port('[::]:50054')
  server.start()
  translator_client = TranslatorClient()
  client_thread = threading.Thread(
      target=_run_client, args=('localhost:50054', translator_client))
  client_thread.start()

  def _translate(to_translate):
    return translator_client.translate(Msg(msg=to_translate)).msg

  translator_pool = futures.ThreadPoolExecutor(max_workers=4)
  to_translate = ("hello", "goodbye", "I", "don't", "know", "why",)
  translations = translator_pool.map(_translate, to_translate)
  print("Translations: {}".format(zip(to_translate, translations)))

  translator_client.close()
  client_thread.join()
  server.stop(None)


if __name__ == "__main__":
  main()

The basic idea is to have an object called TranslatorClient running on a separate thread, correlating requests and responses. It expects that responses will come back in the order the requests were sent. It also implements the iterator interface, so you can pass it directly to the call to the Translate method on the stub.

We kick off a thread running _run_client, which drives the RPC: it passes the TranslatorClient to the stub as the request iterator and feeds each response back in at the other end with add_response.

The main function I've included here is really just a stand-in, since I don't have the details of your UI code. I run _translate in a ThreadPoolExecutor to demonstrate that even though translator_client.translate is synchronous, it yields results while still letting you have multiple requests in flight at once.

We recognize that this is a lot of code to write for such a simple use case. Ultimately, the answer will be asyncio support; we have plans for that in the near future. For now, though, this sort of solution should keep you going whether you're running Python 2 or Python 3.
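For reference, that asyncio support has since shipped as grpc.aio (Python 3 only). A minimal sketch of the client side using it, assuming the same generated stubs and a server already listening on localhost:50054 (the function and variable names here are only illustrative), might look roughly like this:

import asyncio

import grpc

from translate_pb2 import Msg
from translate_pb2_grpc import TestServiceStub


async def run_client():
  # An asynchronous channel; no dedicated client thread is needed.
  async with grpc.aio.insecure_channel('localhost:50054') as channel:
    stub = TestServiceStub(channel)
    # Calling the stream-stream method with no arguments returns a call
    # object with explicit write()/read() coroutines instead of iterators.
    call = stub.Translate()
    for text in ("hello", "goodbye"):
      await call.write(Msg(msg=text))  # send one request
      response = await call.read()     # wait for its translation
      print("Translated {!r} -> {!r}".format(text, response.msg))
    await call.done_writing()          # half-close the request stream


if __name__ == "__main__":
  asyncio.run(run_client())

With grpc.aio, the request stream can also be driven by an async generator passed to the stub instead of explicit write() calls, which maps more directly onto the "messages produced by the UI" scenario in the question.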
