I don't know what's going on. The architecture is: a Spring Boot (Java) backend sends the request (an image) to another Spring Boot server, which forwards it to Flask to predict whether the image is a cat or a dog. When I start each server manually, everything works fine. But when run with Docker Compose, Flask somehow fails, and only on this cat-or-dog endpoint. We also have an OCR endpoint where we do exactly the same thing from Spring Boot, and that works fine. So it must be something about Flask and this particular method... but what?
from io import BytesIO
import base64
import json

import cv2
import numpy as np
from flask import request, jsonify
from keras.preprocessing import image

# `app`, `model` and `datagen` are defined elsewhere in app.py

@app.route('/cat-or-dog/predict', methods=['POST'])
def image_classifier_cat_or_dog():
    img_codes = json.loads(request.form['images'])
    # Decoding base64 images
    imgs = []
    for img_code in img_codes:
        imgs.append(image.img_to_array(image.load_img(BytesIO(base64.b64decode(img_code)),
                                                      target_size=(150, 150, 3))))
    # Pre-processing images
    X_imgs = preprocess_images(imgs)
    X_imgs = np.array(X_imgs)
    n_imgs = len(X_imgs)
    # Predicting content of images
    predictions = []
    i = 0
    for batch in datagen.flow(X_imgs, batch_size=1):
        pred = model.predict(batch)
        if pred > 0.5:
            predictions.append("dog")
        else:
            predictions.append("cat")
        i += 1
        if i % n_imgs == 0:
            break
    return jsonify(predictions=predictions)

# Image preprocessing for cat or dog classifier
def preprocess_images(new_imgs):
    X = []
    for img in new_imgs:
        X.append(cv2.resize(img, (150, 150), interpolation=cv2.INTER_CUBIC))
    return X
The error:
aiproject-flask | ERROR:app:Exception on /cat-or-dog/predict [POST]
aiproject-flask | Traceback (most recent call last):
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 2447, in wsgi_app
aiproject-flask | response = self.full_dispatch_request()
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1952, in full_dispatch_request
aiproject-flask | rv = self.handle_user_exception(e)
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask_cors/extension.py", line 161, in wrapped_function
aiproject-flask | return cors_after_request(app.make_response(f(*args, **kwargs)))
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1821, in handle_user_exception
aiproject-flask | reraise(exc_type, exc_value, tb)
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/_compat.py", line 39, in reraise
aiproject-flask | raise value
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1950, in full_dispatch_request
aiproject-flask | rv = self.dispatch_request()
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1936, in dispatch_request
aiproject-flask | return self.view_functions[rule.endpoint](**req.view_args)
aiproject-flask | File "/app/app.py", line 85, in image_classifier_cat_or_dog
aiproject-flask | target_size=(150, 150, 3))))
aiproject-flask | File "/usr/local/lib/python3.6/dist-packages/keras_preprocessing/image/utils.py", line 113, in load_img
aiproject-flask | with open(path, 'rb') as f:
aiproject-flask | TypeError: expected str, bytes or os.PathLike object, not _io.BytesIO
aiproject-flask | INFO:werkzeug:172.30.0.2 - - [09/Jun/2020 17:45:12] "POST /cat-or-dog/predict HTTP/1.1" 500 -
According to the Keras docs, the load_img function only accepts path-like objects, which is consistent with the behavior you are seeing. You are trying to serve a Flask app inside a container, right? Are you sure you are running the same code branch that works correctly everywhere else?
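The traceback backs this up: load_img ends in `open(path, 'rb')`, and Python's built-in `open` rejects file objects outright. A quick stdlib-only illustration of the same failure (the fake byte string just stands in for a decoded image):

```python
import io

buf = io.BytesIO(b"not really an image")
try:
    # The same call load_img makes internally on its `path` argument
    open(buf, "rb")
    failed = False
except TypeError as err:
    failed = True
    print(type(err).__name__)  # TypeError
```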
Find the correct container name by looking at the output of:
docker ps
With that, log into the container (the name aiproject-flask appears in your logs):
docker exec -it aiproject-flask /bin/bash
Next, navigate to the application directory (assuming you are using git here; the traceback puts it at /app):
cd /app
and check the branch name in the output of:
git branch
Then do the same in the original repository from which you "start each server manually", and compare the two outputs.
I'm almost certain you are running the wrong code base, from an outdated or not-yet-updated branch or something similar. Find a way to verify this.
If you do decide to modify the code shown, you can fix the error by no longer passing a BytesIO object to load_img, and instead opening the decoded bytes with PIL directly. This change was proposed in this issue as a way to solve a problem very similar to yours.
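A minimal sketch of that substitution, assuming Pillow and NumPy are installed (Pillow is already a dependency of keras_preprocessing). The helper name `load_b64_image` is made up for illustration, and the synthetic in-memory PNG stands in for one of your base64-encoded request images:

```python
import base64
from io import BytesIO

import numpy as np
from PIL import Image

def load_b64_image(img_code, target_size=(150, 150)):
    """Decode a base64 string into a float32 array, bypassing load_img."""
    img = Image.open(BytesIO(base64.b64decode(img_code)))
    img = img.convert("RGB").resize(target_size)
    return np.asarray(img, dtype="float32")

# Round-trip check with a synthetic image instead of a real request payload
buf = BytesIO()
Image.new("RGB", (300, 200), color=(255, 0, 0)).save(buf, format="PNG")
code = base64.b64encode(buf.getvalue()).decode("ascii")
arr = load_b64_image(code)
print(arr.shape)  # (150, 150, 3)
```

Unlike load_img, `PIL.Image.open` happily accepts any file-like object, which is exactly what BytesIO is.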