I'm trying to convert my PB model to a TensorRT UFF model so it will work on the TX2.
I used the sample code from the Nvidia forums and tried to run it. I suspect this is some Python-package-related problem, but I can't solve it.
I get the following error with the code below:
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
import tensorrt as trt
from tensorrt.parsers import uffparser
import pycuda.driver as cuda
import pycuda.autoinit
import numpy as np
from random import randint # generate a random test case
from PIL import Image
from matplotlib.pyplot import imshow #to show test case
import time #import system tools
import os
import uff
uff_model = uff.from_tensorflow_frozen_model('./rcnn_3chan.pb', ['image_tensor','detection_scores','detection_boxes' ,'detection_classes' ,'num_detections'])
parser.register_output("image_tensor")
parser.register_output("detection_scores")
parser.register_output("detection_boxes")
parser.register_output("detection_classes")
parser.register_output("num_detections")
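For reference, one thing that stands out in the snippet above is that `parser` is used before it is ever created. With the legacy TensorRT 3.x Python API, the parser is built with `uffparser.create_uff_parser()` first; below is a hedged sketch of that flow, not a tested fix. The input shape is an assumption, and `image_tensor` is registered as an input rather than an output, since it is the input placeholder in TensorFlow object-detection graphs:

```python
import uff
from tensorrt.parsers import uffparser

# Convert the frozen graph; the second argument should list OUTPUT
# nodes only ('image_tensor' is the graph's input placeholder).
uff_model = uff.from_tensorflow_frozen_model(
    './rcnn_3chan.pb',
    ['detection_scores', 'detection_boxes',
     'detection_classes', 'num_detections'])

# The parser must exist before register_input/register_output is called.
parser = uffparser.create_uff_parser()
parser.register_input("image_tensor", (3, 300, 300), 0)  # shape is an assumption
parser.register_output("detection_scores")
parser.register_output("detection_boxes")
parser.register_output("detection_classes")
parser.register_output("num_detections")
```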
My environment variables:
export PATH=$PATH:/usr/local/cuda-9.0/bin/:/home/fts/Downloads/TensorRT-3.0.4/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/fts/Downloads/TensorRT-3.0.4/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda-9.0/lib64:/usr/local/lib/x86_64-linux-gnu:/usr/local/cuda-9.0/lib64:/usr/local/lib/x86_64-linux-gnu
export CUDA_ROOT=/usr/local/cuda-9.0
export PATH=/usr/local/cuda-9.0/bin${PATH:+:${PATH}}
export LD_LIBRARY_PATH=/usr/local/cuda-9.0/lib64${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}
export CUDA_HOME=/usr/local/cuda-9.0
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/extras/CUPTI/lib64
For the record (I'm a Python beginner): I named my .py file uff.
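That filename is very likely the problem: a script named uff.py shadows the installed uff package, so `import uff` loads the script itself instead of the converter. A minimal, self-contained sketch of the same shadowing effect, using the stdlib json module as a stand-in (the file names here are purely illustrative):

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway directory containing a file named json.py,
# then run a script from that directory that does `import json`.
d = tempfile.mkdtemp()
with open(os.path.join(d, "json.py"), "w") as f:
    f.write("value = 'shadowed'\n")
with open(os.path.join(d, "main.py"), "w") as f:
    f.write("import json\nprint(hasattr(json, 'loads'))\n")

# The script's own directory comes first on sys.path, so the local
# json.py wins over the standard-library json module.
result = subprocess.run([sys.executable, os.path.join(d, "main.py")],
                        capture_output=True, text=True).stdout.strip()
print(result)  # → False: json.loads is missing, the local file was imported
```

Renaming the script to anything other than uff.py (and deleting any stale uff.pyc next to it) removes the shadowing.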