How do I use a TPU with PyTorch?


I am trying to use a TPU with pytorch_xla, but it raises an ImportError from _XLAC:

!curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
!python pytorch-xla-env-setup.py --version $VERSION

import torch_xla
import torch_xla.core.xla_model as xm

ImportError                               Traceback (most recent call last)
<ipython-input-60-6a19e980152f> in <module>()
----> 1 import torch_xla
      2 import torch_xla.core.xla_model as xm

/usr/local/lib/python3.6/dist-packages/torch_xla/__init__.py in <module>()
     39 import torch
     40 from .version import __version__
---> 41 import _XLAC
     42 
     43 _XLAC._initialize_aten_bindings()

ImportError: /usr/local/lib/python3.6/dist-packages/_XLAC.cpython-36m-x86_64-linux-gnu.so: undefined symbol: _ZN2at6native6einsumENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN3c108ArrayRefINS_6TensorEEE

2 Answers

Try this:

!pip uninstall -y torch
!pip install torch==1.8.2+cpu -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
!pip install -q cloud-tpu-client==0.10 https://storage.googleapis.com/tpu-pytorch/wheels/torch_xla-1.8-cp37-cp37m-linux_x86_64.whl
import torch_xla

This worked for me.

Source: googlecolab/colabtools#2237
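
If the reinstall works, a quick sanity check like the following (a minimal sketch, assuming a Colab TPU runtime is attached) should run without the ImportError:

import torch
import torch_xla.core.xla_model as xm

dev = xm.xla_device()              # grab the default XLA (TPU) device
t = torch.randn(2, 2, device=dev)  # allocate a tensor directly on the TPU
print(t.device)                    # prints an XLA device, e.g. xla:1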

  1. Make sure you are using a compatible version of PyTorch/XLA and of Python (3.6.9 works fine):
curl https://raw.githubusercontent.com/pytorch/xla/master/contrib/scripts/env-setup.py -o pytorch-xla-env-setup.py
python pytorch-xla-env-setup.py --version 20200325
  2. Check that you have specified how the TPU should be reached. Depending on your environment, you may need to set XRT_TPU_CONFIG or COLAB_TPU_ADDR, as shown below.

For example:

export XRT_TPU_CONFIG="tpu_worker;0;$TPU_IP_ADDRESS:8470"

or:

export COLAB_TPU_ADDR="10.16.26.36:8676"
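
If you would rather do this from Python (for example at the top of a notebook), here is a minimal sketch, assuming a Colab runtime where COLAB_TPU_ADDR is already populated by the TPU runtime:

import os

# Fail early if no TPU is attached to the runtime.
assert 'COLAB_TPU_ADDR' in os.environ, 'No TPU attached to this runtime'

# Build the XRT config from the address Colab provides.
os.environ['XRT_TPU_CONFIG'] = 'tpu_worker;0;' + os.environ['COLAB_TPU_ADDR']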

A detailed description is here: https://github.com/pytorch/xla/blob/master/README.md and there is a worked example here: https://cloud.google.com/tpu/docs/tutorials/transformer-pytorch

Also, here is a Google Colab notebook created by the PyTorch team; I just tested it and it works without any changes: https://colab.research.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb

This notebook will show you how to:

  • Install PyTorch/XLA on Colab, which lets you use PyTorch with TPUs.
  • Run basic PyTorch functions on TPUs.
  • Run PyTorch modules and autograd on TPUs.
  • Run PyTorch networks on TPUs.

You may want to follow one of those examples and try to reproduce the issue. Good luck!
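
For reference, running a module with autograd on a TPU boils down to something like the following (a minimal sketch in the spirit of the notebook, not copied from it; it assumes torch_xla is installed and a TPU runtime is attached):

import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

dev = xm.xla_device()
model = nn.Linear(4, 2).to(dev)     # move the module's parameters to the TPU
x = torch.randn(8, 4, device=dev)
loss = model(x).sum()
loss.backward()                     # autograd runs on the XLA device
xm.mark_step()                      # flush the pending XLA graph so it executes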
