How to load local resources from a Python package loaded in AWS PySpark

Posted 2024-10-02 02:28:07

I have uploaded a Python package to AWS EMR with PySpark. The package is structured as shown below and contains a resource file (a scikit-learn model serialized with joblib):

myetllib
    ├── Dockerfile
    ├── __init__.py
    ├── modules
    │   ├── bin
    │   ├── joblib
    │   ├── joblib-0.14.1.dist-info
    │   ├── numpy
    │   ├── numpy-1.18.4.dist-info
    │   ├── numpy.libs
    │   ├── scikit_learn-0.21.3.dist-info
    │   ├── scipy
    │   ├── scipy-1.4.1.dist-info
    │   └── sklearn
    ├── requirements.txt
    └── mysubmodule
        ├── __init__.py
        ├── model.py
        └── models/mymodel.joblib
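
For context, models/mymodel.joblib is just a scikit-learn estimator serialized with joblib, roughly along these lines (the estimator and training data below are only placeholders, not my actual model):

import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Placeholder data and estimator; the real model is whatever was dumped into the package
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)
joblib.dump(clf, "myetllib/mysubmodule/models/mymodel.joblib")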

Then I zip the package and upload it to EMR. Now I can import model.py in the console, like this:

from myetllib.mysubmodule.model import load_model, run_model
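
For reference, the zipped package is attached to the Spark session roughly as follows (the archive name and S3 path are placeholders, not the exact values I use):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("myetl").getOrCreate()
# Make the zipped package importable on the driver and executors
# (equivalent to passing --py-files to spark-submit)
spark.sparkContext.addPyFile("s3://my-bucket/myetllib.zip")
# after this, the import above works in the console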

However, when I call load_model I get an error: joblib complains that it cannot find the package resource file, i.e. models/mymodel.joblib. The path is set correctly, as shown below:

import os
import warnings

import joblib

BASE_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)))
MODEL_PATH = os.path.join(BASE_PATH, "models/my_model.joblib")

def load_model():
    '''
        load scikit-learn model via joblib
    '''
    with warnings.catch_warnings():
        warnings.filterwarnings('ignore', category=UserWarning)
        return joblib.load(MODEL_PATH)
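
A quick check along these lines (paths are illustrative) suggests that, when the package is imported from the zip, __file__ already resolves to a location inside the archive, which is presumably why plain filesystem access fails:

import os
import myetllib.mysubmodule.model as model_module

# With zipimport, __file__ points inside the archive,
# e.g. ".../etllib-v1.0.0.zip/etllib/mysubmodule/model.py",
# so the parent "directories" are only zip entries.
print(model_module.__file__)
print(os.path.exists(model_module.MODEL_PATH))  # expected: False when loaded from the zip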

The error looks like this:

NotADirectoryError: [Errno 20] Not a directory: '/mnt/tmp/spark-fc45e56b-06f3-56dd-af44-0ecc93d4gc0d/userFiles-1e3455-a6rf-4adc-592b-bbe41ffa323/etllib-v1.0.0.zip/etllib/mysubmodule/models/my_model.joblib

I also get another error, this one from sklearn:

NotADirectoryError: [Errno 20] Not a directory: '/mnt/tmp/spark-904b50d2-0407-43e8-bb46-06a7b334a46b/userFiles-5df387de-066e-498a-8dd3-e8329d0e8252/etllib-v1.0.1.zip/etllib/modules/sklearn/__check_build
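
A zip-safe variant of load_model would presumably look something like the sketch below (untested on EMR; the helper name is just for illustration, and it would not by itself address the sklearn __check_build error):

import io
import pkgutil

import joblib

def load_model_from_package():
    # Read the bytes through the package loader (pkgutil.get_data also works
    # for zip imports) and pass a file object to joblib.load instead of a
    # filesystem path. The resource name mirrors MODEL_PATH above.
    data = pkgutil.get_data("myetllib.mysubmodule", "models/my_model.joblib")
    return joblib.load(io.BytesIO(data))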
