cannerflow-jupyter-contents — S3, GCS, and Azure Blob contents manager for Jupyter
Canner Jupyter Contents
A ContentsManager implementation for Jupyter with S3, GCS, and Azure Blob support. Based on s3contents.
Installation
```shell
$ pip install cannerflow-jupyter-contents
```
Jupyter configuration
Edit ~/.jupyter/jupyter_notebook_config.py for the backend you want, based on the examples below. Replace the credentials as needed.
AWS S3
Example, using the public MinIO playground at play.minio.io:9000:
```python
from s3contents import S3ContentsManager

c = get_config()

# Tell Jupyter to use S3ContentsManager for all storage.
c.NotebookApp.contents_manager_class = S3ContentsManager
c.S3ContentsManager.access_key_id = "Q3AM3UQ867SPQQA43P2F"
c.S3ContentsManager.secret_access_key = "zuf+tfteSlswRu7BJ86wekitnifILbZam1KYY3TG"
c.S3ContentsManager.endpoint_url = "http://play.minio.io:9000"
c.S3ContentsManager.bucket = "s3contents-demo"
c.S3ContentsManager.prefix = "notebooks/test"
```
AWS EC2 role-based authentication
It is also possible to use IAM-role-based access to an S3 bucket from an Amazon EC2 instance.

To do that, simply leave access_key_id and secret_access_key at their default value (None), and make sure the EC2 instance has an IAM role that grants sufficient permissions for the bucket and the operations needed.
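With role-based access, the configuration reduces to a sketch like the following (the bucket name is a placeholder; get_config is only available inside the Jupyter config file):

```python
# Sketch of ~/.jupyter/jupyter_notebook_config.py for IAM-role access.
# access_key_id and secret_access_key are left at their default (None),
# so boto3 falls back to the EC2 instance's role credentials.
from s3contents import S3ContentsManager

c = get_config()

c.NotebookApp.contents_manager_class = S3ContentsManager
c.S3ContentsManager.bucket = "{{ S3 bucket name }}"
```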
AWS key refresh
The optional init_s3_hook configuration can be used to enable AWS key rotation, as follows:
```python
from s3contents import S3ContentsManager
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
import botocore
import boto3
from configparser import ConfigParser

def refresh_external_credentials():
    config = ConfigParser()
    config.read('/home/jovyan/.aws/credentials')
    return {
        "access_key": config['default']['aws_access_key_id'],
        "secret_key": config['default']['aws_secret_access_key'],
        "token": config['default']['aws_session_token'],
        "expiry_time": config['default']['aws_expiration']
    }

session_credentials = RefreshableCredentials.create_from_metadata(
    metadata=refresh_external_credentials(),
    refresh_using=refresh_external_credentials,
    method='custom-refreshing-key-file-reader'
)

def make_key_refresh_boto3(this_s3contents_instance):
    refresh_session = get_session()  # from botocore.session
    refresh_session._credentials = session_credentials
    my_s3_session = boto3.Session(botocore_session=refresh_session)
    this_s3contents_instance.boto3_session = my_s3_session

# Tell Jupyter to use S3ContentsManager for all storage.
c.NotebookApp.contents_manager_class = S3ContentsManager
c.S3ContentsManager.init_s3_hook = make_key_refresh_boto3
```
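As a quick illustration of the parsing that refresh_external_credentials performs, the same ConfigParser logic can be run against an in-memory credentials file (the key values below are made-up placeholders, not real credentials):

```python
from configparser import ConfigParser

# Stand-in for ~/.aws/credentials; all values are fake placeholders.
sample = """\
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = example-secret
aws_session_token = example-token
aws_expiration = 2030-01-01T00:00:00Z
"""

config = ConfigParser()
config.read_string(sample)

# Same mapping that refresh_external_credentials builds for botocore.
creds = {
    "access_key": config["default"]["aws_access_key_id"],
    "secret_key": config["default"]["aws_secret_access_key"],
    "token": config["default"]["aws_session_token"],
    "expiry_time": config["default"]["aws_expiration"],
}
print(creds["expiry_time"])
```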
GCP Cloud Storage
```python
from s3contents import GCSContentsManager

c = get_config()

c.NotebookApp.contents_manager_class = GCSContentsManager
c.GCSContentsManager.project = "{{ your-project }}"
c.GCSContentsManager.token = "~/.config/gcloud/application_default_credentials.json"
c.GCSContentsManager.bucket = "{{ GCP bucket name }}"
```
Note that the file ~/.config/gcloud/application_default_credentials.json assumes a POSIX system; it is written when you do gcloud init.
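s3contents does not validate the token path itself, so a small stand-alone check like the following (a sketch, not part of the library) can catch a missing or malformed credentials file before Jupyter starts:

```python
import json
import os.path

def credentials_ok(path):
    """Return True if the gcloud credentials file exists and parses as JSON.

    Sanity check to run before pointing GCSContentsManager.token at it.
    """
    expanded = os.path.expanduser(path)
    if not os.path.isfile(expanded):
        return False
    try:
        with open(expanded) as f:
            json.load(f)
    except ValueError:
        return False
    return True

print(credentials_ok("~/.config/gcloud/application_default_credentials.json"))
```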
Azure Blob Storage with SAS
```python
import os

from s3contents import BlobContentsManager
from s3contents.blobmanager import auth_shared_access_signature

token = auth_shared_access_signature(
    account_key=os.environ["AZURE_ACCOUNT_KEY"],
    account_name=os.environ["AZURE_ACCOUNT_NAME"]
)
c.NotebookApp.contents_manager_class = BlobContentsManager
c.BlobContentsManager.credential = token
c.BlobContentsManager.account_key = os.environ["AZURE_ACCOUNT_KEY"]
c.BlobContentsManager.container_name = os.environ["AZURE_CONTAINER_NAME"]
c.BlobContentsManager.account_name = os.environ["AZURE_ACCOUNT_NAME"]
```
Access local files
To access both local files and remote files on S3, you can use hybridcontents.

First install it:

```shell
pip install hybridcontents
```

Then use a configuration similar to this:
```python
from s3contents import S3ContentsManager
from hybridcontents import HybridContentsManager
from IPython.html.services.contents.filemanager import FileContentsManager

c = get_config()

c.NotebookApp.contents_manager_class = HybridContentsManager

c.HybridContentsManager.manager_classes = {
    # Associate the root directory with an S3ContentsManager.
    # This manager will receive all requests that don't fall under any of the
    # other managers.
    "": S3ContentsManager,
    # Associate /local_directory with a FileContentsManager.
    "local_directory": FileContentsManager,
}

c.HybridContentsManager.manager_kwargs = {
    # Args for the root S3ContentsManager.
    "": {
        "access_key_id": "{{ AWS Access Key ID / IAM Access Key ID }}",
        "secret_access_key": "{{ AWS Secret Access Key / IAM Secret Access Key }}",
        "bucket": "{{ S3 bucket name }}",
    },
    # Args for the FileContentsManager mapped to /local_directory.
    "local_directory": {
        "root_dir": "/Users/danielfrg/Downloads",
    },
}
```
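The prefix-based routing that HybridContentsManager performs can be pictured with a simplified sketch (an illustration of the idea, not the library's actual implementation):

```python
def route(path, managers):
    """Pick the manager key for a path: the first path component if it
    matches a configured prefix, otherwise the catch-all root ""."""
    prefix = path.strip("/").split("/")[0]
    return prefix if prefix in managers else ""

managers = {"": "S3ContentsManager", "local_directory": "FileContentsManager"}

print(route("local_directory/notebook.ipynb", managers))  # local_directory
print(route("notebooks/test.ipynb", managers))            # root, i.e. ""
```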