How do I verify an SSL certificate file (.crt) with Python?

Posted 2024-10-04 05:21:48


Basically, I used to have a process that loaded .csv files from a network share drive on Windows into a cloud location (Google Cloud Storage). I.T. made some changes to the network that broke this job, and it now raises errors like the following:

HTTPSConnectionPool(host='storage.googleapis.com', port=443): Max retries exceeded with url: /storage/v1/b/bucket_cbsm?projection=noAcl (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1045)')))

I am writing and testing my script in a Jupyter notebook (Python). Basically, I.T. gave me a .crt file that they say is the SSL certificate, and in the Google SDK shell I have confirmed that I can point gcloud commands at that file and run network diagnostics successfully. The problem is that I now need to somehow point my Python script at that .crt file location so that SSL can be verified and the rest of the script can run successfully. I know Python has an ssl library, but I am wondering whether anyone knows how to use it to verify against the .crt SSL certificate I have stored locally.
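
For reference, this is the kind of sanity check I had in mind with the ssl module: load the .crt into a context and open a TLS connection to the same host the storage client talks to. This is only a rough sketch, and C:\certs\corp_root_ca.crt is just a placeholder for wherever the file from I.T. is actually saved:

import socket
import ssl

CA_FILE = r"C:\certs\corp_root_ca.crt"  # placeholder path to the .crt from I.T.

# create_default_context(cafile=...) trusts only the certificate(s) in CA_FILE for this check
ctx = ssl.create_default_context(cafile=CA_FILE)

# wrap_socket() raises SSLCertVerificationError if the chain does not validate against CA_FILE
with socket.create_connection(("storage.googleapis.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="storage.googleapis.com") as tls:
        print("verified, peer subject:", tls.getpeercert()["subject"])

And here is the script as it stands today, the upload part that now fails with the SSL error above: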

from google.cloud import storage
from google.oauth2 import service_account
import os
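
# Tentative fix I am considering: as far as I can tell the storage client sends
# its HTTPS traffic through the requests library, which honors the
# REQUESTS_CA_BUNDLE environment variable, so pointing it at the .crt from I.T.
# before any calls go out seems like one option. The path below is a placeholder
# for wherever the file is actually saved; SSL_CERT_FILE can be set the same way
# for code that relies on the ssl module's default verification paths.
os.environ["REQUESTS_CA_BUNDLE"] = r"C:\certs\corp_root_ca.crt"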

proj_id = 'test-project'
client = storage.Client(
                    #credentials=credentials,
                    project=proj_id #credentials.project_id
)

def uploadGCS(mybucket,myfile,filename,foldname):
    #mybucket = name of the GCS bucket
    #myfile = filepath for the specific CSV you want to upload to GCS
    #filename = name to give the uploaded file (the uploaded blob is renamed to this)
    #foldname = folder prefix inside the bucket that the file is dropped into (e.g. "daily ftp load/archives/")
    bucket = client.get_bucket(mybucket)
    blob = bucket.blob(myfile)
    blob.upload_from_filename(myfile)
    bucket.rename_blob(blob, foldname+filename) #add "folderName/" before filename to indicate which folder to drop file in.

#intention below is to load each file (naming convention by date) into the GCS archive folder for storage, and to load the current-day FTP files into the loading folder for BQ
#csvoutput_path (local output folder) and curr_dtxl_str (current-date string used in the file names) are defined earlier in my script and not shown here
mybucket = 'bucket_cbsm'
foldname = 'daily ftp load/archives/'
loadfoldname = 'daily ftp load/current day/'
cadftp_path = csvoutput_path+'cadftpoutput'+curr_dtxl_str+'.csv'
usdftp_path = csvoutput_path+'usdftpoutput'+curr_dtxl_str+'.csv'

fixedbase_path = csvoutput_path+'fixedbase_'+curr_dtxl_str+'.csv'
prev_fixedbase_path =  csvoutput_path+'cadfixedprev_'+curr_dtxl_str+'.csv'
aa_path =csvoutput_path+'aacurr_'+curr_dtxl_str+'.csv'
csfs_path =csvoutput_path+'csfs_'+curr_dtxl_str+'.csv'
genrates_path = csvoutput_path+'genrates_'+curr_dtxl_str+'.csv'

cadftp_load_path = csvoutput_path+'cadftpoutput'+curr_dtxl_str+'.csv'
usdftp_load_path = csvoutput_path+'usdftpoutput'+curr_dtxl_str+'.csv'

cadftp_name = 'cadftpoutput'+curr_dtxl_str
usdftp_name = 'usdftpoutput'+curr_dtxl_str
cadftp_loadname = 'cadftpoutput'
usdftp_loadname = 'usdftpoutput'
fixedbase_name = 'fixedbase_'+curr_dtxl_str
prev_fixedbase_name =  'cadfixedprev_'+curr_dtxl_str
aa_name = 'aacurr_'+curr_dtxl_str
csfs_name = 'csfs_'+curr_dtxl_str
genrates_name = 'genrates_'+curr_dtxl_str



uploadGCS(mybucket,cadftp_path,cadftp_name,foldname) #uploads current day FTP file into archive storage
uploadGCS(mybucket,usdftp_path,usdftp_name,foldname) #uploads current day FTP file into archive storage
uploadGCS(mybucket,fixedbase_path,fixedbase_name,foldname) #uploads current day fixed base into archive storage
uploadGCS(mybucket,prev_fixedbase_path,prev_fixedbase_name,foldname) #uploads previous day fixed base into archive storage
uploadGCS(mybucket,aa_path,aa_name,foldname) #uploads current day aa file into archive storage
uploadGCS(mybucket,csfs_path,csfs_name,foldname) #uploads current day csfs file into archive storage
uploadGCS(mybucket,genrates_path,genrates_name,foldname) #uploads current day genrates file into archive storage

uploadGCS(mybucket,cadftp_path,cadftp_loadname,loadfoldname) #uploads current day FTP file into the loading folder for BQ 
uploadGCS(mybucket,usdftp_path,usdftp_loadname,loadfoldname) #uploads current day FTP file into the loading folder for BQ 
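
As a quick end-to-end check outside the storage client, the plan is to hit the same endpoint with requests and point verify= at the .crt (again a placeholder path); if the handshake succeeds here, the certificate file itself should be usable:

import requests

# requests' verify= parameter accepts a path to a CA bundle file
resp = requests.get("https://storage.googleapis.com", verify=r"C:\certs\corp_root_ca.crt")
print(resp.status_code)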

Tags: csv, path, name, storage, current, file, day, str