Using multiple Python functions in an AWS Lambda script

Published 2024-05-17 19:46:16


Situation

I'm working with a Lambda function that takes a CSV attachment from an incoming email and places it in a subfolder of an S3 bucket. That part of the Lambda works fine, but I need to execute additional user-defined functions within the same Lambda function to perform follow-up tasks.

Code

    import boto3

    import email
    import base64
    import urllib.parse  # needed below for urllib.parse.quote

    import math
    import pickle

    import numpy as np
    import pandas as pd

    import io
    
    
    ###############################
    ###    GET THE ATTACHMENT   ###
    ###############################
    
    #s3 = boto3.client('s3')
    
    
    FILE_MIMETYPE = 'text/csv'
    #'application/octet-stream'
    
    # destination folder
    S3_OUTPUT_BUCKETNAME = 'my_bucket' 
    
    print('Loading function')
    
    s3 = boto3.client('s3')
    
    
    def lambda_handler(event, context):
    
        #source email bucket 
        inBucket = event['Records'][0]['s3']['bucket']['name']
        key = urllib.parse.quote(event['Records'][0]['s3']['object']['key'].encode('utf8'))
    
    
        try:
            response = s3.get_object(Bucket=inBucket, Key=key)
            msg = email.message_from_string(response['Body'].read().decode('utf-8'))   
    
        except Exception as e:
            print(e)
            print('Error retrieving object {} from source bucket {}. Verify existence and ensure bucket is in same region as function.'.format(key, inBucket))
            raise e
        
    
        attachment_list = []
       
    
        try:
            #scan each part of email 
            for message in msg.walk():
                
                # Check filename and email MIME type
                # Check filename and email MIME type
                if message.get_content_type() == FILE_MIMETYPE and message.get_filename() is not None:
                    attachment_list.append({'original_msg_key': key, 'attachment_filename': message.get_filename(), 'body': base64.b64decode(message.get_payload())})
        except Exception as e:
            print(e)
            print ('Error processing email for CSV attachments')
            raise e
        
        # if there are multiple attachments, send them all to the bucket
        for attachment in attachment_list:

            try:
                s3.put_object(
                    Bucket=S3_OUTPUT_BUCKETNAME,
                    Key='attachments/' + attachment['original_msg_key'] + '-' + attachment['attachment_filename'],
                    Body=attachment['body']
                )
            except Exception as e:
                print(e)
                print('Error sending object {} to destination bucket {}. Verify existence and ensure bucket is in same region as function.'.format(attachment['attachment_filename'], S3_OUTPUT_BUCKETNAME))
                raise e

    #################################
    ###    ADDITIONAL FUNCTIONS   ###
    #################################

    def my_function():
        print("Hello, this is another function")

Result

The CSV attachment is retrieved successfully and placed in the destination specified in s3.put_object, but there is no evidence in the CloudWatch logs that my_function is running.
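In Lambda, both print() output and messages from Python's standard logging module are written to the function's CloudWatch log stream, so one quick way to check whether a helper ran is to log from inside it. A minimal sketch (the log message text is made up for illustration):

```python
# Minimal sketch: any logging output produced during a Lambda invocation
# appears in that function's CloudWatch log stream, so a log line inside
# a helper is direct evidence of whether it was called.
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def my_function():
    # This line shows up in CloudWatch only if my_function is called.
    logger.info("my_function was invoked")
```

If the log line never appears, the helper was never called, which points at the handler rather than at CloudWatch.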

What I've tried

I tried using def my_function(event, context): to see whether the function needs the same signature as the first function in order to execute. I also tried making my_function() part of the first function, but that didn't seem to work either.

How can I make sure both functions execute in the Lambda?


1 Answer

Based on the comments:

The problem is caused by the fact that my_function is never called inside the Lambda handler.

The solution is to add a call to my_function() inside lambda_handler so that my_function is actually invoked.
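A minimal sketch of the fix (the attachment-processing body is elided; the return value is added here only to make the call observable):

```python
def my_function():
    # Helper from the question; defining it at module level does nothing
    # by itself -- Lambda only invokes the configured handler.
    print("Hello, this is another function")
    return "ran"

def lambda_handler(event, context):
    # ... existing attachment-processing code would run here ...

    # Call the helper explicitly from the handler so it actually executes
    # during the invocation.
    result = my_function()
    return result
```

Lambda invokes only the single handler named in the function configuration; every other function in the file runs only if something reachable from that handler calls it.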
