
Here is what I want to do:

  • A user uploads a CSV file to an AWS S3 bucket.
  • Upon upload, the S3 bucket invokes the Lambda function that I have created.
  • My Lambda function reads the CSV file content, then sends an email with the file content and some info about the file.

Local environment:

Serverless framework version 1.22.0
Python 2.7

Here is my serverless.yml file:

service: aws-python # NOTE: update this with your service name

provider:
  name: aws
  runtime: python2.7
  stage: dev
  region: us-east-1
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "s3:*"
        - "ses:SendEmail"
        - "ses:SendRawEmail"
        - "s3:PutBucketNotification"
      Resource: "*"

functions:
  csvfile:
    handler: handler.csvfile
    description: send mail whenever a csv file is uploaded on S3
    events:
      - s3:
          bucket: mine2
          event: s3:ObjectCreated:*
          rules:
            - suffix: .csv

And here is my Lambda function:

import json
import boto3
import botocore
import logging
import sys
import traceback
import csv

from botocore.exceptions import ClientError
from pprint import pprint
from time import strftime, gmtime
from json import dumps, loads, JSONEncoder, JSONDecoder


# set up simple logging for INFO
logger = logging.getLogger()
logger.setLevel(logging.INFO)

from botocore.exceptions import ClientError

def csvfile(event, context):
    """Send email whenever a csvfile is uploaded to S3 """
    body = {}
    emailcontent = ''
    status_code = 200
    # set email information
    email_from = '****@*****.com'
    email_to = '****@****.com'
    email_subject = 'new file is uploaded'
    try:
        s3 = boto3.resource(u's3')
        s3 = boto3.client('s3')
        for record in event['Records']:
            filename = record['s3']['object']['key']
            filesize = record['s3']['object']['size']
            source = record['requestParameters']['sourceIPAddress']
            eventTime = record['eventTime']
        # get a handle on the bucket that holds your file
        bucket = s3.Bucket(u'mine2')
        # get a handle on the object you want (i.e. your file)
        obj = bucket.Object(key=event[u'Records'][0][u's3'][u'object'][u'key'])
        # get the object
        response = obj.get()
        # read the contents of the file and split it into a list of lines
        lines = response[u'Body'].read().split()
        # now iterate over those lines
        for row in csv.DictReader(lines):
            print(row)
            emailcontent = emailcontent + '\n' + row
    except Exception as e:
        print(traceback.format_exc())
        status_code = 500
        body["message"] = json.dumps(e)

    email_body = "File Name: " + filename + "\n" + "File Size: " + str(filesize) + "\n" + "Upload Time: " + eventTime + "\n" + "User Details: " + source + "\n" + "content of the csv file :" + emailcontent
    ses = boto3.client('ses')
    ses.send_email(Source=email_from,
                   Destination={'ToAddresses': [email_to, ]},
                   Message={'Subject': {'Data': email_subject}, 'Body': {'Text': {'Data': email_body}}}
                   )
    print('Function execution Completed')

I don't know what I did wrong. The part where I just get info about the file works fine; it's when I add the part that reads the file that the Lambda function doesn't return anything.

1 Answer


Add CloudWatch access to your IAM policy so the function's output actually shows up in the logs, and use logger.info(message) instead of just printing it out.
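As a minimal sketch (not part of the original answer), the extra statement in the serverless.yml above could look like the following; the three actions are the standard CloudWatch Logs permissions, and you can scope the Resource down to your own log groups instead of the wildcard ARN:

  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "logs:CreateLogGroup"
        - "logs:CreateLogStream"
        - "logs:PutLogEvents"
      Resource: "arn:aws:logs:*:*:*"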

This will work:

import logging
import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.client('s3')

def lambda_handler(event, context):
    email_content = ''

    # retrieve bucket name and file_key from the S3 event
    bucket_name = event['Records'][0]['s3']['bucket']['name']
    file_key = event['Records'][0]['s3']['object']['key']
    logger.info('Reading {} from {}'.format(file_key, bucket_name))
    # get the object
    obj = s3.get_object(Bucket=bucket_name, Key=file_key)
    # get the lines inside the csv
    lines = obj['Body'].read().split(b'\n')
    for r in lines:
        logger.info(r.decode())
        email_content = email_content + '\n' + r.decode()
    logger.info(email_content)
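To finish the original goal of emailing the file contents, the collected email_content can then be handed to SES. Below is a minimal, self-contained sketch (not part of the original answer): send_csv_email is a hypothetical helper you would call at the end of lambda_handler with email_content and file_key, and the two addresses are placeholders that must be replaced with identities verified in SES.

import boto3

def send_csv_email(email_content, file_key):
    """Email the collected CSV content via SES (sketch only)."""
    ses = boto3.client('ses')
    ses.send_email(
        Source='sender@example.com',                             # placeholder; must be a verified SES identity
        Destination={'ToAddresses': ['recipient@example.com']},  # placeholder recipient
        Message={
            'Subject': {'Data': 'new file is uploaded: ' + file_key},
            'Body': {'Text': {'Data': email_content}},
        },
    )

The ses:SendEmail and ses:SendRawEmail actions this call needs are already allowed by the iamRoleStatements in the serverless.yml from the question.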
