
I'm trying to create a Lambda function that is triggered when a new (JSON array) object is uploaded into an AWS S3 bucket:

  1. Reads the object from S3 (when triggered)
  2. Splits the JSON array into separate JSON objects
  3. Writes each (split) JSON object as a new object in a different S3 bucket

I'm not really sure how to start. I have my two buckets (and policies) defined already and the following code:

import json
import urllib.parse
import boto3

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    # Get the key of the uploaded object from the S3 event payload
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    source = s3.Object('dev-espc-ppmachine-raw', key)
    payload = source.get()['Body'].read()

    # Processing - split array into separate JSON objects (need code)

    # Write to "staged" S3 bucket (need code)

    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
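
For the two missing steps, here is a minimal, untested sketch of what the splitting and writing could look like, assuming the uploaded object is a top-level JSON array and assuming a placeholder destination bucket name dev-espc-ppmachine-staged (the helper name split_and_stage is made up for illustration):

import json
import boto3

s3 = boto3.resource('s3')

def split_and_stage(payload, source_key):
    # Parse the payload; this assumes it is a JSON array of objects
    items = json.loads(payload)

    staged_bucket = 'dev-espc-ppmachine-staged'  # placeholder bucket name

    for i, item in enumerate(items):
        # Derive a unique key per element, e.g. myfile-0.json, myfile-1.json, ...
        new_key = f'{source_key.rsplit(".", 1)[0]}-{i}.json'

        # Write each element as its own object in the staged bucket
        s3.Object(staged_bucket, new_key).put(
            Body=json.dumps(item).encode('utf-8'),
            ContentType='application/json'
        )

Inside lambda_handler this would be called as split_and_stage(payload, key) in place of the two "need code" comments.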


