I'm trying to create a Lambda function that is triggered when a new object (a JSON array) is uploaded to an AWS S3 bucket. The function should:
- Read the object from S3 when triggered (see the event-parsing sketch after this list)
- Split the JSON array into separate JSON objects
- Write each split JSON object as a new object in a different S3 bucket
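From what I can tell, the event that S3 passes to the handler contains the source bucket name and the object key (URL-encoded), so the read step would start from something like this helper (a rough sketch based on the standard s3:ObjectCreated notification shape; I haven't verified it end to end):

import urllib.parse

def resolve_filepath(event):
    # S3 puts the uploaded object's key under Records[0].s3.object.key,
    # URL-encoded (e.g. spaces become '+'), so decode it before use.
    record = event['Records'][0]['s3']
    return urllib.parse.unquote_plus(record['object']['key'])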
Beyond that, I'm not really sure how to proceed. I already have my two buckets (and their policies) defined, plus the following code:
import json
import urllib.parse

import boto3

# S3 resource client, created once and reused across invocations
s3 = boto3.resource('s3')

def lambda_handler(event, context):
    # Get the key of the uploaded object from the S3 event payload
    filepath = resolve_filepath(event)
    source = s3.Object('dev-espc-ppmachine-raw', filepath)
    payload = source.get()['Body'].read()

    # Processing - split array into separate JSON objects (need code)

    # Write to "staged" S3 bucket (need code)

    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
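Here is also a rough sketch of how I imagine the splitting and writing could work. The staged bucket name dev-espc-ppmachine-staged and the per-object key scheme are placeholders I made up, and I'm assuming the uploaded file is a top-level JSON array:

import json
import os

import boto3

s3 = boto3.resource('s3')

STAGED_BUCKET = 'dev-espc-ppmachine-staged'  # placeholder name for my second bucket

def split_and_stage(payload, filepath):
    # Parse the raw bytes into a Python list (assumes a top-level JSON array)
    records = json.loads(payload)

    base, _ = os.path.splitext(os.path.basename(filepath))
    for index, record in enumerate(records):
        # Write each element as its own object, e.g. myfile-0.json, myfile-1.json, ...
        key = f'{base}-{index}.json'
        s3.Object(STAGED_BUCKET, key).put(
            Body=json.dumps(record).encode('utf-8'),
            ContentType='application/json',
        )

In the handler I would then call split_and_stage(payload, filepath) where the two "(need code)" comments are. Does that look like a reasonable approach?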