I'm designing an application where an input file ranging from 1 to 30 GB is uploaded to an S3 bucket every 15 minutes. The application splits the file into "n" smaller files and distributes them across 3 S3 buckets in 3 different AWS regions. Then 3 loader applications read the files from their respective S3 buckets and load the data into their respective Aerospike clusters.
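For reference, here is a minimal sketch of the split step. The `split_records` helper name is my own, the records are assumed to be newline-delimited, and the chunks are held in memory; a real implementation for files this size would stream from S3 (e.g. with boto3 multipart upload) rather than load everything at once.

```python
def split_records(records, n):
    """Split a list of records into n contiguous, near-equal chunks.

    In the real application, each chunk would then be uploaded to one
    of the three regional S3 buckets, e.g. via boto3:
        s3.put_object(Bucket=bucket, Key=key, Body="\n".join(chunk))
    (bucket/key names here are hypothetical).
    """
    size, extra = divmod(len(records), n)
    chunks, start = [], 0
    for i in range(n):
        # The first `extra` chunks each take one leftover record.
        end = start + size + (1 if i < extra else 0)
        chunks.append(records[start:end])
        start = end
    return chunks
```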
I'd like to use AWS Lambda functions both to split the file and to load the data. I also recently came across AWS Step Functions, which looks like it could serve the same purpose. I'm not sure which one to go with, or which would be cheaper in terms of pricing.
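One point worth noting: Step Functions would orchestrate the same Lambda functions rather than replace them, so its per-state-transition charge comes on top of the usual Lambda invocation and duration charges. A sketch of what such a state machine might look like in Amazon States Language (all function ARNs here are hypothetical placeholders):

```json
{
  "Comment": "Hypothetical workflow: split the input file, then load the three outputs in parallel",
  "StartAt": "SplitFile",
  "States": {
    "SplitFile": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:us-east-1:111111111111:function:split-file",
      "Next": "LoadAllRegions"
    },
    "LoadAllRegions": {
      "Type": "Parallel",
      "End": true,
      "Branches": [
        {
          "StartAt": "LoadRegionA",
          "States": {
            "LoadRegionA": {
              "Type": "Task",
              "Resource": "arn:aws:lambda:us-east-1:111111111111:function:load-region-a",
              "End": true
            }
          }
        },
        {
          "StartAt": "LoadRegionB",
          "States": {
            "LoadRegionB": {
              "Type": "Task",
              "Resource": "arn:aws:eu-west-1:111111111111:function:load-region-b",
              "End": true
            }
          }
        },
        {
          "StartAt": "LoadRegionC",
          "States": {
            "LoadRegionC": {
              "Type": "Task",
              "Resource": "arn:aws:ap-southeast-1:111111111111:function:load-region-c",
              "End": true
            }
          }
        }
      ]
    }
  }
}
```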