
I have a fairly large amount of data (~30 GB, split into ~100 files) I'd like to transfer between S3 and EC2: when I fire up the EC2 instances I'd like to copy the data from S3 to the EC2 local disks as quickly as I can, and when I'm done processing I'd like to copy the results back to S3.

I'm looking for a tool that will do a fast, parallel copy of the data back and forth. I have several scripts hacked together, including one that does a decent job, so I'm not looking for pointers to basic libraries; I'm looking for something fast and reliable.
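For reference, one common approach is to parallelize the AWS CLI across the individual files, or to lean on the CLI's built-in concurrency. A minimal sketch, assuming the data sits under a single prefix (the bucket and prefix names here are placeholders):

```shell
#!/bin/sh
# Hypothetical sketch: parallel download of ~100 S3 objects with the AWS CLI.
BUCKET=my-bucket    # assumption: your bucket name
PREFIX=data/        # assumption: the prefix holding the files
mkdir -p ./data

# Option 1: list the keys and fetch up to 8 objects concurrently via xargs.
aws s3 ls "s3://$BUCKET/$PREFIX" | awk '{print $4}' \
  | xargs -n1 -P8 -I{} aws s3 cp "s3://$BUCKET/$PREFIX{}" "./data/{}"

# Option 2: rely on the CLI's own multipart/concurrent transfers.
aws s3 sync "s3://$BUCKET/$PREFIX" ./data/

# Uploading results back is symmetric:
# aws s3 sync ./results/ "s3://$BUCKET/results/"
```

`aws s3 sync` already issues concurrent requests; its parallelism can be tuned with `aws configure set default.s3.max_concurrent_requests <n>` if the defaults leave bandwidth on the table.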

1 Answer


Have you considered using Elastic Block Store (EBS) for storing your files instead of S3? You can think of an EBS volume as a hard disk that you can attach to your instance. You can store your data on a volume, detach it from one instance, and attach it to another; the data written by the previous instance remains accessible from the new one. This spares you from reading from and writing to S3 every time you want to persist data between EC2 instances.
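The attach-and-reuse workflow described above can be sketched as follows. This is a minimal example, assuming the volume shows up as `/dev/xvdf` (a common device name, though on Nitro-based instances it may appear as `/dev/nvme1n1`) and that `/mnt/data` is a mount point of your choosing:

```shell
#!/bin/sh
# Hypothetical sketch: use an attached EBS volume as a persistent local disk.
# Device name and mount point are assumptions; check `lsblk` on your instance.

sudo mkfs -t ext4 /dev/xvdf      # first use only: formats the volume (erases it)
sudo mkdir -p /mnt/data
sudo mount /dev/xvdf /mnt/data

# ... read and write your data under /mnt/data ...

sudo umount /mnt/data            # before detaching the volume
# Then detach via the console or `aws ec2 detach-volume`, attach it to the
# next instance with `aws ec2 attach-volume`, and mount it again there.
```

Note that a standard EBS volume can only be attached to one instance at a time, so this works for handing data off between instances, not for concurrent sharing.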
