in AWS by (19.1k points)

I have an Elastic MapReduce job that writes some files to S3, and I want to concatenate all the files to produce a single text file.

Currently, I'm manually copying the folder with all the files into our HDFS (hadoop fs -copyFromLocal), then running hadoop fs -getmerge and hadoop fs -copyToLocal to obtain the file.
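Roughly, the manual steps look like this (directory and file names are just placeholders):

# copy the downloaded output folder into HDFS
hadoop fs -copyFromLocal ./job-output hdfs:///tmp/job-output
# merge the part files into a single local file
hadoop fs -getmerge hdfs:///tmp/job-output ./merged.txt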

Is there any way to use hadoop fs directly on S3?

1 Answer

by (44.4k points)

getmerge expects a local destination (<localdst>), so it cannot write the merged file directly to S3.

Usage:

hadoop fs [generic options] -getmerge [-nl] <src> <localdst>
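
Note that only the destination has to be local; the source can be an S3 path. So one option is to run getmerge with the S3 output directory as the source and then upload the merged file back. A minimal sketch, assuming the cluster can resolve s3:// URIs (as EMR clusters normally can) and using placeholder bucket and path names:

# merge the S3 part files into a single local file
hadoop fs -getmerge s3://my-bucket/emr-output merged.txt
# upload the merged file back to S3
hadoop fs -put merged.txt s3://my-bucket/merged/output.txt

The intermediate merged.txt is written to local disk on the machine running the command, so there must be enough free space there for the combined output.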
