
Which of the following tools is used to import structured data from an RDBMS into Hadoop?

asked in Big Data Hadoop & Spark by (55.6k points)

a. Flume

b. Sqoop

c. both of the above

d. None of the above

1 Answer

answered by (119k points)

The correct answer is option B (Sqoop). Apache Sqoop is used to import structured data from an RDBMS such as MySQL or Oracle into HBase, Hive, or HDFS. Apache Sqoop can also be used to export data from HDFS back to an RDBMS.
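For illustration, here is a minimal sketch of a typical Sqoop import and export; the host, database, credentials, table names, and HDFS paths are placeholder assumptions, not values from the question:

sqoop import \
  --connect jdbc:mysql://dbhost/salesdb \
  --username dbuser --password dbpass \
  --table customers \
  --target-dir /user/hadoop/customers    # pull an RDBMS table into HDFS

sqoop export \
  --connect jdbc:mysql://dbhost/salesdb \
  --username dbuser --password dbpass \
  --table customers_export \
  --export-dir /user/hadoop/customers    # push data from HDFS back to the RDBMS

Adding --hive-import to the import command loads the table into Hive instead of plain HDFS files.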

If you wish to learn Hadoop from top experts, I recommend this Hadoop Certification course by Intellipaat.

