in Big Data Hadoop & Spark by (36k points)

a. Flume

b. Sqoop

c. both of the above

d. None of the above

1 Answer

by (86.2k points)

The correct answer is option B (Apache Sqoop). Apache Sqoop is used to import structured data from an RDBMS such as MySQL or Oracle into HBase, Hive, or HDFS. Sqoop can also export data from HDFS back into an RDBMS.
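As a minimal sketch of both directions, the commands below show a Sqoop import from MySQL into HDFS and an export back to MySQL. The connection string, database, table names, and HDFS paths are hypothetical placeholders; substitute values for your own environment.

```shell
# Import a MySQL table into HDFS.
# jdbc:mysql://dbserver:3306/sales, dbuser, customers, and the HDFS
# paths are placeholders for illustration only.
sqoop import \
  --connect jdbc:mysql://dbserver:3306/sales \
  --username dbuser -P \
  --table customers \
  --target-dir /user/hadoop/customers \
  --num-mappers 4

# Export data from HDFS back into an RDBMS table.
sqoop export \
  --connect jdbc:mysql://dbserver:3306/sales \
  --username dbuser -P \
  --table customers_export \
  --export-dir /user/hadoop/customers
```

Adding `--hive-import` to the import command loads the data into a Hive table instead of a plain HDFS directory.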
