
in Big Data Hadoop & Spark by (6.5k points)
Is it just for effectively storing data that is present in large amounts?

1 Answer

by (11.3k points)

Hadoop is used for more than storing Big Data; it is a framework to store AND process large amounts of data. HDFS (the Hadoop Distributed File System) is just one aspect of Hadoop: it splits data into blocks and replicates them across nodes as a fail-safe in case anything goes wrong with a disk or a node, while Hadoop's processing layer (MapReduce) works on those blocks in parallel (a minimal sketch of such a job follows the examples below). Here are some of the use cases of Hadoop in the real world:

Financial Sector:

Morgan Stanley uses Hadoop for key financial assessments. The scalability Hadoop provides, combined with the fact that it runs on commodity hardware, makes it one of the most valuable tools in big data analytics. Considering the amount of data these organisations produce, which can run upwards of petabytes, such tools become imperative. Similar examples include JPMorgan Chase & Co.

Healthcare Analytics:

This sector also generates enormous amounts of data. Rather than working from each individual case in isolation, practitioners can draw on the enormous backlog of historical cases that has already been generated to make diagnosing a patient's ailment faster, more cost-efficient, and more accurate. Another aspect of this is health insurance companies, which analyse these data-sets to understand their risks and rewards.

Similar examples of such industries are:

-Telecom Industry- Usage analytics covering call, data, and network usage.

-Retail- Customer analytics to understand which products sell best.
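To make the "store AND process" point concrete, here is a minimal sketch of a MapReduce job in Java, essentially Hadoop's classic word-count example. HDFS splits the input files into blocks and replicates them across DataNodes; the mapper below then runs in parallel on each input split, and the reducer aggregates the results. Treat it as an illustrative sketch rather than production code; the input and output HDFS paths are taken from the command line.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: Hadoop runs one map task per HDFS input split, in parallel
    // across the cluster, emitting (word, 1) for every token it sees.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: after the shuffle phase groups values by word, sum the counts.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory in HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

You would typically package this into a jar and run it with something like hadoop jar wordcount.jar WordCount /input /output (the jar name and paths here are placeholders), after which the per-word counts are written back into HDFS.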

If you're looking to learn Hadoop through projects taught by industry professionals, you can take up a good Big Data course.
