Intellipaat
0 votes
in Big Data Hadoop & Spark by (6.5k points)
It takes an inconveniently long time to start all the services (HDFS, YARN, etc.) on my cluster every time I log in before I can work. Is there a better way to do this?

1 Answer

0 votes
by (11.3k points)

The shortest way is to use a script that ships with the Hadoop distribution, provided the Hadoop paths are set correctly in your environment variables. Run the following command:
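The original post omits the actual command; the built-in script that starts all Hadoop daemons is `start-all.sh`, found under `$HADOOP_HOME/sbin` (the path below is a typical install location, adjust it for your setup):

```shell
# Start all Hadoop daemons (HDFS and YARN) in one step.
# Assumes $HADOOP_HOME/sbin is on your PATH; otherwise call it directly:
#   /usr/local/hadoop/sbin/start-all.sh   (example path)
start-all.sh

# Verify the daemons came up (NameNode, DataNode, ResourceManager, etc.)
jps
```

Note that `start-all.sh` is deprecated in recent Hadoop releases; the recommended equivalents are `start-dfs.sh` followed by `start-yarn.sh`, which start the HDFS and YARN daemons separately.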

Ideally, this will start all of the Hadoop ecosystem services in one go.

These are some basic shortcuts for spinning up a Hadoop environment quickly for learning and practice. I found a great big data course at Intellipaat that really got me comfortable with this technology.
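For the scripts above to work from any shell, the Hadoop directories must be on your `PATH`. A minimal sketch of the environment setup in `~/.bashrc`, assuming a typical install location of `/usr/local/hadoop` (adjust `HADOOP_HOME` to match your system):

```shell
# Hypothetical ~/.bashrc additions for a single-node Hadoop setup.
# HADOOP_HOME must point at your actual Hadoop install directory.
export HADOOP_HOME=/usr/local/hadoop

# bin/ holds client tools (hdfs, yarn); sbin/ holds the start/stop scripts.
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```

After editing, run `source ~/.bashrc` (or open a new shell) so the changes take effect.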
