0 votes
2 views
in Big Data Hadoop & Spark by (6.5k points)
Every time I log in, it takes an inconvenient amount of time to start all the services (HDFS, etc.) before I can work on my cluster. Is there a better way to do this?

1 Answer

0 votes
by (11.3k points)

The quickest way is to use a script that already ships with Hadoop, provided the Hadoop paths are set correctly in your environment variables. Run the following command:

start-all.sh

Ideally, this starts all the Hadoop daemons (NameNode, DataNode, ResourceManager, NodeManager, and so on) in one go. Note that start-all.sh is deprecated in Hadoop 2.x and later; the recommended replacements are start-dfs.sh and start-yarn.sh.
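On a recent Hadoop version, the same one-command convenience can be recovered with a small wrapper script. This is a sketch, not part of the original answer: it assumes HADOOP_HOME points at your Hadoop installation and that the sbin scripts are the standard ones shipped with Hadoop 2.x/3.x.

```shell
#!/usr/bin/env bash
# start-hadoop.sh -- start HDFS and YARN daemons in the recommended order.
# Assumes HADOOP_HOME is set and passwordless SSH to the cluster nodes works.

set -e

# Start the HDFS daemons (NameNode, DataNodes, SecondaryNameNode).
"$HADOOP_HOME/sbin/start-dfs.sh"

# Start the YARN daemons (ResourceManager, NodeManagers).
"$HADOOP_HOME/sbin/start-yarn.sh"

# List the running Java daemons to confirm everything came up.
jps
```

Save it somewhere on your PATH, mark it executable with `chmod +x start-hadoop.sh`, and you have a single command to bring the cluster up after each login without relying on the deprecated start-all.sh.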

These are some basic shortcuts for getting a Hadoop environment up and running faster for learning and practice. I found Intellipaat's big data course very helpful for getting comfortable with this technology.
