0 votes
1 view
in Big Data Hadoop & Spark by (6.1k points)
It takes a rather inconvenient amount of time to start all the services (HDFS, etc.) on my cluster every time I log in. Is there a better way to do this?

1 Answer

0 votes
by (11.3k points)

The quickest way is a script that already ships with the Hadoop distribution, provided the Hadoop paths are set correctly in your environment variables. Run the following command:

 start-all.sh

Ideally, this will start all of the Hadoop daemons in one go. Note that start-all.sh is deprecated in recent Hadoop releases; the recommended equivalent is to run start-dfs.sh followed by start-yarn.sh.
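If the scripts aren't on your PATH yet, a minimal shell-profile snippet could look like the following. This is a sketch assuming a typical installation under /usr/local/hadoop; adjust HADOOP_HOME to your setup, and the hstart/hstop alias names are just suggestions:

```shell
# Add to ~/.bashrc (HADOOP_HOME path is an assumption -- adjust to your installation)
export HADOOP_HOME=/usr/local/hadoop
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

# start-all.sh is deprecated in recent Hadoop releases;
# starting HDFS and YARN separately is the recommended equivalent:
alias hstart='start-dfs.sh && start-yarn.sh'
alias hstop='stop-yarn.sh && stop-dfs.sh'
```

After reloading the shell (source ~/.bashrc), running hstart brings up HDFS and YARN together, and jps should then list daemons such as NameNode, DataNode, ResourceManager, and NodeManager.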

These are some of the most basic shortcuts for getting a Hadoop environment up quickly for learning and practice. I got comfortable with this technology through a big data course at Intellipaat.


