Update the masters file, which lists the master nodes of the Hadoop cluster.
## Edit the file with the following command
hduser@HadoopMaster:/usr/local/hadoop/etc/hadoop$ sudo gedit masters
## Add the hostnames of the master nodes
HadoopMaster
Update slaves
Update the slaves file, which lists the slave nodes of the Hadoop cluster.
## Edit the file with the following command
hduser@HadoopMaster:/usr/local/hadoop/etc/hadoop$ sudo gedit slaves
## Add the hostnames of the slave nodes
HadoopSlave1
HadoopSlave2
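The hostnames listed in masters and slaves must resolve to the right machines on every node. A hypothetical /etc/hosts mapping is sketched below; the IP addresses are placeholders, not values from this tutorial, so substitute your cluster's actual addresses:

```
# /etc/hosts on every node -- example addresses only
192.168.1.10   HadoopMaster
192.168.1.11   HadoopSlave1
192.168.1.12   HadoopSlave2
```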
Distributing the Hadoop config files to all remaining nodes (master and slaves)
Use rsync to copy the configured Hadoop installation to the other nodes over the network.
# From HadoopMaster, push the installation to HadoopSlave1
sudo rsync -avxP /usr/local/hadoop/ hduser@HadoopSlave1:/usr/local/hadoop/
# From HadoopMaster, push the installation to HadoopSlave2
sudo rsync -avxP /usr/local/hadoop/ hduser@HadoopSlave2:/usr/local/hadoop/
The commands above copy everything under the hadoop folder to the slave nodes at /usr/local/hadoop, so you do not need to download and configure Hadoop again on each node. Only Java and rsync need to be installed on every node, and the JAVA_HOME path must match the one set in $HADOOP_HOME/etc/hadoop/hadoop-env.sh, which we already configured during the single-node Hadoop setup.