Hadoop Installation - Nantawat6510545543/big-data-summary GitHub Wiki

Hadoop Setup (single-node, on a VM)

1. Install Java

sudo apt update
sudo apt install openjdk-8-jdk
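To confirm the JDK installed correctly before moving on, and to find the install path needed later for JAVA_HOME (step 6):

```shell
# Version banner goes to stderr; expect "openjdk version 1.8.x"
java -version
# Resolve the real JDK directory by stripping /bin/javac from the javac path
readlink -f "$(which javac)" | sed 's:/bin/javac::'
```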

2. Create Hadoop User

sudo adduser hadoop
sudo usermod -aG sudo hadoop
sudo passwd hadoop

3. SSH Key Setup (as hadoop user)

Log in as the hadoop user:

ssh hadoop@<your-vm-ip>

Generate and configure key-based SSH:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
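Passwordless SSH to localhost is required by the Hadoop start scripts. A quick check (assumes sshd is running on the VM):

```shell
# Should print "SSH OK" without a password prompt; BatchMode fails fast instead of prompting
ssh -o BatchMode=yes localhost 'echo SSH OK'
# authorized_keys must not be group- or world-writable, or sshd will ignore it
stat -c '%a %n' ~/.ssh/authorized_keys
```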

4. Download and Extract Hadoop

wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz
tar xzf hadoop-3.2.1.tar.gz
mv hadoop-3.2.1 hadoop
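Before extracting, the tarball can optionally be checked against the digest Apache publishes alongside it (same URL with a .sha512 suffix). The published file's format varies, so compare the two values by eye:

```shell
# Fetch the published digest and compute the local one
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.1/hadoop-3.2.1.tar.gz.sha512
cat hadoop-3.2.1.tar.gz.sha512
sha512sum hadoop-3.2.1.tar.gz
```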

5. Set Environment Variables

Edit .bashrc:

nano ~/.bashrc

Add to the end:

export HADOOP_HOME=/home/hadoop/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

Apply the changes:

source ~/.bashrc
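After sourcing .bashrc, the new PATH can be confirmed from the same shell:

```shell
# Should print "Hadoop 3.2.1" plus build details;
# "command not found" means HADOOP_HOME or PATH above is wrong
hadoop version
```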

6. Set JAVA_HOME in Hadoop

Edit hadoop-env.sh:

nano $HADOOP_HOME/etc/hadoop/hadoop-env.sh

Add:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

7. Configure XML Files

core-site.xml

nano $HADOOP_HOME/etc/hadoop/core-site.xml

Paste inside the existing <configuration> element:

  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
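For reference, the complete file after the edit might look like the sketch below. fs.defaultFS is the current name for the older fs.default.name alias; Hadoop accepts either, but the deprecated form triggers a warning:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```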

hdfs-site.xml

nano $HADOOP_HOME/etc/hadoop/hdfs-site.xml

Paste inside the existing <configuration> element:

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/datanode</value>
  </property>

mapred-site.xml

nano $HADOOP_HOME/etc/hadoop/mapred-site.xml

Paste inside the existing <configuration> element:

  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>

yarn-site.xml

nano $HADOOP_HOME/etc/hadoop/yarn-site.xml

Paste inside the existing <configuration> element:

  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>

8. Format NameNode

hdfs namenode -format
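Formatting initializes the metadata directory configured in hdfs-site.xml. A quick sanity check afterwards (path assumes the namenode directory value used above):

```shell
# A successful format leaves a 'current' directory containing a VERSION file
ls /home/hadoop/hadoopdata/hdfs/namenode/current
```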

9. Start Hadoop Services

cd $HADOOP_HOME/sbin/
./start-all.sh    # deprecated wrapper; equivalent to ./start-dfs.sh followed by ./start-yarn.sh
jps
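On a healthy single-node setup, jps should list five Hadoop daemons in addition to Jps itself:

```shell
# Expected: DataNode, NameNode, NodeManager, ResourceManager, SecondaryNameNode
jps | sort
```

If any daemon is missing, check its log under $HADOOP_HOME/logs before continuing.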

10. Access Web Interfaces

  • NameNode: http://<yourip>:9870
  • ResourceManager: http://<yourip>:8088
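A short HDFS smoke test confirms the cluster accepts writes and reads end to end (file names here are arbitrary examples):

```shell
hdfs dfs -mkdir -p /user/hadoop            # create the user's home directory in HDFS
echo "hello hadoop" > /tmp/smoke.txt       # hypothetical local test file
hdfs dfs -put /tmp/smoke.txt /user/hadoop/ # upload it
hdfs dfs -cat /user/hadoop/smoke.txt       # should print: hello hadoop
```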