First Hadoop setup walkthrough
Software environment:
OS: RHEL 6
JDK: OpenJDK
Eclipse: Kepler
1. Install the JDK
1. Configure the environment variables: open /etc/profile and append the following lines:
export JAVA_HOME=/usr/lib/jvm/java-openjdk
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
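The JDK setup can be sanity-checked from a shell before moving on (a quick check, assuming the OpenJDK package is already installed at the path configured above):

```shell
# Reload the profile in the current shell, then confirm the JDK is visible
source /etc/profile
echo "$JAVA_HOME"   # should print /usr/lib/jvm/java-openjdk
java -version       # should report the OpenJDK runtime
javac -version      # confirms the compiler (needed to build MapReduce jobs) is present
```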
2. Install Hadoop
1. Download hadoop-2.2.0.tar.gz and extract it to /hadoop.
2. Configure the environment variables: open /etc/environment and append the following lines (note: PAM normally parses this file as plain KEY=VALUE pairs; the `export` form works here because step 3 sources it as a shell script):
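The extract step can be sketched as follows (run from wherever the tarball was downloaded; the /hadoop target path is the one this guide uses):

```shell
# Unpack the release and move it to the path the rest of this guide assumes
tar -xzf hadoop-2.2.0.tar.gz
mv hadoop-2.2.0 /hadoop
ls /hadoop/bin/hadoop   # the launcher script should now exist here
```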
export HADOOP_HOME=/hadoop
export PATH=/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/bin:/bin:/root/bin:/hadoop/bin:/hadoop/sbin
3. Load both sets of environment variables:
# source /etc/profile
# source /etc/environment
4. Verify that Hadoop was installed successfully:
# hadoop version
5. Install the Hadoop Eclipse plugin
1. Download hadoop-eclipse-kepler-plugin-2.2.0.jar (the plugin version must match the installed Hadoop version).
2. Copy it into the plugins folder of the Kepler Eclipse installation.
6. Edit the Hadoop configuration files
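The guide does not list the edits themselves; for a single-node (pseudo-distributed) Hadoop 2.2.0 setup, the files under /hadoop/etc/hadoop typically need at least the minimal entries below (localhost and port 9000 are the conventional defaults, not something this guide mandates), plus setting JAVA_HOME in hadoop-env.sh. Note that mapred-site.xml usually has to be created by copying mapred-site.xml.template.

```xml
<!-- /hadoop/etc/hadoop/core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- /hadoop/etc/hadoop/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>  <!-- single node: keep only one copy of each block -->
  </property>
</configuration>

<!-- /hadoop/etc/hadoop/mapred-site.xml (copied from mapred-site.xml.template) -->
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

<!-- /hadoop/etc/hadoop/yarn-site.xml -->
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
```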
7. Set up passwordless SSH login to localhost
# ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
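If key-based login still prompts for a password, file permissions are the usual culprit; a quick fix and check (these are standard OpenSSH requirements, not specific to this guide):

```shell
# sshd rejects keys whose files are group- or world-writable
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# should now run without a password prompt
ssh localhost 'echo ok'
```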
8. Format the HDFS filesystem
# hadoop namenode -format
(This form is deprecated in Hadoop 2.x; # hdfs namenode -format is the preferred equivalent.)
9. Start the daemons
# start-all.sh
(To stop the daemons: # stop-all.sh. Both scripts are deprecated in Hadoop 2.x; # start-dfs.sh followed by # start-yarn.sh is the preferred way.)
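Whether the daemons actually came up can be confirmed with jps, and the installation exercised end to end with the example jar that ships with Hadoop 2.2.0 (paths follow this guide's /hadoop layout; /input and /output are illustrative HDFS paths):

```shell
# List running JVMs; a healthy 2.2.0 single node shows NameNode, DataNode,
# SecondaryNameNode, ResourceManager and NodeManager (plus Jps itself)
jps

# Smoke test: run the bundled wordcount example on a small input file
hadoop fs -mkdir -p /input
hadoop fs -put /etc/hosts /input
hadoop jar /hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar \
    wordcount /input /output
hadoop fs -cat /output/part-r-00000
```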
10. Configure a DFS Location inside Eclipse
11. Create a new Map/Reduce project
...