【Hadoop-eclipse-plugin-3.2.0 on Linux】A string of errors (and a maddening amount of fiddling) while compiling the Hadoop plugin for Eclipse

2019-09-02 23:35:22

Preface: First, let me complain about myself a little. It took me four or five evenings and lunch breaks to get this done, and I only solved it just now. Honestly, I'm rather impressed with my own persistence!

Along the way I relied on other people's blog posts, and I'm very grateful to them. Writing blog posts is a good habit: it keeps you honest and helps others!

Main content:

1. You need to download three things: Hadoop, Eclipse, and the Hadoop-eclipse-plugin.

2. Hadoop and Eclipse can be downloaded directly from their official sites; if the official downloads are slow, a download manager such as Thunder (Xunlei) speeds things up. The Hadoop-eclipse-plugin has to be found on GitHub; preferably get the Hadoop 3.x version of the plugin (hadoop3x-eclipse-plugin).
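As a concrete example (the Hadoop URL below is the official Apache release archive; the exact Eclipse package name and the plugin repository name depend on which mirror and which fork you pick, so treat those as placeholders):

# Hadoop 3.2.0 from the Apache release archive
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.0/hadoop-3.2.0.tar.gz
# Eclipse: download the "IDE for Java EE Developers" Linux x86_64 tarball from eclipse.org
# Plugin: search GitHub for "hadoop3x-eclipse-plugin" and download the master branch as a ZIP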

3. Once Hadoop and Eclipse are installed, we get to the main topic of this post: compiling hadoop-eclipse-plugin-3.x.x. Since I always download the latest versions, my Hadoop is 3.2.0, so what I actually compile in this post is hadoop-eclipse-plugin-3.2.0.

4. First step: extract the hadoop3x-eclipse-plugin archive into /home/hadoop/hadoop-3.2.0. This creates an eclipse-hadoop3x-master directory, and everything that follows happens inside that directory.
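For instance, assuming the plugin was downloaded as a ZIP of the master branch into the hadoop user's home directory (the ZIP file name here is an assumption; adjust it to whatever GitHub actually gave you):

cd /home/hadoop/hadoop-3.2.0
unzip ~/eclipse-hadoop3x-master.zip    # creates ./eclipse-hadoop3x-master
cd eclipse-hadoop3x-master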

5. Edit the build.xml file under src/contrib/eclipse-plugin in that directory. The places marked "new add" in the code below are my changes.

 <!-- ===================== part one ===================== -->

 <!-- Override classpath to include Eclipse SDK jars -->
  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <!--pathelement location="${hadoop.root}/build/classes"/-->
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>

 <!-- new add start-->
<fileset dir="${hadoop.root}">
    <include name="**/*.jar" />
</fileset>
 <!-- new add end-->

 <!-- ===================== part two ===================== -->
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
    <echo message="contrib: ${name}"/>
    <javac
     encoding="${build.encoding}"
     srcdir="${src.dir}"
     includes="**/*.java"
     destdir="${build.classes}"
     debug="${javac.debug}"
     deprecation="${javac.deprecation}"
     includeantruntime="on">  <!-- new add -->
     <classpath refid="classpath"/>
    </javac>
  </target>

 <!-- ===================== part three ===================== -->
 <!-- Note: adapt this part to your own needs; the jar versions must match the ones
      shipped with your installed Hadoop, and must also line up with libraries.properties -->
    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration2-${commons-configuration2.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang3-${commons-lang3.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/javax.servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core4-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/woodstox-core-5.0.3.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/stax2-api-3.1.4.jar"  todir="${build.dir}/lib" verbose="true"/>

    <jar
      jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
        <attribute name="Bundle-ClassPath"
         value="classes/,
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration2-${commons-configuration2.version}.jar,
 lib/commons-httpclient-${commons-httpclient.version}.jar,
 lib/commons-lang3-${commons-lang3.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/javax.servlet-api-${servlet-api.version}.jar,
 lib/commons-io-${commons-io.version}.jar,
 lib/woodstox-core-5.0.3.jar,
 lib/stax2-api-3.1.4.jar"/>

A few things to note here:

1. The woodstox-core-5.0.3.jar and stax2-api-3.1.4.jar added in part three fix the problem where, after installing the plugin and starting Eclipse, clicking Add New Location does nothing.

2. The jars added under lib in part three must match the versions shipped with your own Hadoop installation exactly. You have to check them one by one (you will run into this in the next step), which is very time-consuming; a quick way to list them is shown right after these notes.

3. Remove depends="init, ivy-retrieve-common" from the compile target (already reflected in part two above). The explanation I saw elsewhere: with that attribute, Ant tries to resolve the corresponding jars and will simply hang there if they are missing, whereas compiling hadoop-eclipse-plugin does not actually need the common dependencies.

<target name="compile"  unless="skip.contrib">
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->

4. Also note that wherever build.xml refers to the share/hadoop/common/lib/ path under the Hadoop installation directory, check that it matches your actual Hadoop layout; if it does not, change it accordingly, otherwise the jars cannot be referenced.
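For notes 2 and 4, the quickest way I know of to see which jar versions your Hadoop actually ships is to list the lib directory (the path below assumes the installation location used in this post):

ls /home/hadoop/hadoop-3.2.0/share/hadoop/common/lib | grep -E 'commons-|slf4j|guava|netty|protobuf|log4j|jackson|htrace|woodstox|stax2|servlet'

The version numbers embedded in these file names are the ones that build.xml, via libraries.properties, has to resolve to.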

6. Edit the libraries.properties file under the ivy directory of the same tree. The only requirement is that the versions match the jars under share/hadoop in your Hadoop installation; for any entry you cannot find a corresponding jar for, leave it unchanged.

Here is my modified file:

#   Licensed under the Apache License, Version 2.0 (the "License");
#   you may not use this file except in compliance with the License.
#   You may obtain a copy of the License at
#
#       http://www.apache.org/licenses/LICENSE-2.0
#
#   Unless required by applicable law or agreed to in writing, software
#   distributed under the License is distributed on an "AS IS" BASIS,
#   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#   See the License for the specific language governing permissions and
#   limitations under the License.

#This properties file lists the versions of the various artifacts used by hadoop and components.
#It drives ivy and the generation of a maven POM

# This is the version of hadoop we are generating
hadoop.version=3.2.0 
hadoop-gpl-compression.version=0.1.0

#These are the versions of our dependencies (in alphabetical order)
apacheant.version=1.7.0
ant-task.version=2.0.10

asm.version=5.0.4 
aspectj.version=1.6.5
aspectj.version=1.6.11

checkstyle.version=4.2

commons-cli.version=1.2
commons-codec.version=1.11
commons-collections.version=3.2.2
commons-configuration2.version=2.1.1
commons-daemon.version=1.0.13
commons-httpclient.version=3.0.1
commons-lang3.version=3.7
commons-logging.version=1.1.13
commons-logging-api.version=1.1.13
commons-math.version=3.1.1
commons-el.version=1.0
commons-fileupload.version=1.2
commons-io.version=2.5
commons-net.version=3.6
core.version=3.1.1
coreplugin.version=1.3.2

hsqldb.version=2.3.4

ivy.version=2.1.0

jasper.version=5.5.12
jackson.version=1.9.13
#not able to figureout the version of jsp & jsp-api version to get it resolved throught ivy
# but still declared here as we are going to have a local copy from the lib folder
jsp.version=2.1
jsp-api.version=5.5.12
jsp-api-2.1.version=6.1.14
jsp-2.1.version=6.1.14
jets3t.version=0.6.1
jetty.version=6.1.26
jetty-util.version=6.1.26
jersey-core.version=1.19
jersey-json.version=1.19
jersey-server.version=1.19
junit.version=4.11
jdeb.version=0.8
jdiff.version=1.0.9
json.version=1.0
protobuf.version=2.5.0

kfs.version=0.1
netty.version=3.10.5.Final
log4j.version=1.2.17
lucene-core.version=2.3.1
htrace.version=4.1.0-incubating
mockito-all.version=1.8.5
jsch.version=0.1.45

oro.version=2.0.8

rats-lib.version=0.5.1

servlet.version=4.0.6
servlet-api.version=3.1.0
slf4j-api.version=1.7.25
slf4j-log4j12.version=1.7.25
guava.version=11.0.2

wagon-http.version=1.0-beta-2
xmlenc.version=0.52
xerces.version=1.4.4
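As a rough helper for filling in these values (only a sketch under the path assumptions above; the property names still have to be mapped by hand, e.g. protobuf-java on disk corresponds to protobuf.version in this file):

cd /home/hadoop/hadoop-3.2.0/share/hadoop/common/lib
for j in commons-cli commons-io commons-lang3 slf4j-api protobuf-java log4j; do
  # print "name.version=x.y.z" derived from each jar's file name
  ls ${j}-*.jar 2>/dev/null | sed -E "s/^${j}-(.*)\.jar$/${j}.version=\1/"
done

Each line it prints can be compared against, or pasted into, libraries.properties.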

7. Run the build. The README.md in the extracted eclipse-hadoop3x-master directory documents the exact invocation; here is my session:

[hadoop@hadoop01 ~]$ cd /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/src/contrib/eclipse-plugin
# you need root privileges to compile, otherwise it errors out
[hadoop@hadoop01 eclipse-plugin]$ su root
Password:
[root@hadoop01 eclipse-plugin]# ant jar -Dversion=3.2.0 -Dhadoop.version=3.2.0 -Declipse.home=/home/hadoop/eclipse -Dhadoop.home=/home/hadoop/hadoop-3.2.0
Buildfile: /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/src/contrib/eclipse-plugin/build.xml

compile:
     [echo] contrib: eclipse-plugin

jar:
      [jar] Building jar: /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-3.2.0.jar

BUILD SUCCESSFUL
Total time: 5 seconds
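As a quick sanity check (not part of the official steps), you can list the contents of the freshly built plugin jar and confirm that the dependency jars really landed under lib/:

jar tf /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-3.2.0.jar | grep '^lib/'

Every jar named in the Bundle-ClassPath attribute of the manifest should show up in this listing.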

A few points: first, read README.md thoroughly so you understand how to compile the hadoop-eclipse-plugin; second, use root privileges, otherwise compilation fails. In fact I had already set all the parameters correctly the day before; I simply was not running as root, which is why the build would not succeed.

A side rant: the build itself had succeeded, but I had downloaded the wrong Eclipse edition (the C/C++ one), so the Hadoop Map/Reduce entry never appeared in Eclipse. You need the eclipse-jee-linux-gtk-x86_64.tar.gz package (Eclipse IDE for Java EE Developers).

8. Copy the generated hadoop-eclipse-plugin-3.2.0.jar into Eclipse's plugins directory and start Eclipse.
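Concretely (assuming Eclipse is installed in /home/hadoop/eclipse, the same path passed to -Declipse.home above):

cp /home/hadoop/hadoop-3.2.0/eclipse-hadoop3x-master/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-3.2.0.jar /home/hadoop/eclipse/plugins/
/home/hadoop/eclipse/eclipse -clean   # -clean makes Eclipse rescan its plugins directory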

9. Open Window ===> Preferences; if the Hadoop Map/Reduce page shows up, the plugin compiled successfully.

10. You are not quite done at the previous step, though: you still need to set the Hadoop installation directory on the Hadoop Map/Reduce preference page and then go through the subsequent steps to confirm the plugin really works; otherwise you will have to go back, adjust the parameters, and recompile.

【new error】Well, after publishing this post I kept tinkering with the plugin over the following days. I assumed it was usable, and in fact it was, but the folders under DFS Locations simply would not open. At first it threw the error org/apache/htrace/core/Tracer$Builder; after a long time going in circles with no idea what else to try, it ended up reporting No FileSystem for scheme: hdfs. I kept assuming the plugin I had compiled was broken, but that was not the cause.

After a lot of searching online, I finally found the fix by following the steps in "https://blog.csdn.net/junior19/article/details/79594192"; at any rate I can now see the folders. Sigh...

The fix: in Edit Hadoop location ===> Advanced parameters, find hadoop.tmp.dir and set it to the same value as in the core-site.xml you used when configuring Hadoop.
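For reference, a minimal core-site.xml of the kind this value has to match could look like the snippet below (the host name, port, and temp path are placeholders, not the exact values from my cluster):

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop01:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/hadoop-3.2.0/tmp</value>
  </property>
</configuration>

Whatever hadoop.tmp.dir is set to here is what goes into the plugin's Advanced parameters tab, and the DFS Master host/port of the location should match fs.defaultFS.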

For other problems you may run into, see the articles below; these are the posts I consulted while compiling, and my thanks to their authors once again!

【Article 1】https://www.cnblogs.com/ljy2013/articles/4417933.html

【Article 2】https://www.cnblogs.com/zhangchao0515/p/7099002.html

【Article 3】https://www.cnblogs.com/PurpleDream/p/4014751.html

【Article 4】https://www.cnblogs.com/sissie-coding/p/9449941.html

Appendix: the complete build.xml:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at
       http://www.apache.org/licenses/LICENSE-2.0
   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<project default="jar" name="eclipse-plugin">

  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">

    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>
      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset> 
  </path>

  <path id="hadoop-sdk-jars">
    <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
      <include name="hadoop*.jar"/>
    </fileset> 
    <fileset dir="${hadoop.home}/share/hadoop/hdfs">
      <include name="hadoop*.jar"/>
    </fileset> 
    <fileset dir="${hadoop.home}/share/hadoop/common">
      <include name="hadoop*.jar"/>
    </fileset> 
  </path>



  <!-- Override classpath to include Eclipse SDK jars -->
  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <!--pathelement location="${hadoop.root}/build/classes"/-->
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>

 <!-- new add -->
<fileset dir="${hadoop.root}">
    <include name="**/*.jar" />
</fileset>

  </path>

  <!-- Skip building if eclipse.home is unset. -->
  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

 <target name="compile"  unless="skip.contrib"> 
<!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
    <echo message="contrib: ${name}"/>
    <javac
     encoding="${build.encoding}"
     srcdir="${src.dir}"
     includes="**/*.java"
     destdir="${build.classes}"
     debug="${javac.debug}"
     deprecation="${javac.deprecation}"
     includeantruntime="on">
     <classpath refid="classpath"/>
    </javac>
  </target>

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/common">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/hdfs">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/yarn">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>

    <copy  todir="${build.dir}/classes" verbose="true">
          <fileset dir="${root}/src/java">
           <include name="*.xml"/>
          </fileset>
    </copy>



    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration2-${commons-configuration2.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang3-${commons-lang3.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 
    <copy file="${hadoop.home}/share/hadoop/common/lib/javax.servlet-api-${servlet-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>  
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-io-${commons-io.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core4-${htrace.version}.jar"  todir="${build.dir}/lib" verbose="true"/> 
    <copy file="${hadoop.home}/share/hadoop/common/lib/woodstox-core-5.0.3.jar"  todir="${build.dir}/lib" verbose="true"/> 
    <copy file="${hadoop.home}/share/hadoop/common/lib/stax2-api-3.1.4.jar"  todir="${build.dir}/lib" verbose="true"/> 

    <jar
      jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
   <attribute name="Bundle-ClassPath" 
    value="classes/, 
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-${commons-cli.version}.jar,
 lib/commons-configuration2-${commons-configuration2.version}.jar,
 lib/commons-httpclient-${commons-httpclient.version}.jar,
 lib/commons-lang3-${commons-lang3.version}.jar,  
 lib/commons-collections-${commons-collections.version}.jar,  
 lib/jackson-core-asl-${jackson.version}.jar,
 lib/jackson-mapper-asl-${jackson.version}.jar,
 lib/slf4j-log4j12-${slf4j-log4j12.version}.jar,
 lib/slf4j-api-${slf4j-api.version}.jar,
 lib/guava-${guava.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/netty-${netty.version}.jar,
 lib/javax.servlet-api-${servlet-api.version}.jar,  
 lib/commons-io-${commons-io.version}.jar,  
 lib/woodstox-core-5.0.3.jar,
 lib/stax2-api-3.1.4.jar"/> 
   </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <!--fileset dir="${build.dir}" includes="*.xml"/-->
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

</project>

Appendix: README.md

hadoop3x-eclipse-plugin
=======================

eclipse plugin for hadoop 3.x.x
 

How to build
----------------------------------------

 [hdpusr@demo hadoop2x-eclipse-plugin]$ cd src/contrib/eclipse-plugin 

 # Assume hadoop installation directory is /usr/share/hadoop

 [hdpusr@apclt eclipse-plugin]$ ant jar -Dversion=3.1.0 -Dhadoop.version=3.1.0 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop

final jar will be generated at directory 

  ${hadoop2x-eclipse-plugin}/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.4.1.jar

  
release version included
-------------------------------------
 
  release/hadoop-eclipse-kepler-plugin-2.4.1.jar  # not tested yet
 
  release/hadoop-eclipse-kepler-plugin-2.2.0.jar  
  

options required
--------------------------------------
  version: plugin version
  
  hadoop.version:  hadoop version you want to compiled with

  eclipse.home: path of eclipse home 

  hadoop.home: path of hadoop 3.x home

 

How to debug
--------------------------------------
  start eclipse with debug parameter:  

    /opt/eclipse/eclipse -clean -consolelog -debug
    

Note: compile issues resolve: 
--------------------------------------
1. For different hadoop, adjust ${hadoop2x-eclipse-plugin-master}/ivy/libraries.properties, to match hadoop dependency lib version.
2. modify ${hadoop3x-eclipse}/src/contrib/eclipse-plugin/build.xml, in the node: <attribute name="Bundle-ClassPath" ....  to add the jar needed. 
3. For eclipse, do **not** use eclipse-inst, install eclipse **FULL VERSION(SDK)** instead.