While running Hadoop, the following warning appeared:
Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Roughly, it means the native library does not match the platform.
After searching for a fix, I found the cause: the official Hadoop release ships 32-bit native binaries, while my system is 64-bit. (Running Hadoop on a 32-bit Ubuntu machine produces no such warning.) I later downloaded a 64-bit native library, but it was built against 2.7.0 and my Hadoop is 2.7.2, so it still failed.
In the end I had no choice but to recompile Hadoop myself.
Why recompile Hadoop? Because of the mismatch described above. The work required is as follows.
Step 1: install gcc with yum
yum -y install gcc
yum -y install gcc gcc-c++ gdb
I ran the first command first, then discovered that g++ was needed as well, so I ran the second.
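To confirm the compilers are actually in place, a quick check (the output will vary by system):
gcc --version
g++ --version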
Step 2: install protobuf
If it is not installed, the build will warn you:
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
Build and install protobuf as follows:
tar jxvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure
make
sudo make install
protoc --version
Since protobuf was hosted at Google and can no longer be downloaded from there, you can use the link below to get it.
Step 3: install maven
Hadoop is an Apache project and the build downloads a large number of dependencies, so Maven must be installed. For how to install Maven, you can refer to this article:
Configuring Maven and Tomcat for Eclipse on CentOS 7 (Linux) and successfully running a Maven project
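If you would rather install Maven by hand, a minimal sketch using a binary release looks like this (the 3.3.9 version and /usr/local path are assumptions; substitute whatever you actually download):
tar zxvf apache-maven-3.3.9-bin.tar.gz -C /usr/local
export M2_HOME=/usr/local/apache-maven-3.3.9
export PATH=$M2_HOME/bin:$PATH
mvn -version
If the PATH is set correctly, the last command prints the Maven version.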
Step 4: compile hadoop 2.7.2
In the Hadoop source directory, run:
mvn clean package -Pdist,native -DskipTests -Dtar
Then comes a long wait.
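One tip for that wait: if the build dies with an OutOfMemoryError, giving Maven a larger heap before rerunning the command above usually helps (the sizes here are just an assumed starting point):
export MAVEN_OPTS="-Xms256m -Xmx512m"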
The compiled hadoop-2.7.2.tar.gz ends up in the hadoop-dist/target/ directory under the source tree:
main:
     [exec] $ tar cf hadoop-2.7.2.tar hadoop-2.7.2
     [exec] $ gzip -f hadoop-2.7.2.tar
     [exec]
     [exec] Hadoop dist tar available at: /mysoft/maxthon/download/hadoop-2.7.2-src/hadoop-dist/target/hadoop-2.7.2.tar.gz
     [exec]
[INFO] Executed tasks
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /mysoft/maxthon/download/hadoop-2.7.2-src/hadoop-dist/target/hadoop-dist-2.7.2-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.144 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 2.190 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 5.152 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.387 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.681 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 4.314 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 4.051 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 5.123 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 4.351 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:22 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.002 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 18.206 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 1.541 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [03:50 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 31.090 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.375 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 6.908 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.086 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.078 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 42.290 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 33.780 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.234 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 10.121 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 25.813 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 3.522 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 8.874 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 21.346 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 5.444 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 7.424 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 3.477 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.061 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 3.048 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.808 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.304 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 7.824 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 14.210 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.089 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 26.171 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 17.911 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 3.927 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 10.348 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 5.593 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 7.499 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.965 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 5.678 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 4.129 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 5.140 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 16.364 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.879 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 5.779 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 4.641 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.016 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 2.347 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 4.353 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 14.688 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 4.571 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [01:29 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 41.832 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 14.146 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.933 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 4.529 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.218 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.022 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:03 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:33 min
[INFO] Finished at: 2016-05-22T06:46:31+08:00
[INFO] Final Memory: 192M/568M
[INFO] ------------------------------------------------------------------------
I have already built it, and you can also download my build directly.
Note: if, while compiling Hadoop (or gcc, or protobuf), the build complains that something else is missing, go download and install it yourself; after all, everyone's system environment is different.
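On CentOS, the packages most often reported missing for the native build are cmake plus the zlib and openssl headers; a sketch of installing them (the exact list depends on your environment):
yum -y install cmake zlib-devel openssl-devel bzip2-devel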
For instance, if compiling Hadoop fails with the following error:
[ERROR] Failed to execute goal on project hadoop-hdfs: Could not resolve dependencies for project org.apache.hadoop:hadoop-hdfs:jar:2.7.2: Failed to collect dependencies at commons-daemon:commons-daemon:jar:1.0.13: Failed to read artifact descriptor for commons-daemon:commons-daemon:jar:1.0.13: Could not transfer artifact commons-daemon:commons-daemon:pom:1.0.13 from/to repository.jboss.org (http://repository.jboss.org/nexus/content/groups/public/): repository.jboss.org: unknown error: Unknown host repository.jboss.org: unknown error -> [Help 1]
then you need to download the hadoop-hdfs jar manually.
Its POM coordinates are as follows:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.2</version>
</dependency>
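After downloading the jar manually, one way to put it where Maven can find it is to install it into the local repository by hand (a sketch; the -Dfile path is wherever you saved the jar):
mvn install:install-file -DgroupId=org.apache.hadoop -DartifactId=hadoop-hdfs -Dversion=2.7.2 -Dpackaging=jar -Dfile=/path/to/hadoop-hdfs-2.7.2.jar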
Note: if, when running the compiled 64-bit Hadoop, you still see "Unable to load native-hadoop library for your platform… using builtin-java classes where applicable",
you can eliminate it as follows:
modify HADOOP_OPTS in hadoop-env.sh.
The modified code looks like this:
# Extra Java runtime options. Empty by default.
#export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib:$HADOOP_PREFIX/lib/native"