Common errors and fixes when deploying Hadoop on RedHat Linux 5

Problems encountered:

1. Running hadoop-daemon.sh start datanode from the Hadoop conf directory fails to bring up the DataNode:

[hadoop@master conf]$ hadoop-daemon.sh start datanode

Warning: $HADOOP_HOME is deprecated.

 

starting datanode, logging to /opt/modules/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-hadoop-datanode-master.out

[hadoop@master conf]$ jps

4662 NameNode

4819 Jps

[hadoop@master conf]$ tail -300f /opt/modules/hadoop/hadoop-1.1.2/libexec/../logs/hadoop-hadoop-datanode-master.log

The DataNode log shows the error:

Incompatible namespaceIDs in /data/hadoop/hdfs/data: namenode namespaceID = 573007068; datanode namespaceID = 1802250800

Solution: the first approach commonly described online was used here: delete the data directory (the dfs.data.dir path configured in hdfs-site.xml), then restart the DataNode.

[hadoop@master conf]$ rm -rf /data/hadoop/hdfs/data

[hadoop@master conf]$ ll /data/hadoop/hdfs/

total 8

drwxrwxr-x 5 hadoop hadoop 4096 Jul 6 08:17 name

drwxrwxr-x 2 hadoop hadoop 4096 Jul 2 06:59 namesecondary

[hadoop@master conf]$ hadoop-daemon.sh start datanode
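Note that deleting dfs.data.dir also wipes any HDFS block data stored on that node. A less destructive alternative is to rewrite the DataNode's namespaceID to match the NameNode's. The sketch below demonstrates the edit on a throwaway copy of the VERSION file; on a real node the file lives under ${dfs.data.dir}/current/VERSION, and the field values here are illustrative, taken from the log above:

```shell
# Alternative fix: align the DataNode's namespaceID with the NameNode's
# instead of wiping the data directory. A throwaway copy of VERSION is
# edited here; on a real node, stop the datanode first:
#   hadoop-daemon.sh stop datanode
NN_ID=573007068                      # namespaceID reported in the error message

DEMO=$(mktemp -d)
cat > "$DEMO/VERSION" <<'EOF'
namespaceID=1802250800
storageID=DS-demo
cTime=0
storageType=DATA_NODE
layoutVersion=-32
EOF

# Rewrite only the namespaceID line, leaving the rest of the file intact.
sed -i "s/^namespaceID=.*/namespaceID=${NN_ID}/" "$DEMO/VERSION"
grep '^namespaceID=' "$DEMO/VERSION"   # namespaceID=573007068
```

After the edit, restarting the DataNode (hadoop-daemon.sh start datanode) should succeed without losing the existing blocks.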

2. Error when starting the JobTracker:

2013-07-06 10:03:09,584 INFO org.apache.hadoop.mapred.JobTracker: problem cleaning system directory: hdfs://master:9000/data/hadoop/mapred/mrsystem

org.apache.hadoop.ipc.RemoteException: java.io.FileNotFoundException: Parent path is not a directory: /data

Cause: a regular file named /data exists in HDFS (note: a file, not a directory), which conflicts with the configured mapred system directory path. This can also be seen on the NameNode web UI:

NameNode 'master:9000'

Started: Sat Jul 06 09:54:05 PDT 2013

Version: 1.1.2, r1440782

Compiled: Thu Jan 31 02:03:24 UTC 2013 by hortonfo

Upgrades: There are no upgrades in progress.

 


Solution: delete that file with hadoop fs -rmr /data, and the JobTracker starts normally.
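The HDFS error mirrors ordinary filesystem semantics: a path component that is a regular file cannot have children. The following local shell analogue reproduces the conflict (purely illustrative; the real fix is the hadoop fs -rmr command above):

```shell
# Local analogue of "Parent path is not a directory": a regular file named
# "data" blocks creation of anything beneath it, just as the file /data in
# HDFS blocked /data/hadoop/mapred/mrsystem.
WORK=$(mktemp -d)
touch "$WORK/data"                                  # a *file* named data
mkdir -p "$WORK/data/hadoop" 2>/dev/null && echo created || echo blocked
# prints: blocked

rm "$WORK/data"                                     # remove the offending file
mkdir -p "$WORK/data/hadoop" && echo created
# prints: created
```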

3. Error when running a jar exported from Eclipse on the Hadoop cluster:

[hadoop@master hadoop-1.1.2]$ hadoop jar hdfs1.jar com.hadoop.hdfs.HDFSMkdir

Warning: $HADOOP_HOME is deprecated.

 

Exception in thread "main" java.lang.UnsupportedClassVersionError: com/hadoop/hdfs/HDFSMkdir : Unsupported major.minor version 51.0

Cause: the Java version used to compile in Eclipse is newer than the Java version running Hadoop. Class file major version 51.0 corresponds to Java 7, so the jar was compiled with JDK 7 while the cluster is running an older JRE.

Solution: change the Java compiler compliance level in Eclipse to match the cluster's Java version and re-export the jar.
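Major version 51 maps to Java 7 (50 = Java 6, 49 = Java 5). The version of any .class file can be read from bytes 6 and 7 of its header, which is a quick way to diagnose this error without Eclipse. The sketch below builds a minimal fake header rather than relying on a compiled class, so the byte values are illustrative:

```shell
# A .class file begins with: CA FE BA BE, minor version (2 bytes),
# major version (2 bytes, big-endian). Major 51 means Java 7.
# A hand-built 8-byte header stands in for a real Java 7 class here.
CLASS=$(mktemp)
printf '\xca\xfe\xba\xbe\x00\x00\x00\x33' > "$CLASS"

# Read the major version: skip 6 bytes, take 2, combine big-endian.
MAJOR=$(od -An -j6 -N2 -t u1 "$CLASS" | awk '{print $1*256 + $2}')
echo "class file major version: $MAJOR"   # 51 -> requires a Java 7+ runtime
```

Running the same check against a class inside the exported jar (after unzip) shows immediately whether it was compiled for a newer JVM than the cluster provides.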
