Running my first Hadoop program in Eclipse: No FileSystem for scheme: hdfs
Today I wrote my first Hadoop program in Eclipse: a simple client that creates a directory on the server.
The code is as follows:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Hello hadoop!
 */
public class Client {

    private FileSystem fs;

    public void getFS() throws IOException {
        Configuration conf = new Configuration();
        // Point at the NameNode; the hdfs:// scheme prefix is required,
        // otherwise the address is not parsed as an HDFS URI.
        conf.set("fs.defaultFS", "hdfs://MyCentOS:9000");
        // Number of block replicas
        conf.set("dfs.replication", "2");
        fs = FileSystem.get(conf);
    }

    public void Mkdir() throws IllegalArgumentException, IOException {
        getFS();
        fs.mkdirs(new Path("/abcde/"));
        System.out.println("1111111111111");
    }

    public static void main(String[] args) {
        Client client = new Client();
        try {
            client.Mkdir();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
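To double-check that the directory actually landed on the cluster, a quick existence check from the client side works too. A minimal standalone sketch (CheckDir is a hypothetical name of my own; it assumes the same NameNode address as above):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckDir {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://MyCentOS:9000"); // same NameNode as Client
        FileSystem fs = FileSystem.get(conf);
        // exists() returns true once Mkdir() has run successfully
        System.out.println("/abcde/ exists: " + fs.exists(new Path("/abcde/")));
        fs.close();
    }
}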
Running it failed with the following output:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.fs.FileSystem).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2644)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)

The main cause of this error is that I had not pulled in the hadoop-hdfs jar, so no FileSystem implementation was registered for the hdfs:// scheme.
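As an aside, the same error can appear even with hadoop-hdfs on the classpath when the project is packaged into a shaded/fat jar, because the META-INF/services registration files from hadoop-common and hadoop-hdfs overwrite each other during merging. In that case a common workaround (not needed for this simple Eclipse setup) is to name the implementation classes in the Configuration explicitly, for example inside getFS():

// Map each URI scheme straight to its FileSystem class so the
// ServiceLoader-based lookup is bypassed.
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());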
Adding the following dependency to the project's pom.xml fixes it:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.2</version>
</dependency>

The complete dependencies section of the pom.xml now looks like this:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.7.2</version>
    </dependency>
</dependencies>
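After saving the pom.xml, Maven pulls in hadoop-hdfs, and re-running Client prints the marker line and creates the directory. Assuming the standard Hadoop CLI is available on the server, the result can also be verified with hdfs dfs -ls /, which should now list /abcde among the root directories.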