Java HDFS client exception when connecting to Hadoop: NoSuchMethodError RPC.getProxy

hadoop | 2019-09-13 10:02:39

I'm using Java to work with HDFS. Testing from a main() method runs fine, but as soon as the project is packaged and deployed it throws one error after another.
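For context, the access code is roughly of this shape (a minimal sketch of my own; the namenode address and path are placeholders, not the real values from the project):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder namenode URI -- replace with the real cluster address
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);
        // Simple sanity check: list the root directory
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}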


The error at first was:

java.io.IOException: No FileSystem for scheme: hdfs
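This error usually means the implementation of the hdfs:// scheme (org.apache.hadoop.hdfs.DistributedFileSystem) cannot be located at runtime, either because hadoop-hdfs is missing from the runtime classpath or because a fat-jar build overwrote the META-INF/services/org.apache.hadoop.fs.FileSystem service file while merging. A commonly used workaround (a hedged sketch only, not the fix I eventually applied) is to register the filesystem classes explicitly on the Configuration, reusing the imports from the sketch above:

Configuration conf = new Configuration();
// Register the filesystem classes directly so scheme lookup does not
// depend on the ServiceLoader files packed into the jar.
conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
conf.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:9000"), conf);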


Later the error turned into:

java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;)Lorg/apache/hadoop/ipc/VersionedProtocol;
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:93)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2701)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2683)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:372)
at com.ajia.data.synchronize.service.dataAnalysis.impl.DataAnalysisImpl.dataAnalysisSparkTask(DataAnalysisImpl.java:230)
at com.ajia.data.synchronize.quartz.DataAnalysisTask.run(DataAnalysisTask.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:269)
at org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean$MethodInvokingJob.executeInternal(MethodInvokingJobDetailFactoryBean.java:322)
at org.springframework.scheduling.quartz.QuartzJobBean.execute(QuartzJobBean.java:112)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)


I thought about this carefully: why does it run when tested from main() but fail once packaged? Maybe the packaging itself was wrong, so I started inspecting the pom and the packaged jar.


pom.xml

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.7</version>
</dependency>


Solution:

I eventually found that the packaged build contained an extra hadoop-core jar. I have no idea which dependency pulled it in; I certainly never declared it myself. After deleting it from the built package everything ran normally. The point is that its classes conflict with the Hadoop jars I did declare (judging from the stack trace, the old Hadoop 1.x DFSClient from hadoop-core was loaded and called an RPC.getProxy signature that no longer exists in the 2.7.7 jars). So my stopgap for now is to declare hadoop-core explicitly and mark it provided so it gets filtered out of the package; later I'll look for a cleaner way to remove it entirely (see the note after the snippet below for how to track down where it comes from).

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.0</version>
    <scope>provided</scope>
</dependency>
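To find out which dependency is actually dragging hadoop-core in, the standard Maven dependency report can filter on it (a stock Maven command, nothing project-specific):

mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-core

Once the offending dependency shows up in the tree, adding an <exclusions> entry to it would remove hadoop-core for good instead of relying on the provided-scope workaround.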

