Hive connection fails after changing the HDFS port

2019-09-13 10:09:26

I changed the HDFS port: fs.defaultFS was hdfs://master:8020 and I changed it to hdfs://master:9000.
After that, submitting a Spark program fails with:

org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.net.ConnectException: Call From slave203/10.10.22.203 to master123:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused);

Running hive directly also fails:

#hive
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.

Logging initialized using configuration in file:/etc/hive/2.6.5.0-292/0/hive-log4j.properties
hive> show tables;
FAILED: SemanticException MetaException(message:java.net.ConnectException: Call From slave203/10.10.22.203 to master123:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused)

I'm fairly sure the problem is that Hive can't connect, but I never configured HDFS anywhere in Hive. How do I fix this?
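For context: Hive does depend on the HDFS address even though you never set it in Hive itself. The Hive CLI and metastore read core-site.xml from the Hadoop config on their classpath, and, more importantly, the metastore stores absolute hdfs:// URIs (including the old port 8020) in every database and table LOCATION. Changing fs.defaultFS does not rewrite those stored URIs. One way to inspect and migrate them is Hive's built-in metatool; this is a sketch, assuming your Hive version ships the metatool service and that master:8020 / master:9000 are the old and new addresses from this post:

```shell
# Show the filesystem root(s) currently recorded in the metastore
hive --service metatool -listFSRoot

# Dry run: report which metastore records would be rewritten, without changing anything
hive --service metatool -updateLocation hdfs://master:9000 hdfs://master:8020 -dryRun

# Actually rewrite all stored locations from the old NameNode URI to the new one
hive --service metatool -updateLocation hdfs://master:9000 hdfs://master:8020
```

After updating, restart the metastore and HiveServer2 so any cached configuration is reloaded.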

    
  • admin

    I'm on HDP. I restarted Hive one more time and it simply started working again. That was it: a restart fixed it.
