How to fix the "permission denied" (权限不够) error when Spark accesses Hive

spark | 2019-09-16 11:30:37

Spark throws the following exception when operating on Hive:

19/09/16 09:58:27 INFO metastore: Connected to metastore.
19/09/16 09:58:28 INFO SessionState: Created HDFS directory: /tmp/hive/root/9ad612d6-a6a1-4f80-be2c-fdc87e0a2a0c
19/09/16 09:58:28 INFO SessionState: Created local directory: /opt/hadoop/data/hive/iotmp/9ad612d6-a6a1-4f80-be2c-fdc87e0a2a0c
19/09/16 09:58:28 INFO SessionState: Created HDFS directory: /tmp/hive/root/9ad612d6-a6a1-4f80-be2c-fdc87e0a2a0c/_tmp_space.db

org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: 权限不够;
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:432)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:233)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
	at com.report.tool.datacenter.engine.spark.analysis.service.LoadDataService.jdbcPartition(LoadDataService.scala:102)
	at com.report.tool.datacenter.engine.spark.analysis.service.LoadDataService.loadMysqlData(LoadDataService.scala:55)
	at com.report.tool.datacenter.engine.spark.analysis.DataAnalysisMain$.main(DataAnalysisMain.scala:53)
	at com.report.tool.datacenter.engine.spark.analysis.DataAnalysisMain.main(DataAnalysisMain.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:721)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: 权限不够
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
	... 29 more
Caused by: java.lang.RuntimeException: java.io.IOException: 权限不够
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
	... 44 more
Caused by: java.io.IOException: 权限不够
	at java.io.UnixFileSystem.createFileExclusively(Native Method)
	at java.io.File.createTempFile(File.java:2024)
	at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
	... 44 more
19/09/16 09:58:28 ERROR ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: 权限不够;
org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: 权限不够;
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)

 

Judging from the exception, the failure happens while creating a temp file (at java.io.File.createTempFile(File.java:2024)), but the process does not have permission to write it (权限不够 is the JVM's localized "Permission denied"). How should this be solved, and what configuration needs to be set?
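Since the lowest frames are java.io.UnixFileSystem.createFileExclusively and File.createTempFile called from Hive's SessionState.start, the path that cannot be written is a local directory on the node running the driver / YARN ApplicationMaster (the Hive local scratch directory, hive.exec.local.scratchdir), not an HDFS path. A quick check on that node (a sketch; the path comes from the log above, and the user name "yarn" is only an assumption about which account the container runs as):

    # on the node where the ApplicationMaster / driver ran
    ls -ld /opt/hadoop/data/hive/iotmp
    # assumption: containers run as user "yarn"; substitute your actual container user
    sudo -u yarn touch /opt/hadoop/data/hive/iotmp/perm_test && echo writable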

 

  • admin:

     I have hive.exec.local.scratchdir configured to /opt/hadoop/data/hive/iotmp; granting permissions on that directory is enough:

    chmod -R 777 /opt/hadoop/data/hive/iotmp
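     chmod -R 777 works but leaves the directory world-writable. A tighter sketch (assuming the job's containers run as user yarn in group hadoop; substitute the actual account and group on your cluster) is to hand the directory to that user instead, and make sure it exists on every node that can host the driver / ApplicationMaster:

        # create the local scratch dir pointed to by hive.exec.local.scratchdir (hive-site.xml)
        mkdir -p /opt/hadoop/data/hive/iotmp
        # assumption: containers run as yarn:hadoop; adjust to your setup
        chown -R yarn:hadoop /opt/hadoop/data/hive/iotmp
        chmod -R 775 /opt/hadoop/data/hive/iotmp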
