Problem description
Spark version 3.0.2
Submitting Spark to YARN fails with a FileNotFoundException (Permission denied). Does anyone know of a breaking change in this version that could cause this?
The problem only appears with Spark 3.0.2 or newer. It does not occur with 3.0.0 or 3.0.1, and older versions also work fine.
Exception in thread "main" java.io.IOException: Configuration problem with provider path.
at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2364)
at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2283)
at org.apache.spark.SSLOptions$.$anonfun$parse$8(SSLOptions.scala:188)
at scala.Option.orElse(Option.scala:447)
at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:98)
at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:78)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:861)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.io.FileNotFoundException: /usr/hdp/current/hive-client/conf/hive-site.jceks (Permission denied)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileInputStream.<init>(RawLocalFileSystem.java:110)
at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:212)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:147)
at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:347)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:899)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.getInputStreamForFile(JavaKeyStoreProvider.java:70)
at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:321)
at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.<init>(AbstractJavaKeyStoreProvider.java:86)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:49)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:41)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider$Factory.createProvider(JavaKeyStoreProvider.java:100)
at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73)
at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2345)
... 8 more
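The trace shows the YARN ApplicationMaster's SecurityManager calling Hadoop's Configuration.getPassword while parsing SSL options, which opens every keystore listed in hadoop.security.credential.provider.path; here that list (picked up from the Hive client configuration) points at a hive-site.jceks file the submitting user cannot read. A minimal diagnostic sketch follows. It assumes the keystore path from the stack trace and relies on the fact that spark.hadoop.* properties are copied into the job's Hadoop Configuration; this is a candidate workaround to investigate, not a confirmed fix for the 3.0.2 regression, and your-app.jar is a placeholder:

```shell
# Check whether the submitting user can actually read the keystore
# named in the exception (path taken from the stack trace above).
ls -l /usr/hdp/current/hive-client/conf/hive-site.jceks

# If the job does not need credentials from that jceks file, one thing
# to try is blanking the credential provider path for this application
# only. spark.hadoop.* keys are passed through into the Hadoop
# Configuration that Configuration.getPassword consults.
spark-submit \
  --master yarn \
  --conf spark.hadoop.hadoop.security.credential.provider.path= \
  your-app.jar
```

If the ls check shows the file is owned by another user (e.g. hive) with no world-read permission, that matches the "Permission denied" cause in the trace.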
Workaround
No effective fix for this problem has been found yet; this post will be updated if one turns up.