Given a keytab file, a principal, and other details, is it possible to read files from a Kerberized HDFS into a non-Kerberized Spark cluster?

Problem Description

I need to read data from a Kerberized HDFS cluster via webHDFS from a non-Kerberized Spark cluster. I have access to the keytab file, the username/principal, and any other details needed to log in. I need to log in programmatically so that my Spark cluster can read files from the Kerberized HDFS.

It looks like I can log in with this code:

System.setProperty("java.security.krb5.kdc","<kdc>")
System.setProperty("java.security.krb5.realm","<realm>")
val conf = new Configuration()
conf.set("hadoop.security.authentication","kerberos")
UserGroupinformation.setConfiguration(conf)
UserGroupinformation.loginUserFromKeytab("<my-principal>","<keytab-file-path>/keytabFile.keytab")
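
As a sanity check, this is roughly how I would expect to verify the keytab login on its own, going through the plain Hadoop FileSystem API before involving Spark (the webhdfs URI and file path below are placeholders, and loginUserFromKeytabAndReturnUGI is simply the UGI-returning variant of the login call above):

import java.net.URI
import java.security.PrivilegedExceptionAction

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation

val verifyConf = new Configuration()
verifyConf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(verifyConf)

// Log in from the keytab and keep a handle on the resulting UGI.
val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
  "<my-principal>", "<keytab-file-path>/keytabFile.keytab")

// Run the webHDFS call as the Kerberos-authenticated user.
ugi.doAs(new PrivilegedExceptionAction[Unit] {
  override def run(): Unit = {
    val fs = FileSystem.get(
      new URI("webhdfs://<namenode-host>:<namenode-port>"), verifyConf)
    println(fs.getFileStatus(new Path("/user/hadoop/iris.csv")))
  }
})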

In the official webHDFS documentation (linked here), I see that I can configure the keytab file path and the principal in the Hadoop configuration like this:

sparkSession.sparkContext.hadoopConfiguration.set("dfs.web.authentication.kerberos.principal","<my-principal>")
sparkSession.sparkContext.hadoopConfiguration.set("dfs.web.authentication.kerberos.keytab","<keytab-file-path>/keytabFile.keytab")

After that, I should be able to read a file from the Kerberized HDFS cluster with the following:

val irisDFWebHDFS = sparkSession.read
  .format("csv")
  .option("header","true")
  .csv(s"webhdfs://<namenode-host>:<namenode-port>/user/hadoop/iris.csv")

But the read still fails and throws the following exception:

org.apache.hadoop.security.AccessControlException: Authentication required
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:490)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$300(WebHdfsFileSystem.java:135)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.connect(WebHdfsFileSystem.java:721)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:796)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:619)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:657)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:653)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:1741)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getDelegationToken(WebHdfsFileSystem.java:365)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getAuthParameters(WebHdfsFileSystem.java:585)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toUrl(WebHdfsFileSystem.java:608)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractFsPathRunner.getUrl(WebHdfsFileSystem.java:898)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:794)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:619)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:657)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:653)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHdfsFileStatus(WebHdfsFileSystem.java:1086)
  at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getFileStatus(WebHdfsFileSystem.java:1097)
  at org.apache.hadoop.fs.FileSystem.isDirectory(FileSystem.java:1707)
  at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:47)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:376)
  at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:326)
  at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:308)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:308)
  at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:796)
  at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:596)
  ... 44 elided

Any pointers on what I might be missing?

Solution

No effective solution has been found for this issue yet.
