GemFire: org.apache.geode.GemFireConfigException: SSL Configuration requires a valid distribution config

Problem Description

I am trying to connect to GemFire from an external system, but I get the following error. Where should I configure SSL?

        WARN LocatorHelper$: getAllGeodeServers error
                org.apache.geode.GemFireConfigException: SSL Configuration requires a valid distribution config.
                        at org.apache.geode.internal.net.SSLConfigurationFactory.getDistributionConfig(SSLConfigurationFactory.java:55)
                        at org.apache.geode.internal.net.SSLConfigurationFactory.createSSLConfigForComponent(SSLConfigurationFactory.java:89)
                        at org.apache.geode.internal.net.SSLConfigurationFactory.getSSLConfigForComponent(SSLConfigurationFactory.java:76)
                        at org.apache.geode.internal.net.SocketCreatorFactory.getSocketCreatorForComponent(SocketCreatorFactory.java:69)
                        at org.apache.geode.distributed.internal.tcpserver.TcpClient.<init>(TcpClient.java:74)
                        at org.apache.geode.spark.connector.internal.LocatorHelper$$anonfun$getAllGeodeServers$1.apply(LocatorHelper.scala:71)
                        at org.apache.geode.spark.connector.internal.LocatorHelper$$anonfun$getAllGeodeServers$1.apply(LocatorHelper.scala:67)
                        at scala.collection.IndexedSeqOptimized$$anonfun$1.apply(IndexedSeqOptimized.scala:50)
                        at scala.collection.IndexedSeqOptimized$$anonfun$1.apply(IndexedSeqOptimized.scala:50)
                        at scala.collection.IndexedSeqOptimized$class.segmentLength(IndexedSeqOptimized.scala:195)
                        at scala.collection.mutable.ArraySeq.segmentLength(ArraySeq.scala:46)
                        at scala.collection.GenSeqLike$class.prefixLength(GenSeqLike.scala:93)
                        at scala.collection.AbstractSeq.prefixLength(Seq.scala:41)
                        at scala.collection.IndexedSeqOptimized$class.find(IndexedSeqOptimized.scala:50)
                        at scala.collection.mutable.ArraySeq.find(ArraySeq.scala:46)
                        at org.apache.geode.spark.connector.internal.LocatorHelper$.getAllGeodeServers(LocatorHelper.scala:67)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnection.getClientCacheFactory(DefaultGeodeConnection.scala:71)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnection.initClientCache(DefaultGeodeConnection.scala:58)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnection.<init>(DefaultGeodeConnection.scala:50)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnectionFactory.newConnection(DefaultGeodeConnection.scala:175)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnectionManager$.getConnection(DefaultGeodeConnectionManager.scala:59)
                        at org.apache.geode.spark.connector.internal.DefaultGeodeConnectionManager.getConnection(DefaultGeodeConnectionManager.scala:29)
                        at org.apache.geode.spark.connector.GeodeConnectionConf.getConnection(GeodeConnectionConf.scala:39)
                        at org.apache.geode.spark.connector.internal.oql.QueryRDD.getPartitions(QueryRDD.scala:39)
                        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:253)
                        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:251)
                        at scala.Option.getOrElse(Option.scala:121)
                        at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)
                        at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1343)
                        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
                        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
                        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
                        at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
                        at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1378)
                        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
                        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
                        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
                        at org.apache.spark.rdd.RDD.first(RDD.scala:1377)
                        at org.apache.geode.spark.connector.internal.oql.SchemaBuilder.toSparkSchema(SchemaBuilder.scala:51)
                        at org.apache.geode.spark.connector.internal.oql.OQLRelation.schema(RDDConverter.scala:29)
                        at org.apache.spark.sql.execution.datasources.LogicalRelation$.apply(LogicalRelation.scala:71)
                        at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:432)
                        at org.apache.spark.sql.SQLContext.baseRelationToDataFrame(SQLContext.scala:295)
                        at org.apache.geode.spark.connector.GeodeSQLContextFunctions.geodeOQL(GeodeSQLContextFunctions.scala:40)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:28)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:33)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:35)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:37)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
                        at $line18.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
                        at $line18.$read$$iw$$iw$$iw$$iw.<init>(<console>:43)
                        at $line18.$read$$iw$$iw$$iw.<init>(<console>:45)
                        at $line18.$read$$iw$$iw.<init>(<console>:47)
                        at $line18.$read$$iw.<init>(<console>:49)
                        at $line18.$read.<init>(<console>:51)
                        at $line18.$read$.<init>(<console>:55)
                        at $line18.$read$.<clinit>(<console>)
                        at $line18.$eval$.$print$lzycompute(<console>:7)
                        at $line18.$eval$.$print(<console>:6)
                        at $line18.$eval.$print(<console>)
                        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                        at java.lang.reflect.Method.invoke(Method.java:498)
                        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
                        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
                        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
                        at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
                        at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
                        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
                        at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
                        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
                        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
                        at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:819)
                        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:691)
                        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
                        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:425)
                        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:244)
                        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:141)
                        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:141)
                        at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
                        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:141)
                        at org.apache.spark.repl.Main$.doMain(Main.scala:76)
                        at org.apache.spark.repl.Main$.main(Main.scala:56)
                        at org.apache.spark.repl.Main.main(Main.scala)
                        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                        at java.lang.reflect.Method.invoke(Method.java:498)
                        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
                        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:900)
                        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:192)
                        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:217)
                        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
                        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Solution

No verified fix for this problem has been found yet; the editor is still searching and compiling one. A hedged starting point is sketched below.
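
Although no confirmed fix has surfaced, the top frames of the trace narrow the problem down: SSLConfigurationFactory.getDistributionConfig throws because the geode-spark-connector builds a TcpClient (in LocatorHelper.getAllGeodeServers) before any distribution config, and therefore any SSL configuration, has been established on the client side. On a plain Geode/GemFire client, SSL is configured with the documented ssl-* properties when the ClientCache is created. A minimal sketch in Scala follows; the property names come from the Geode documentation, while the keystore paths, passwords, and locator address are placeholders:

        import org.apache.geode.cache.client.ClientCacheFactory

        // Minimal sketch: pass the documented Geode ssl-* properties when
        // creating the ClientCache. The paths, passwords, and locator
        // address below are placeholders.
        val cache = new ClientCacheFactory()
          .set("ssl-enabled-components", "all") // or a list such as "locator,server"
          .set("ssl-keystore", "/path/to/keystore.jks")
          .set("ssl-keystore-password", "changeit")
          .set("ssl-truststore", "/path/to/truststore.jks")
          .set("ssl-truststore-password", "changeit")
          .addPoolLocator("locator-host", 10334) // 10334 is the default locator port
          .create()

In this trace, however, the ClientCache is created inside the connector itself (DefaultGeodeConnection.initClientCache), so there is no ClientCacheFactory to call directly. Geode also reads any of these settings from "gemfire."-prefixed Java system properties, so one hedged workaround is to set them before the connector runs (for spark-shell, for example via the driver and executor extraJavaOptions). Whether this Geode version's SSLConfigurationFactory falls back to such properties is precisely what is in doubt here, so treat this as an experiment rather than a confirmed fix:

        // Hedged sketch: Geode picks up "gemfire."-prefixed system properties
        // when building its distribution config. Set them before the
        // geode-spark-connector creates its ClientCache. Values are placeholders.
        System.setProperty("gemfire.ssl-enabled-components", "all")
        System.setProperty("gemfire.ssl-keystore", "/path/to/keystore.jks")
        System.setProperty("gemfire.ssl-keystore-password", "changeit")
        System.setProperty("gemfire.ssl-truststore", "/path/to/truststore.jks")
        System.setProperty("gemfire.ssl-truststore-password", "changeit")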

If you have already found a good solution, you are welcome to send it, together with a link to this page, to the editor.

Editor's email: dio#foxmail.com (replace the # with @)
