Problem description
I am trying to read data stored in Solr through Spark, but I cannot get it to load, as shown below. I passed the necessary drivers in the Livy configuration and have tried other driver versions, to no avail. I am running Spark 2.3.1 and Solr 7.4.0.
I have also tried removing solr-solrj.jar, since spark-solr already bundles it, but that did not help either.
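For reference, drivers are normally handed to a Livy session through the `jars` field (or `spark.jars.packages`) of the session-creation request. A minimal sketch of such a payload follows; the jar path and Maven coordinates are illustrative assumptions, not the configuration actually used in the question:

```python
import json

# Hypothetical session-creation body for Livy's REST API (POST /sessions).
# The HDFS jar path below is an example, not taken from the question.
session_request = {
    "kind": "pyspark",
    "jars": [
        "hdfs:///jars/spark-solr-3.6.0-shaded.jar",
    ],
    "conf": {
        # Alternative: let Spark resolve the connector from Maven instead.
        "spark.jars.packages": "com.lucidworks.spark:spark-solr:3.6.0",
    },
}

payload = json.dumps(session_request)
print(payload)
```

Either mechanism should place the connector and its SolrJ dependency on both the driver and executor classpaths.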
Pyspark code:
df = spark.read.format('solr').option("zkhost","zk-host:2181").option('collection','collection_name').load()
Error:
An error was encountered:
An error occurred while calling o135.load.
: java.lang.NoClassDefFoundError: org/apache/solr/client/solrj/io/stream/expr/StreamExpressionParameter
at solr.DefaultSource.createRelation(DefaultSource.scala:12)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:341)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParameter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
Traceback (most recent call last):
File "/dados01/yarn/local/usercache/livy/appcache/application_1592863719820_6421/container_e574_1592863719820_6421_01_000001/pyspark.zip/pyspark/sql/readwriter.py", line 172, in load
return self._df(self._jreader.load())
File "/dados01/yarn/local/usercache/livy/appcache/application_1592863719820_6421/container_e574_1592863719820_6421_01_000001/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/dados01/yarn/local/usercache/livy/appcache/application_1592863719820_6421/container_e574_1592863719820_6421_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/dados01/yarn/local/usercache/livy/appcache/application_1592863719820_6421/container_e574_1592863719820_6421_01_000001/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o135.load.
: java.lang.NoClassDefFoundError: org/apache/solr/client/solrj/io/stream/expr/StreamExpressionParameter
at solr.DefaultSource.createRelation(DefaultSource.scala:12)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:341)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParameter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
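A `NoClassDefFoundError` like the one above usually means the jar that was loaded does not actually contain the class the JVM needs, either because it is the wrong build or because it is corrupt. Since a jar is a plain zip archive, one way to check a local copy is to look for the missing class among its entries. This is a diagnostic sketch; the jar path passed in would be whatever copy you pull from the cluster:

```python
import zipfile

def jar_contains_class(jar_path, class_name):
    """Return True if the jar (a zip archive) contains the given class.

    class_name uses dot notation, e.g.
    'org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParameter'.
    """
    # Class files are stored inside the jar with '/' separators.
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

If this returns False for the SolrJ class above, the jar on the classpath is not a shaded spark-solr build (or is damaged), which matches the symptom in the stack trace.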
Solution
The problem was solved as follows:
I went through every server running Spark and HDFS, looking for the solr jar (spark-solr-3.6.0.jar) on Spark's system path. The jar was sitting in one of Yarn's caches (/data/yarn/local/filecache or /data/yarn/local/usercache), so I deleted it from every cache location so that it no longer appeared on Spark's classpath. After that, I simply imported the jars Livy needs, and it worked again. The jar appears to have been corrupted: Spark loaded it, but its classes did not work.
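The cleanup step above can be sketched as a small script that walks the Yarn cache roots and lists any spark-solr jars it finds. The cache paths are the ones mentioned in the answer and will differ per cluster; deletion is left as a dry run since these directories are managed by Yarn:

```python
import fnmatch
import os

# Yarn cache roots from the answer above; adjust for your cluster layout.
CACHE_ROOTS = ["/data/yarn/local/filecache", "/data/yarn/local/usercache"]

def find_stale_jars(roots, pattern="spark-solr-*.jar"):
    """Walk the given cache directories and collect jars matching pattern."""
    hits = []
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in fnmatch.filter(filenames, pattern):
                hits.append(os.path.join(dirpath, name))
    return hits

if __name__ == "__main__":
    for jar in find_stale_jars(CACHE_ROOTS):
        # Dry run: print candidates instead of calling os.remove(jar).
        print("stale cached jar:", jar)
```

Run this on each node that hosts a Yarn NodeManager; once the stale copies are gone, resubmitting the Livy session with a known-good jar should pick up the fresh artifact.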