io.netty conflicts during Spark development


Reference article:
https://blog.csdn.net/weixin_43777983/article/details/104558048

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.defaultUseCacheForAllThreads()Z
	at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:120)
	at org.apache.spark.network.server.TransportServer.init(TransportServer.java:98)
	at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:75)
	at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114)
	at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2269)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2261)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469)
	at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)

Cause of the exception: the netty version that Maven pulls in transitively is too old, or netty is missing from the classpath entirely.
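
To see which netty versions Maven is actually resolving, and which dependency drags in the stale one, you can filter the dependency tree to io.netty:

    mvn dependency:tree -Dincludes=io.netty

Any entry older than the version Spark's network layer was compiled against is a candidate for the exclusion or the version override below.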

1. In the dependency that drags in the conflicting netty, exclude it, as in the snippet below. (In my case this step turned out to be unnecessary: declaring a newer netty directly, as in step 2, was enough.)

<dependency>
    <groupId>oXXXXXX</groupId>
    <artifactId>sXXXXXX</artifactId>
    <version>XXXXX</version>
    <!-- exclude the bundled netty -->
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
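
To confirm at runtime which netty actually won on the classpath, one quick check is to print the jar that the allocator class was loaded from. A minimal sketch in Scala; the object name NettyJarCheck is mine:

    import io.netty.buffer.PooledByteBufAllocator

    // Hypothetical helper: prints the jar that PooledByteBufAllocator
    // was loaded from, so you can see whether the old or the new netty
    // ended up on the classpath.
    object NettyJarCheck {
      def main(args: Array[String]): Unit = {
        val location = classOf[PooledByteBufAllocator]
          .getProtectionDomain.getCodeSource.getLocation
        println(location)
      }
    }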

2. Add a newer netty version to the pom.

<!-- fixes the io.netty.buffer.PooledByteBufAllocator.defaultUseCacheForAllThreads()Z error -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.18.Final</version>
</dependency>
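
In a multi-module build, an alternative way to force a single netty version everywhere is a dependencyManagement entry in the parent pom. A sketch, assuming the same 4.1.18.Final version is appropriate for your Spark release:

    <dependencyManagement>
        <dependencies>
            <!-- pins io.netty:netty-all, including transitive uses, to this version -->
            <dependency>
                <groupId>io.netty</groupId>
                <artifactId>netty-all</artifactId>
                <version>4.1.18.Final</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

Note that this pins only io.netty:netty-all; if some library still pulls in the old 3.x-era io.netty:netty artifact, the exclusion from step 1 is still needed.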
