Maven Fat Jar Does Not Execute

Problem Description

My Maven POM is as follows. I am using a Flink Kafka consumer to read Kafka messages, and it works fine when run standalone:

<dependencies>
       <dependency>
           <groupId>com.datastax.oss</groupId>
           <artifactId>java-driver-core</artifactId>
           <version>4.8.0</version>
       </dependency>
       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-connector-cassandra_2.11</artifactId>
           <version>1.11.0</version>
       </dependency>
       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-connector-kafka_2.11</artifactId>
           <version>1.11.0</version>
       </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-java_2.12</artifactId>
           <version>1.11.0</version>
       </dependency>

       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-streaming-scala_2.12</artifactId>
           <version>1.11.0</version>
       </dependency>
       <dependency>
           <groupId>org.apache.flink</groupId>
           <artifactId>flink-clients_2.12</artifactId>
           <version>1.11.0</version>
       </dependency>
      .
      .
      .
   </dependencies>
   <build>
       <plugins>
           <plugin>
               <artifactId>maven-assembly-plugin</artifactId>
               <version>3.3.0</version>
               <configuration>
                   <descriptorRefs>
                       <descriptorRef>jar-with-dependencies</descriptorRef>
                   </descriptorRefs>
                   <finalName>streaming-job-1.0-RELEASE</finalName>
                   <appendAssemblyId>false</appendAssemblyId>
                   <archive>
                       <manifest>
                           <mainClass>com.sample.SampleJob</mainClass>
                       </manifest>
                   </archive>
               </configuration>
               <executions>
                   <execution>
                       <id>make-assembly</id>
                       <phase>package</phase>
                       <goals>
                           <goal>single</goal>
                       </goals>
                   </execution>
               </executions>
           </plugin>
       </plugins>
   </build>
</project>
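One thing worth noting in the POM above (not necessarily the cause of the exception, but a known source of trouble): the connector artifacts use the `_2.11` Scala suffix while the streaming/clients artifacts use `_2.12`. All Flink artifacts on the classpath should share a single Scala binary version. A consistent variant might look like this (assuming the `_2.12` builds of these connectors are available for 1.11.0, which they are on Maven Central):

```xml
<!-- Align the connectors with the _2.12 artifacts used elsewhere in the POM -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.12</artifactId>
    <version>1.11.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-cassandra_2.12</artifactId>
    <version>1.11.0</version>
</dependency>
```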

And here is a snippet of the Java Flink Kafka consumer code:

final FlinkKafkaConsumer<String> flinkConsumer =
        new FlinkKafkaConsumer<>(kafkaTopicName, new SimpleStringSchema(), prop);
flinkConsumer.setStartFromEarliest();

final DataStream<String> stream = env.addSource(flinkConsumer);
DataStream<Person> sensorStreaming = stream.flatMap(new FlatMapFunction<String, Person>() {
    @Override
    public void flatMap(String value, Collector<Person> out) throws Exception {
        // mapping from the raw message to Person omitted in the question
    }
});
System.out.println("env >>>= " + env);
env.execute();

The code works correctly when run standalone, but after building the fat jar it fails.

The stacktrace is:

env >>>= org.apache.flink.streaming.api.environment.LocalStreamEnvironment@533b266e
Exception in thread "main" java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
    at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1803)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1713)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1699)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1681)
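The stacktrace points at `DefaultExecutorServiceLoader`, which uses Java's `ServiceLoader` mechanism to discover a `PipelineExecutorFactory` through provider files under `META-INF/services`. If no provider entry is visible on the classpath, the lookup comes back empty and Flink throws exactly this `IllegalStateException`. The toy demo below (a hypothetical stub interface, not Flink's real class) reproduces that empty-lookup state:

```java
import java.util.ServiceLoader;

// Hypothetical stand-in for Flink's PipelineExecutorFactory SPI interface.
interface PipelineExecutorFactoryStub {
}

public class ServiceLoaderDemo {
    // ServiceLoader scans META-INF/services/<interface FQN> files on the
    // classpath and counts the providers registered there.
    static int countProviders() {
        int n = 0;
        for (PipelineExecutorFactoryStub f
                : ServiceLoader.load(PipelineExecutorFactoryStub.class)) {
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        // No provider file exists here, so the count is 0 -- the same state
        // Flink sees inside a fat jar whose META-INF/services entries were
        // lost during merging, producing "No ExecutorFactory found".
        System.out.println("providers found: " + countProviders());
    }
}
```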

I can see that all the dependencies are present in the fat jar.


Solution

No confirmed fix for this problem has been collected yet.
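That said, a frequently reported cause of this exact exception when packaging Flink 1.11 jobs with `jar-with-dependencies` is that the assembly plugin lets `META-INF/services` files from different jars overwrite one another, so the `PipelineExecutorFactory` provider entry that `DefaultExecutorServiceLoader` looks up is lost. A common workaround is to build the fat jar with `maven-shade-plugin` and its `ServicesResourceTransformer`, which concatenates service files instead of overwriting them. A sketch, not verified against this exact project (the main class is taken from the question; the plugin version is an assumption):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Merge META-INF/services files instead of overwriting,
                         so Flink's ExecutorFactory entries survive. -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.sample.SampleJob</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```

The same consideration applies to any SPI-based library bundled into a single jar: whenever two dependencies ship a service file with the same name, a plain copy-over merge keeps only one of them.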
