
Java NoSuchMethodError when running the Spark Streaming Kafka example

I am using spark-2.4.4-bin-without-hadoop, and I want to test the self-contained example JavaDirectKafkaWordCount.

The official documentation mentions that the application should include this dependency, so I downloaded spark-streaming-kafka-0-10_2.12-2.4.0.jar into the jars directory.

However, when I run run-example streaming.JavaDirectKafkaWordCount device1:9092 group_id topic, it throws a NoSuchMethodError:

20/01/13 11:51:12 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1549bba7{/metrics/json,null,AVAILABLE,@Spark}
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
        at org.apache.spark.streaming.kafka010.PreferConsistent$.<init>(LocationStrategy.scala:42)
        at org.apache.spark.streaming.kafka010.PreferConsistent$.<clinit>(LocationStrategy.scala)
        at org.apache.spark.streaming.kafka010.LocationStrategies$.PreferConsistent(LocationStrategy.scala:66)
        at org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent(LocationStrategy.scala)
        at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:84)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/01/13 11:51:12 INFO spark.SparkContext: Invoking stop() from shutdown hook

1 Answer

  1. # Answer #1

    According to the documentation: You have to compile your streaming application into a JAR. If you are launching the application with spark-submit, then you will not need to provide Spark and Spark Streaming in the JAR. However, if your application uses advanced sources (e.g. Kafka, Flume), then you will have to package the extra artifact they link to, along with their dependencies, in the JAR that is used to deploy the application. For example, an application using KafkaUtils will have to include spark-streaming-kafka-0-10_2.12 and all its transitive dependencies in the application JAR.
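
    As a sketch of that packaging approach, assuming a Maven build (the Scala suffix of the artifact must match the Scala version your Spark distribution was built against; `_2.11` vs `_2.12` is exactly the kind of mismatch that produces the `scala.Product.$init$` NoSuchMethodError above):

    ```xml
    <!-- pom.xml fragment (illustrative): declare the Kafka connector as a
         regular compile-scope dependency so a shade/assembly plugin can
         bundle it and its transitive dependencies into the application JAR.
         The _2.12 suffix here mirrors the jar from the question; switch to
         _2.11 if your Spark build uses Scala 2.11. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
      <version>2.4.0</version>
    </dependency>
    ```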

    Alternatively, you can pass `--packages org.apache.spark:spark-streaming-kafka-0-10_2.12:2.4.0` to the spark-submit command.
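
    A minimal sketch of that `--packages` invocation (paths and the Scala suffix are assumptions: check the Scala version of your Spark build first, since `scala.Product.$init$` NoSuchMethodError is the typical symptom of running a `_2.12` artifact on a Scala 2.11 Spark runtime, and `device1:9092 group_id topic` are the placeholders from the question):

    ```shell
    # Print the Scala version this Spark build was compiled against
    # (shown in the version banner, e.g. "Using Scala version 2.11.12"):
    ./bin/spark-submit --version

    # Let spark-submit resolve the Kafka connector and its transitive
    # dependencies from Maven Central; pick the suffix/version matching
    # the build reported above (here assumed to be Scala 2.11, Spark 2.4.4):
    ./bin/spark-submit \
      --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.4.4 \
      --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
      examples/jars/spark-examples_2.11-2.4.4.jar \
      device1:9092 group_id topic
    ```

    This avoids copying jars into the `jars` directory by hand, which is error-prone precisely because it skips the Scala-version check that the build tool would otherwise enforce.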