Spark Streaming StreamingContext.start() - Error starting receiver 0

Updated: 2024-10-24 16:22:49
This article explains how to handle the Spark Streaming StreamingContext.start() error "Error starting receiver 0", and should be a useful reference for anyone hitting the same problem.

Problem Description

I have a project that uses Spark Streaming and I'm running it with 'spark-submit', but I'm hitting this error:

15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
    at org.apache.spark.Logging$class.log(Logging.scala:52)
    at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
    at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
    at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:54)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

This is the code the error is coming from; everything runs fine up until ssc.start():

val Array(zkQuorum, group, topics, numThreads) = args
val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(2))
ssc.checkpoint("checkpoint")
.
.
.
ssc.start()
ssc.awaitTermination()

I've run the SparkPi example using 'spark-submit' and it runs fine, so I can't figure out what's causing the problem in my application. Any help would be really appreciated.

Recommended Answer

From the documentation for java.lang.AbstractMethodError:

Normally, this error is caught by the compiler; this error can only occur at run time if the definition of some class has incompatibly changed since the currently executing method was last compiled.

This means there is a version incompatibility between your compile-time and runtime dependencies. Make sure you align those versions to resolve the issue.
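In an sbt build, aligning versions usually means pinning every Spark artifact (including the Kafka connector) to a single Spark version and marking the core artifacts as "provided" so the cluster's own runtime jars are used. The sketch below is a hypothetical build.sbt fragment; the version number and artifact list are illustrative assumptions, not taken from the question:

```scala
// build.sbt -- hypothetical sketch: one shared Spark version so the
// compiled bytecode matches the jars present on the cluster at runtime.
val sparkVersion = "1.2.0"  // assumed; must match the cluster's Spark version

libraryDependencies ++= Seq(
  // "provided": supplied by the cluster at runtime, not bundled in the app jar
  "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
  // the Kafka receiver must be built against the same Spark version
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
)
```

Compiling against one Spark release while the executors run another can change trait method implementations (such as those in org.apache.spark.Logging), which is exactly the kind of mismatch that surfaces as an AbstractMethodError in the stack trace above.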


Published: 2023-11-26 00:46:10
Original link: https://www.elefans.com/category/jswz/34/1631986.html