
How do I set the checkpoint interval for Spark Streaming checkpointing?

I want to set the checkpoint interval for my Python Spark Streaming script, following the official documentation:

For stateful transformations that require RDD checkpointing, the default interval is a multiple of the batch interval that is at least 10 seconds. It can be set by using dstream.checkpoint(checkpointInterval). Typically, a checkpoint interval of 5 - 10 times of sliding interval of a DStream is good setting to try.

My script:

import sys

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

def functionToCreateContext():
    sc = SparkContext(appName="PythonStreamingDirectKafkaWordCount")
    ssc = StreamingContext(sc, 6)
    ssc.checkpoint("./checkpoint")
    kvs = KafkaUtils.createDirectStream(ssc, ['test123'], {"metadata.broker.list": "localhost:9092"})

    kvs = kvs.checkpoint(60)  # set the checkpoint interval (in seconds)

    lines = kvs.map(lambda x: x[1])
    counts = lines.flatMap(lambda line: line.split(" ")) \
        .map(lambda word: (word, 1)) \
        .reduceByKey(lambda a, b: a+b)
    counts.pprint()
    return ssc

if __name__ == "__main__":
    ssc = StreamingContext.getOrCreate("./checkpoint", functionToCreateContext)

    ssc.start()
    ssc.awaitTermination()

Output after running the script:

16/05/25 17:49:03 INFO DirectKafkaInputDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Remember duration = 120000 ms
16/05/25 17:49:03 INFO DirectKafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.DirectKafkaInputDStream@1be80174
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, true, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = 60000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 120000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@69f9a089
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@d97386a
16/05/25 17:49:03 INFO PythonTransformedDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO PythonTransformedDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO PythonTransformedDStream: Remember duration = 6000 ms
16/05/25 17:49:03 INFO PythonTransformedDStream: Initialized and validated org.apache.spark.streaming.api.python.PythonTransformedDStream@16c474ad
16/05/25 17:49:03 INFO ForEachDStream: Slide time = 6000 ms
16/05/25 17:49:03 INFO ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/25 17:49:03 INFO ForEachDStream: Checkpoint interval = null
16/05/25 17:49:03 INFO ForEachDStream: Remember duration = 6000 ms
..........

The checkpoint interval of the DStream is still null. Any ideas?


1 Answer

  1. # Answer 1

    Try moving this line down a few lines, to after the stream has been created: ssc.checkpoint("./checkpoint")

    Basically, perform the checkpointing only after the stream has been fully set up.
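    Applying that suggestion, the asker's `functionToCreateContext` would look roughly like this. This is an untested sketch, not a verified fix: it keeps the asker's Spark 1.x `pyspark.streaming.kafka` API and still assumes a Kafka broker at `localhost:9092` with topic `test123`; the only change is that `ssc.checkpoint(...)` is moved to after the DStream pipeline is wired up.

    ```python
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils  # Spark 1.x API, as in the question

    def functionToCreateContext():
        sc = SparkContext(appName="PythonStreamingDirectKafkaWordCount")
        ssc = StreamingContext(sc, 6)  # 6-second batch interval

        # Build the full stream pipeline first
        kvs = KafkaUtils.createDirectStream(
            ssc, ['test123'], {"metadata.broker.list": "localhost:9092"})
        kvs.checkpoint(60)  # per-DStream checkpoint interval, in seconds

        lines = kvs.map(lambda x: x[1])
        counts = lines.flatMap(lambda line: line.split(" ")) \
            .map(lambda word: (word, 1)) \
            .reduceByKey(lambda a, b: a + b)
        counts.pprint()

        # Set the checkpoint directory only after the stream is fully prepared
        ssc.checkpoint("./checkpoint")
        return ssc
    ```
    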