Unable to read Spark streaming data


I'm trying to read streaming data with Spark's Python API and then change the format of the data in the stream, but I can't seem to even read the stream in the first place...

Here are my steps:

  1. I open a terminal, cd into the folder that contains the part-* files, and run:

    ls part-* | xargs -I % sh -c '{ cat %; sleep 5;}' | nc -lk 9999
    
  2. Then I open another terminal and run setenv SPARK_HOME /user/abc/Downloads/spark-1.5.2-bin-hadoop2.6/ so that I can run Spark locally. After that I launch my code with ${SPARK_HOME}/bin/spark-submit --master local /user/abc/test.py localhost 9999. (A Spark-free way to check the socket is sketched right after these steps.)
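As a sanity check (hypothetical, not something I ran as part of the steps above), a plain Python client can confirm that the nc command from step 1 is actually serving data on port 9999:

    # Hypothetical check: connect to the nc server on port 9999 and print
    # whatever it sends, to confirm data is flowing before involving Spark.
    import socket

    sock = socket.create_connection(("localhost", 9999))
    try:
        data = sock.recv(4096)
        print(data.decode("utf-8", "replace"))
    finally:
        sock.close()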

Below is my code; I'm just testing whether I can read the streaming data at all before changing its format. But it always shows this error:

    16/01/28 22:41:37 INFO ReceiverSupervisorImpl: Starting receiver
    16/01/28 22:41:37 INFO ReceiverSupervisorImpl: Called receiver onStart
    16/01/28 22:41:37 INFO ReceiverSupervisorImpl: Receiver started again
    16/01/28 22:41:37 INFO SocketReceiver: Connecting to localhost:9999
    16/01/28 22:41:37 INFO SocketReceiver: Connected to localhost:9999
    16/01/28 22:41:37 INFO SocketReceiver: Closed socket to localhost:9999
    16/01/28 22:41:37 WARN ReceiverSupervisorImpl: Restarting receiver with delay 2000 ms: Socket data stream had no more data

If I rerun ls part-* | xargs -I % sh -c '{ cat %; sleep 5;}' | nc -lk 9999, it still shows the same error... Any idea how to fix this?

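The original test.py listing did not survive on this page. As a stand-in, here is a minimal sketch of what such a script could look like against the Spark 1.5 streaming API; the app name, the 5-second batch interval, and the comma-split "format change" are all assumptions, not the original code:

    import sys

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    if __name__ == "__main__":
        host, port = sys.argv[1], int(sys.argv[2])

        # appName only; the master is supplied by spark-submit. Note that a
        # receiver-based stream needs at least two cores, so --master local[2]
        # rather than --master local is usually required.
        sc = SparkContext(appName="StreamFormatTest")
        ssc = StreamingContext(sc, 5)  # assumed 5-second batch interval

        lines = ssc.socketTextStream(host, port)
        # Assumed placeholder "format change": split each comma-separated line.
        formatted = lines.map(lambda line: line.split(","))
        formatted.pprint()

        ssc.start()
        ssc.awaitTermination()

It would be launched exactly as in step 2 above, with the host and port passed as command-line arguments.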

Tags: to, data, info, localhost, terminal, sh, ls, cat