
Cannot connect from Java to Spark running in Kubernetes

I installed Kubernetes (minikube for Windows 10) and added Spark using Helm:

.\helm.exe install --name spark-test stable/spark

Then I ran:

.\kubectl.exe expose deployment spark-test-master --port=7070 --name=spark-master-ext --type=NodePort

My UI runs at http://<MINIKUBE_IP>:31905/, for example, and the Spark master is exposed at <MINIKUBE_IP>:32473. To verify, I ran:

.\minikube-windows-amd64.exe service spark-master-ext

But when I do this in Java:

SparkConf conf = new SparkConf().setMaster("spark://192.168.1.168:32473").setAppName("Data Extractor");

I get:

18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Could not connect to 192.168.1.168:32473: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]
18/03/19 13:57:29 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@192.168.1.168:32473] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]] Caused by: [Connection refused: no further information: /192.168.1.168:32473]
18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Failed to connect to master 192.168.1.168:32473
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@192.168.1.168:32473/), Path(/user/Master)]
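Before involving Spark at all, a plain TCP probe can confirm whether anything is listening on the exposed address; "Connection refused" in the log above generally means no process answered on that host and port. A minimal sketch, independent of Spark (the host and port are the ones from the question; substitute your own):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class MasterProbe {
    // Attempts a raw TCP connection to host:port, returning true on success.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused / timed out: nothing reachable is listening.
            return false;
        }
    }

    public static void main(String[] args) {
        // Values taken from the question above; replace with your NodePort.
        System.out.println(canConnect("192.168.1.168", 32473, 2000));
    }
}
```

If this returns false, the problem is reachability of the NodePort itself rather than anything Spark-specific.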

Any ideas how to run a Java Spark job against Spark running on Minikube?


1 Answer

  1. # Answer 1

    It looks like the stable Spark Helm chart is really outdated (it ships Spark 1.5.1), so I installed Spark 2.3.0 locally instead, and it runs without any problems. Case closed, sorry :)
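    One way to keep the driver's Spark version aligned with whatever cluster it talks to is to pin it explicitly in the build. A hypothetical Maven fragment, assuming the 2.3.0 version mentioned above and the Scala 2.11 build of Spark:

    ```xml
    <!-- Hypothetical pom.xml fragment: pin the driver's Spark version
         to match the cluster (2.3.0, as in the answer above). -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.0</version>
    </dependency>
    ```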