
Scala Spark: java.lang.IllegalStateException: Cannot find any build directories

I configured a master and a worker on the same machine. I connected the worker to the master, and the worker correctly shows up as registered. However, whenever I try to use the cluster, both the master and the worker throw errors.

Master log:

C:\Users\Niels>"C:\Program Files\Java\jre1.8.0_271\bin\java" -cp "C:\spark/conf\;C:\spark\jars\*" -Xmx1g org.apache.spark.deploy.master.Master --host 192.168.0.184 --port 7077 --webui-port 8080
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/11/01 02:48:47 INFO Master: Started daemon with process name: 17380@XPS13-Niels
20/11/01 02:48:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/11/01 02:48:47 INFO SecurityManager: Changing view acls to: Niels
20/11/01 02:48:47 INFO SecurityManager: Changing modify acls to: Niels
20/11/01 02:48:47 INFO SecurityManager: Changing view acls groups to:
20/11/01 02:48:47 INFO SecurityManager: Changing modify acls groups to:
20/11/01 02:48:47 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Niels); groups with view permissions: Set(); users  with modify permissions: Set(Niels); groups with modify permissions: Set()
20/11/01 02:48:50 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
20/11/01 02:48:50 INFO Master: Starting Spark master at spark://192.168.0.184:7077
20/11/01 02:48:50 INFO Master: Running Spark version 3.0.1
20/11/01 02:48:50 INFO Utils: Successfully started service 'MasterUI' on port 8080.
20/11/01 02:48:50 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.0.184:8080
20/11/01 02:48:50 INFO Master: I have been elected leader! New state: ALIVE
20/11/01 02:49:37 INFO Master: Registering worker 192.168.0.184:50217 with 8 cores, 14.8 GiB RAM
20/11/01 02:50:31 INFO Master: Registering app Spark shell
20/11/01 02:50:31 INFO Master: Registered app Spark shell with ID app-20201101025031-0000
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/0 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/0 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/1 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/1 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/2 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/2 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/3 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/3 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/4 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/4 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/5 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/5 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/6 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/6 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/7 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/7 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/8 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/8 because it is FAILED
20/11/01 02:50:31 INFO Master: Launching executor app-20201101025031-0000/9 on worker worker-20201101024937-192.168.0.184-50217
20/11/01 02:50:31 INFO Master: Removing executor app-20201101025031-0000/9 because it is FAILED
20/11/01 02:50:31 ERROR Master: Application Spark shell with ID app-20201101025031-0000 failed 10 times; removing it
20/11/01 02:50:31 INFO Master: Removing app app-20201101025031-0000
20/11/01 02:50:31 INFO Master: Received unregister request from application app-20201101025031-0000
20/11/01 02:50:31 INFO Master: 192.168.0.177:61106 got disassociated, removing it.
20/11/01 02:50:31 INFO Master: GARP-W10-SC.Garmat.local:61097 got disassociated, removing it.

Worker log:

C:\Users\Niels>"C:\Program Files\Java\jre1.8.0_271\bin\java" -cp "C:\spark/conf\;C:\spark\jars\*" -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://192.168.0.184:7077
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/11/01 02:49:35 INFO Worker: Started daemon with process name: 12840@XPS13-Niels
20/11/01 02:49:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/11/01 02:49:36 INFO SecurityManager: Changing view acls to: Niels
20/11/01 02:49:36 INFO SecurityManager: Changing modify acls to: Niels
20/11/01 02:49:36 INFO SecurityManager: Changing view acls groups to:
20/11/01 02:49:36 INFO SecurityManager: Changing modify acls groups to:
20/11/01 02:49:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Niels); groups with view permissions: Set(); users  with modify permissions: Set(Niels); groups with modify permissions: Set()
20/11/01 02:49:37 INFO Utils: Successfully started service 'sparkWorker' on port 50217.
20/11/01 02:49:37 INFO Worker: Starting Spark worker 192.168.0.184:50217 with 8 cores, 14.8 GiB RAM
20/11/01 02:49:37 INFO Worker: Running Spark version 3.0.1
20/11/01 02:49:37 INFO Worker: Spark home: C:\spark
20/11/01 02:49:37 INFO ResourceUtils: ==============================================================
20/11/01 02:49:37 INFO ResourceUtils: Resources for spark.worker:

20/11/01 02:49:37 INFO ResourceUtils: ==============================================================
20/11/01 02:49:37 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
20/11/01 02:49:37 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://192.168.0.184:8081
20/11/01 02:49:37 INFO Worker: Connecting to master 192.168.0.184:7077...
20/11/01 02:49:37 INFO TransportClientFactory: Successfully created connection to /192.168.0.184:7077 after 36 ms (0 ms spent in bootstraps)
20/11/01 02:49:37 INFO Worker: Successfully registered with master spark://192.168.0.184:7077
20/11/01 02:50:31 INFO Worker: Asked to launch executor app-20201101025031-0000/0 for Spark shell
20/11/01 02:50:31 INFO SecurityManager: Changing view acls to: Niels
20/11/01 02:50:31 INFO SecurityManager: Changing modify acls to: Niels
20/11/01 02:50:31 INFO SecurityManager: Changing view acls groups to:
20/11/01 02:50:31 INFO SecurityManager: Changing modify acls groups to:
20/11/01 02:50:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Niels); groups with view permissions: Set(); users  with modify permissions: Set(Niels); groups with modify permissions: Set()
20/11/01 02:50:31 ERROR ExecutorRunner: Error running executor
java.lang.IllegalStateException: Cannot find any build directories.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
        at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:250)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:200)
        at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:121)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
        at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:45)
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:63)
        at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:51)
        at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:158)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:79)
20/11/01 02:50:31 INFO Worker: Executor app-20201101025031-0000/0 finished with state FAILED message java.lang.IllegalStateException: Cannot find any build directories.
20/11/01 02:50:31 INFO ExternalShuffleBlockResolver: Clean up non-shuffle and non-RDD files associated with the finished executor 0
20/11/01 02:50:31 INFO ExternalShuffleBlockResolver: Executor is not registered (appId=app-20201101025031-0000, execId=0)
20/11/01 02:50:31 INFO Worker: Asked to launch executor app-20201101025031-0000/1 for Spark shell
20/11/01 02:50:31 INFO SecurityManager: Changing view acls to: Niels
20/11/01 02:50:31 INFO SecurityManager: Changing modify acls to: Niels
20/11/01 02:50:31 INFO SecurityManager: Changing view acls groups to:
20/11/01 02:50:31 INFO SecurityManager: Changing modify acls groups to:
20/11/01 02:50:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Niels); groups with view permissions: Set(); users  with modify permissions: Set(Niels); groups with modify permissions: Set()

I have already reinstalled Hadoop, Spark, and Scala, and I have added all the necessary environment variables.
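As a quick sanity check (assuming the conventional variable names; only `SPARK_HOME` is confirmed by the worker log above), the variables Spark's launcher relies on can be inspected in the same Command Prompt that starts the master and worker:

```shell
:: Print the environment variables Spark's launcher scripts typically read.
:: An unexpanded name (e.g. literal "%SCALA_HOME%") means the variable is
:: not set in this shell.
echo %JAVA_HOME%
echo %SPARK_HOME%
echo %SCALA_HOME%
echo %HADOOP_HOME%
```

Note that variables set via the System Properties dialog or `setx` are only picked up by shells opened afterwards, so a terminal left open from before the change will still show the old values.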

Does anyone know what is causing this problem?


1 Answer

  1. # Answer 1

    Maybe you should download and install Scala from http://www.scala-lang.org/download/, set SCALA_HOME with `export SCALA_HOME=${your scala path}`, and configure PATH with `export PATH=${SCALA_HOME}/bin:$PATH`.
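Since the logs show a Windows setup (`C:\spark`, `jre1.8.0_271`), the `export` lines above have a `cmd` equivalent; a sketch, assuming Scala was unpacked to `C:\scala` (a hypothetical path, adjust to your install location):

```shell
:: Windows (cmd) equivalent of the export commands in the answer.
:: "C:\scala" is a hypothetical install path -- change it to wherever
:: Scala was actually unpacked.
setx SCALA_HOME "C:\scala"
setx PATH "%PATH%;C:\scala\bin"
:: setx only affects NEW shells: close and reopen the terminal before
:: restarting the Spark master and worker so they see the new values.
```

For the current session only, `set SCALA_HOME=C:\scala` would take effect immediately but would not persist after the window is closed.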