Strange error when initializing SparkContext in Python

Published 2024-05-02 17:23:10


I have been using Spark 2.0.1, but tried to upgrade to a newer version, 2.1.1, by downloading the tar file locally and changing the paths.

However, now when I try to run any program, it fails at SparkContext initialization, i.e.

    sc = SparkContext()

The full sample code I was trying to run is not preserved in the original post; the failure happens on the SparkContext() line above.

The exception I get comes right at the start:


    Traceback (most recent call last):
      File "/home/vna/scripts/global_score_pipeline/test_code_here.py", line 47, in 
        sc = SparkContext()
      File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 118, in __init__
        conf, jsc, profiler_cls)
      File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 182, in _do_init
        self._jsc = jsc or self._initialize_context(self._conf._jconf)
      File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/pyspark/context.py", line 249, in _initialize_context
        return self._jvm.JavaSparkContext(jconf)
      File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
      File "/opt/apps/spark-2.1.1-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.lang.NumberFormatException: For input string: "Ubuntu"
        at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)

I am not passing "Ubuntu" anywhere in my variables or environment variables.

I have also tried changing it to sc = SparkContext(master='local'), but the problem is the same.
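
A quick way to see which configuration the new installation actually picks up is to dump the loaded properties (a minimal diagnostic sketch, not from the original post; it assumes PySpark is started through its default spark-submit gateway, so conf/spark-defaults.conf is applied):

    from pyspark import SparkConf

    # SparkConf() reads the spark.* system properties that spark-submit
    # sets from conf/spark-defaults.conf when the JVM gateway is launched,
    # so this prints the configuration a new SparkContext would see.
    conf = SparkConf()
    for key, value in sorted(conf.getAll()):
        print(key, "=", value)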

Please help me identify this issue.

Edit: contents of spark-defaults.conf:


    spark.master                     spark://master:7077
    # spark.eventLog.enabled           true
    # spark.eventLog.dir               hdfs://namenode:8021/directory
    spark.serializer                 org.apache.spark.serializer.KryoSerializer
    spark.driver.memory              8g
    spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
    spark.driver.extraClassPath /opt/apps/spark-2.1.1-bin-hadoop2.7/jars/mysql-connector-java-5.1.35-bin.jar
    spark.executor.extraClassPath /opt/apps/spark-2.1.1-bin-hadoop2.7/jars/mysql-connector-java-5.1.35-bin.jar
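
Since the JVM is clearly reading the string "Ubuntu" from somewhere, a crude but effective check is to search every file under the conf directory for it (a minimal sketch; the SPARK_HOME default below is taken from the paths in the traceback):

    import os
    import pathlib

    # Scan every file in $SPARK_HOME/conf (spark-defaults.conf,
    # spark-env.sh, etc.) for the literal string the JVM failed to parse.
    spark_home = os.environ.get("SPARK_HOME", "/opt/apps/spark-2.1.1-bin-hadoop2.7")
    for path in pathlib.Path(spark_home, "conf").iterdir():
        if path.is_file():
            lines = path.read_text(errors="replace").splitlines()
            for lineno, line in enumerate(lines, 1):
                if "Ubuntu" in line:
                    print("{}:{}: {}".format(path, lineno, line))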


1 answer
#1 · Posted 2024-05-02 17:23:10

Have you checked your configuration files, e.g. spark-defaults.conf? This could be a parse error on a field that expects an integer. For example, if you tried to set spark.executor.cores to Ubuntu, you could get that exception.
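
To illustrate (a minimal repro sketch based on the example above, not code from the original answer): any integer-typed Spark property set to a non-numeric string makes the JVM throw the same NumberFormatException when the value is parsed, which for many properties happens while the SparkContext is being constructed.

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster("local").setAppName("repro")
    # spark.executor.cores must be an integer such as "2"; a stray string
    # reproduces java.lang.NumberFormatException: For input string: "Ubuntu"
    # on the JVM side when the property is read.
    conf.set("spark.executor.cores", "Ubuntu")

    sc = SparkContext(conf=conf)  # raises py4j.protocol.Py4JJavaError
                                  # wrapping the NumberFormatException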
