Java Spark is throwing an UnsatisfiedLinkError

I am running a clustered Vagrant setup with Ubuntu 14.04 and Java 8 installed on the master and the slave machines. The cluster starts up successfully and the slaves can connect, but I am not running Hadoop; instead, I am running the standalone version of Spark 1.2.1.
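(For context, the standalone master and workers are brought up in the usual way, roughly as follows; $SPARK_HOME here just stands for wherever the 1.2.1 binary distribution is unpacked, so treat this as a sketch of my setup rather than the exact commands:)

# on the master machine
$SPARK_HOME/sbin/start-master.sh      # master URL becomes spark://<master-ip>:7077

# also on the master, with the slave hostnames listed in $SPARK_HOME/conf/slaves
$SPARK_HOME/sbin/start-slaves.sh      # starts one worker per listed slave over ssh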

Then I copied the basic SparkPi example and compiled it with the following pom:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>dev.quant</groupId>
  <artifactId>neural-spark</artifactId>
  <version>1.0-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>My wonderful Scala app</description>
  <inceptionYear>2010</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.10</scala.tools.version>
    <scala.version>2.10.4</scala.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>

    <!-- Test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2_2.10</artifactId>
      <version>3.0-M1</version>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_2.10</artifactId>
      <version>3.0.0-SNAP4</version>
    </dependency>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.11</artifactId>
          <version>1.2.0</version>
      </dependency>
  </dependencies>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <executions>
                <execution>
                    <id>scala-compile-first</id>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>add-source</goal>
                        <goal>compile</goal>
                    </goals>
                </execution>
                <execution>
                    <id>scala-test-compile</id>
                    <phase>process-test-resources</phase>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.13</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have classpath issue like NoDefClassError,... -->
          <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>dev.quant.App</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </plugin>
    </plugins>
  </build>
</project>

which works with -> mvn -U clean scala:compile assembly:single
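(Side note on the pom above: spark-core is pulled in as the _2.11 build, while <scala.version> and every other Scala dependency target 2.10. If the intent is to match the 2.10.4 compiler and the 1.2.1 cluster, the dependency would presumably look more like the sketch below; I have not verified whether this by itself changes the error:)

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.1</version>
    </dependency>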

The program I am running is:

package dev.quant

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

import scala.math._

/**
 * @author ${user.name}
 */

object App {

  def foo(x : Array[String]) = x.foldLeft("")((a,b) => a + b)

  def main(args : Array[String]) {
    val conf = new SparkConf().setAppName("Spark Pi").setMaster("spark://10.0.0.2:7077").set("spark.executor.memory",".5g")
    val spark = new SparkContext(conf)
    val slices = 20
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.parallelize(1 until n, slices).map { i =>
        val x = random * 2 - 1
        val y = random * 2 - 1
        if (x*x + y*y < 1) 1 else 0
      }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }

}

So basically, after running -> mvn scala:run -DmainClass=dev.quant.App I get the following error:

15/03/04 22:46:15 INFO SecurityManager: Changing view acls to: dev
15/03/04 22:46:15 INFO SecurityManager: Changing modify acls to: dev
15/03/04 22:46:15 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dev); users with modify permissions: Set(dev)
15/03/04 22:46:16 INFO Slf4jLogger: Slf4jLogger started
15/03/04 22:46:16 INFO Remoting: Starting remoting
15/03/04 22:46:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.0.2:50662]
15/03/04 22:46:16 INFO Utils: Successfully started service 'sparkDriver' on port 50662.
15/03/04 22:46:16 INFO SparkEnv: Registering MapOutputTracker
15/03/04 22:46:16 INFO SparkEnv: Registering BlockManagerMaster
15/03/04 22:46:16 INFO DiskBlockManager: Created local directory at /tmp/spark-b8515bff-2915-4bc2-a917-fdb7c11849b5/spark-3527d111-3aac-4378-b493-17c92b394018
15/03/04 22:46:16 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1873)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:240)
    at dev.quant.App$.main(App.scala:18)
    at dev.quant.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:64)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:44)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:214)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    ... 15 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
    ... 22 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
    ... 27 more

I have also tried submitting the jar with spark-submit /path/to/my-jar, but to no avail. I have never seen this error before, but my first impression is that JniBasedUnixGroupsMappingWithFallback is some Java library that my binary Spark distribution depends on and that Java 8 does not ship with. Anyway, if you have any idea what it might be, please let me know.
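(For reference, the spark-submit attempt was along the lines of the following; the jar name is simply what the assembly plugin above produces, so treat the exact paths as placeholders:)

./bin/spark-submit \
  --class dev.quant.App \
  --master spark://10.0.0.2:7077 \
  target/neural-spark-1.0-SNAPSHOT-jar-with-dependencies.jar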


1 Answer