Java NoSuchMethodError for MapReduce on HBase

I want to launch a MapReduce job on my HBase 2.1.1 instance. When I execute the following code, I get this error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapreduce.Job.getArchiveSharedCacheUploadPolicies(Lorg/apache/hadoop/conf/Configuration;)Ljava/util/Map;
    at org.apache.hadoop.mapreduce.v2.util.MRApps.setupDistributedCache(MRApps.java:491)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:92)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:240)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at MapReduce.main(MapReduce.java:49)

I really don't know what I'm doing wrong. Here is my code:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MapReduce {

public static void main(String[] args) {
    String tableName = "actors";
    Configuration config = HBaseConfiguration.create();

    try {

    Job job = new Job(config);          // deprecated constructor; Job.getInstance(config) is the modern equivalent
    job.setJarByClass(MapReduce.class);     // class that contains mapper and reducer

    Scan scan = new Scan();
    scan.setCaching(500);        // 1 is the default in Scan, which will be bad for MapReduce jobs
    scan.setCacheBlocks(false);  // don't set to true for MR jobs
    // set other scan attrs


    TableMapReduceUtil.initTableMapperJob(
            tableName,          // input table
            scan,               // Scan instance to control CF and attribute selection
            MyMapper.class,     // mapper class
            Text.class,         // mapper output key
            IntWritable.class,  // mapper output value
            job);
    TableMapReduceUtil.initTableReducerJob(tableName, MyReducer.class, job);
    job.setNumReduceTasks(1);    // at least one, adjust as required
    FileOutputFormat.setOutputPath(job, new Path("/home/marcel/Desktop/MapReducerOutput"));  // adjust directories as required

    System.out.println("bis hierhin gut");

    boolean b = job.waitForCompletion(true);
    if (!b) {
        throw new IOException("error with job!");
    }
    } catch (IOException | ClassNotFoundException | InterruptedException e) {
        e.printStackTrace();
    }

}

}
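
For context on the error itself: the method the stack trace reports as missing, Job.getArchiveSharedCacheUploadPolicies(Configuration), only exists in newer Hadoop releases (it was added with the shared-cache support, around Hadoop 2.9), so a NoSuchMethodError here usually means an older hadoop-mapreduce-client-core jar is being picked up at runtime. A minimal diagnostic sketch (the class name WhichHadoopJar is made up for illustration) that prints which jar the Job class was actually loaded from:

import org.apache.hadoop.mapreduce.Job;

public class WhichHadoopJar {
    public static void main(String[] args) {
        // Prints the jar (or class directory) Job was loaded from, which
        // reveals the Hadoop version actually on the runtime classpath.
        System.out.println(Job.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());
    }
}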

/------------------------------------ EDIT ----------------------------------

For anyone who runs into the same problem: the issue seems to have been my dependencies. I fixed it with the following in my pom.xml file:

<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.9.2</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.9.2</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.9.2</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.4.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>1.4.0</version>
</dependency>

</dependencies>
