Java Hadoop 2.6.4 exception when using the Tool interface

I am writing a simple Hadoop job in Java. When I package it into a jar and run it from the CLI, it works. But when I try to use the Tool interface and ToolRunner, I get this exception:

Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:659)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:447)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:293)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:145)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1297)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1294)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1315)
    at pl.flomedia.hadoop.Main.run(Main.java:52)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at pl.flomedia.hadoop.Main.main(Main.java:28)

Here is my code (the driver/configuration part only):

package pl.flomedia.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class Main extends Configured implements Tool {

    //mvn clean package antrun:run@deploy
    public static void main(String[] args) throws Exception {
        int res = ToolRunner.run(new Configuration(), new Main(), args);
        System.exit(res);
    }

    @Override
    public int run(String[] args) throws Exception {
        // Point the job at the remote cluster and submit as the "vagrant" user
        Configuration conf = this.getConf();
        conf.set("mapred.job.tracker", "hadoop-master:8021");
        conf.set("fs.default.name", "hdfs://hadoop-master:9000/user/vagrant");
        conf.set("hadoop.job.ugi", "vagrant");
        System.setProperty("HADOOP_USER_NAME", "vagrant");

        // Quick connectivity check against HDFS
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.isDirectory(new Path("/user/vagrant")));
        //fs.mkdirs(new Path("input"));

        // Job wiring: mapper, reducer, sort comparator and output types
        Job job = Job.getInstance(conf, "passent");
        job.setJarByClass(Main.class);
        job.setMapperClass(PasswordMapper.class);
        job.setReducerClass(EntrophyPassReducer.class);
        job.setSortComparatorClass(EntrophyDescComparator.class);
        job.setOutputKeyClass(DoubleWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path("input"));
        FileOutputFormat.setOutputPath(job, new Path("output"));
        return job.waitForCompletion(true) ? 0 : 1;
    }

}
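
For comparison, the version that works when I package the job into a jar and submit it from the CLI is a plain driver without Tool/ToolRunner. I am sketching it from memory here, so the class name PlainMain and the exact wiring are approximations; the mapper, reducer and comparator are the same classes used above.

package pl.flomedia.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Approximate reconstruction of the plain driver that works when the jar is
// submitted with "hadoop jar"; the class name and exact wiring are assumptions.
public class PlainMain {

    public static void main(String[] args) throws Exception {
        // When launched via "hadoop jar", the default filesystem, the job
        // tracker address and the submitting user all come from the cluster's
        // *-site.xml files and the shell environment, so nothing is hard-coded.
        Configuration conf = new Configuration();

        Job job = Job.getInstance(conf, "passent");
        job.setJarByClass(PlainMain.class);
        job.setMapperClass(PasswordMapper.class);
        job.setReducerClass(EntrophyPassReducer.class);
        job.setSortComparatorClass(EntrophyDescComparator.class);
        job.setOutputKeyClass(DoubleWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path("input"));
        FileOutputFormat.setOutputPath(job, new Path("output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

One thing I notice in the failing stack trace above is that the job staging directory is being created through RawLocalFileSystem (the mkdirs/setPermission frames), i.e. on the local filesystem of the machine launching the job rather than on HDFS.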

Can anyone help me? :) Thanks in advance.


0 Answers