
Java: trying to import data from MySQL into Hive using SqoopOptions

I am trying to import data from MySQL into Hive using SqoopOptions, but I get the following error:

ERROR tool.ImportTool: Imported Failed: Wrong FS: hdfs://localhost:8020/user/hive/warehouse/default/emp/_logs, expected: file:///

It imports the data into HDFS, but not into Hive.

Here is my complete code:

import org.apache.sqoop.tool.ImportTool;
import com.cloudera.sqoop.SqoopOptions; 
public class App 
{
    public static void main( String[] args )
    {
        importToHive("emp");
    }
    /* CONSTANTS */
    private static final String JOB_NAME = "Sqoop Hive Job";
    private static final String MAPREDUCE_JOB = "Hive Map Reduce Job";
    private static final String DBURL ="jdbc:mysql://localhost:3306/sample";
    private static final String DRIVER = "com.mysql.jdbc.Driver";
    private static final String USERNAME = "root";
    private static final String PASSWORD = "cloudera";      
    private static final String HADOOP_HOME ="/usr/lib/hadoop-0.20-mapreduce";
    private static final String JAR_OUTPUT_DIR = "/tmp/sqoop/compile";
    private static final String HIVE_HOME = "/usr/lib/hive";
    private static final String HIVE_DIR = "/user/hive/warehouse/";
    private static final String WAREHOUSE_DIR = "hdfs://localhost:8020/user/hive/warehouse/default";
    private static final String SUCCESS = "SUCCESS !!!";
    private static final String FAIL = "FAIL !!!";

    /**
     * Imports data from MySQL over JDBC and loads it into Hive.
     */
    public static void importToHive(String table){

        System.out.println("SqoopOptions loading .....");

        /* MySQL connection parameters */
        SqoopOptions options = new SqoopOptions();
        options.setConnectString(DBURL);
        options.setTableName(table);
        options.setDriverClassName(DRIVER);
        options.setUsername(USERNAME);
        options.setPassword(PASSWORD);
        options.setHadoopMapRedHome(HADOOP_HOME);

        /* Hive connection parameters */
        options.setHiveHome(HIVE_HOME);
        options.setHiveImport(true);
        options.setHiveTableName("bsefmcgh");
        options.setOverwriteHiveTable(true);
        options.setFailIfHiveTableExists(false);
        //options.setFieldsTerminatedBy(',');
        options.setDirectMode(true);
        options.setNumMappers(1); // number of mappers to launch for the job
        options.setWarehouseDir(WAREHOUSE_DIR);
        options.setJobName(JOB_NAME);
        options.setMapreduceJobName(MAPREDUCE_JOB);
        options.setJarOutputDir(JAR_OUTPUT_DIR);

        System.out.println("Import Tool running ....");
        ImportTool it = new ImportTool();
        int retVal = it.run(options);
        System.out.println(retVal == 0 ? SUCCESS : FAIL);
    }
}

1 Answer

  1. # Answer 1

    I believe you do not need to specify the namenode address in the warehouse-dir option.

    Try this:

    private static final String WAREHOUSE_DIR = "/user/hive/warehouse/default";
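    For context on why this works: the message `Wrong FS: hdfs://... expected: file:///` usually means the client-side Hadoop `Configuration` still has the local filesystem as its default, so a fully-qualified `hdfs://` path is rejected when the tool resolves it. Dropping the scheme, as above, sidesteps the mismatch. The other common fix is to make sure the Hadoop config on the client classpath points at HDFS. A sketch of the relevant `core-site.xml` entry follows; the property is named `fs.default.name` on Hadoop 0.20-era CDH (as in the question's `HADOOP_HOME`) and `fs.defaultFS` on newer releases, and the host/port are assumptions copied from the question's error message:

    ```xml
    <!-- core-site.xml on the machine running the Sqoop job -->
    <configuration>
      <property>
        <!-- fs.defaultFS on newer Hadoop versions -->
        <name>fs.default.name</name>
        <value>hdfs://localhost:8020</value>
      </property>
    </configuration>
    ```

    With this in place, relative warehouse paths such as `/user/hive/warehouse/default` resolve against HDFS rather than the local filesystem.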