Java: Elasticsearch 2.4.1 returns an empty dataset via Spark
I am trying to read data from Elasticsearch 2.4.1 through Spark. It correctly identifies the schema, but reads no rows. Below are my dependencies:
```xml
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>commons-httpclient</groupId>
        <artifactId>commons-httpclient</artifactId>
        <version>3.1</version>
    </dependency>
    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch-spark_2.11</artifactId>
        <version>2.4.1</version>
    </dependency>
</dependencies>
```
Below is my code (note: "Kyro" was a typo for Spark's `KryoSerializer`):

```java
SparkSession spark = SparkSession.builder()
        .appName("An1")
        .master("local[*]")
        .getOrCreate();

Dataset<Row> usersDF = spark.read()
        .format("org.elasticsearch.spark.sql")
        .option("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .option("es.port", "9200")
        .option("es.nodes", "localhost")
        .load("abc/asset");

System.out.println(usersDF.count());
usersDF.show();
```
0 Answers