Java Spark SQL doesn't support date format
I tried to use a date field with Spark SQL, but it doesn't work.
I tried to add a Date column, dob.
In the Person class I added a setter and a getter for dob as a Date.
The error occurs when trying to execute:
SELECT dob,name,age,count(*) as totalCount FROM Person WHERE dob >= '1995-01-01' AND age <= '2014-02-01';
I also tried using BETWEEN in the query instead of <= and >=.
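For reference, the bean described above would look roughly like this (a minimal sketch; the field names are assumed from the query, and it is the `java.util.Date` field that Spark SQL 1.0.x's bean schema inference cannot map, producing the `scala.MatchError` below):

```java
import java.io.Serializable;
import java.util.Date;

// Hypothetical Person bean as described in the question.
// Spark SQL 1.0.x does not recognize java.util.Date as a bean-property
// type during schema inference, so registering this class fails.
public class Person implements Serializable {
    private String name;
    private int age;
    private Date dob; // unsupported type in Spark SQL 1.0.x

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    public Date getDob() { return dob; }
    public void setDob(Date dob) { this.dob = dob; }
}
```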
/Volumes/Official/spark-1.0.2-bin-hadoop2$: bin/spark-submit --class "SimpleApp" --master local[4] try/simple-project/target/simple-project-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
2014-08-21 11:42:47.360 java[955:1903] Unable to load realm mapping info from SCDynamicStore
=== Data source: RDD ===
Exception in thread "main" scala.MatchError: class java.util.Date (of class java.lang.Class)
# Answer 1
Date support is still pending: instead of
Date
you can use Timestamp in the Person
class.
See SPARK-2552; we will have to wait a while, until the 1.2.0 release.
Details:
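The suggested workaround can be sketched as follows (a minimal sketch, assuming the same Person bean as in the question; `java.sql.Timestamp` is a bean-property type that Spark SQL 1.0.x schema inference does recognize):

```java
import java.io.Serializable;
import java.sql.Timestamp;

// Workaround sketch: declare dob as java.sql.Timestamp instead of
// java.util.Date, so Spark SQL 1.0.x can infer a schema for the bean.
public class Person implements Serializable {
    private String name;
    private int age;
    private Timestamp dob; // supported type in Spark SQL 1.0.x

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    public Timestamp getDob() { return dob; }
    public void setDob(Timestamp dob) { this.dob = dob; }
}
```

Values can be built with `Timestamp.valueOf("1995-01-01 00:00:00")`, and the query's date comparisons against string literals should then work against the dob column.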