Timestamp incompatibility (NiFi PutSQL)

Published 2024-09-30 20:28:04


I'm having a problem inserting timestamps into a PostgreSQL database using the NiFi PutSQL processor.

More specifically, when trying to insert a date of the form "2018-01-31T19:01:09+00:00" into a timestamptz column, I get the following error message:

2018-04-01 19:29:40,091 ERROR [Timer-Driven Process Thread-5] o.apache.nifi.processors.standard.PutSQL PutSQL[id=7997503a-0162-1000-ee81-a0361cad5e0c] Failed to update database for StandardFlowFileRecord[uuid=d02e8b39-e564-4c37-a08a-dab8931e9890,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1522615075930-15, container=default, section=15], offset=11492, length=163],offset=0,name=32836401373126,size=163] due to java.sql.SQLDataException: The value of the sql.args.5.value is '2018-01-31T20:19:35+00:00', which cannot be converted to a timestamp; routing to failure: java.sql.SQLDataException: The value of the sql.args.5.value is '2018-01-31T20:19:35+00:00', which cannot be converted to a timestamp
java.sql.SQLDataException: The value of the sql.args.5.value is '2018-01-31T20:19:35+00:00', which cannot be converted to a timestamp
    at org.apache.nifi.processors.standard.PutSQL.setParameters(PutSQL.java:711)
    at org.apache.nifi.processors.standard.PutSQL.lambda$null$5(PutSQL.java:313)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
    at org.apache.nifi.processors.standard.PutSQL.lambda$new$6(PutSQL.java:311)
    at org.apache.nifi.processors.standard.PutSQL.lambda$new$9(PutSQL.java:354)
    at org.apache.nifi.processor.util.pattern.PutGroup.putFlowFiles(PutGroup.java:91)
    at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
    at org.apache.nifi.processors.standard.PutSQL.lambda$onTrigger$20(PutSQL.java:574)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.standard.PutSQL.onTrigger(PutSQL.java:574)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.text.ParseException: Unparseable date: "2018-01-31T20:19:35+00:00"
    at java.text.DateFormat.parse(DateFormat.java:366)
    at org.apache.nifi.processors.standard.PutSQL.setParameter(PutSQL.java:911)
    at org.apache.nifi.processors.standard.PutSQL.setParameters(PutSQL.java:707)
    ... 21 common frames omitted

I have tested inserting '2018-01-31T19:01:09+00:00' into a timestamptz column from the command line, and it works perfectly. I have also tried a variety of other formats, for example:

  • '2018-01-31 19:01:09+00:00'
  • '2018-01-31 19:01:09+00'
  • '2018-01-31T19:01:09+00'
  • '2018-01-31 19:01:09'

In NiFi they all fail with the same error, even though every one of them inserts fine when the INSERT is executed from the command line.
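Judging from the ParseException in the stack trace, PutSQL parses the sql.args.N.value timestamps with java.text.DateFormat, which does not accept the ISO-8601 offset form. One workaround is to reformat the value upstream before it ever reaches PutSQL. A minimal sketch of that conversion in Python (the target pattern 'yyyy-MM-dd HH:mm:ss' is an assumption about what the processor's parser accepts; note that it drops the offset, which here is UTC):

```python
from datetime import datetime

# The offending ISO-8601 value from the error message
value = '2018-01-31T20:19:35+00:00'

# Parse the offset-aware timestamp, then re-emit it in a plain
# 'yyyy-MM-dd HH:mm:ss' form without the '+00:00' suffix
ts = datetime.fromisoformat(value)
print(ts.strftime('%Y-%m-%d %H:%M:%S'))  # 2018-01-31 20:19:35
```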

Please find a screenshot of my flow attached. Let me know if you need any more details.

Part of NiFi flow

To be honest, I would rather avoid a Java conversion, since keeping the datetime as a string and inserting it directly into the Postgres database works fine. I tried forcing this with an UpdateAttribute processor, but that led to other errors.

I have come across various questions on this topic, but I still can't figure out what is actually going on.
2 Answers

You can use UpdateAttribute before PutSQL, with the toDate() and {a2} Expression Language functions, to convert the timestamp value into one the database will accept.
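As a sketch of that approach (hedged: the attribute name sql.args.5.value is taken from the error message above, and the format patterns are assumptions about this particular flow), the UpdateAttribute rule for that attribute might look like:

```
${sql.args.5.value:toDate("yyyy-MM-dd'T'HH:mm:ssXXX"):format("yyyy-MM-dd HH:mm:ss")}
```

Here toDate() parses the ISO-8601 value using a SimpleDateFormat-style pattern, and format() re-emits it in a form PutSQL can convert.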

Alternatively, you can skip the SplitText, ConvertJSONToSQL, and PutSQL steps entirely by using PutDatabaseRecord, configured with a RecordReader that accepts your timestamp format and converts it accordingly. If that works, it is the better solution, since it processes the whole flowfile at once rather than individual rows.

I solved this by using an ExecuteStreamCommand processor that calls a Python script, which converts each JSON line into its corresponding SQL INSERT statement. The table of interest in this example is reddit_post.

The code of the Python script (I know the INSERT argument isn't needed yet, but that's because I plan to add an UPDATE option later):

import json
import argparse
import sys

# For command line arguments
parser = argparse.ArgumentParser(description='Converts JSON to respective SQL statement')
parser.add_argument('statement_type', type=str, nargs=1)
parser.add_argument('table_name', type=str, nargs=1)

# Reading the command line arguments (parse once, reuse the result)
args = parser.parse_args()
statement_type = args.statement_type[0]
table_name = args.table_name[0]

for line in sys.stdin:
  # Load JSON line
  json_line = json.loads(line)

  # Start a fresh statement for each input line
  statement = ''

  # Add table name and SQL syntax
  if statement_type == 'INSERT':
    statement += 'INSERT INTO {} '.format(table_name)

  # Add column names and SQL syntax
  statement += '({}) '.format(', '.join(json_line.keys()))

  # Add table values and SQL syntax
  # Note that strings are single-quoted (with embedded quotes doubled);
  # other data types are converted to strings (for the join method)
  statement += "VALUES ({});".format(', '.join(
      "'{0}'".format(value.replace("'", "''")) if isinstance(value, str) else str(value)
      for value in json_line.values()))

  # Send statement to stdout
  print(statement)
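For reference, the per-line conversion the script performs can be exercised as a plain function; this is an illustrative sketch (json_line_to_insert and the sample row are made-up names, not part of the flow):

```python
import json

def json_line_to_insert(line, table_name):
    # Mirror of the script's conversion for a single JSON line:
    # keys become column names, string values are single-quoted
    # with embedded quotes doubled, other values are stringified.
    row = json.loads(line)
    columns = ', '.join(row.keys())
    values = ', '.join(
        "'{0}'".format(v.replace("'", "''")) if isinstance(v, str) else str(v)
        for v in row.values())
    return 'INSERT INTO {} ({}) VALUES ({});'.format(table_name, columns, values)

print(json_line_to_insert('{"id": 42, "title": "it\'s here"}', 'reddit_post'))
# INSERT INTO reddit_post (id, title) VALUES (42, 'it''s here');
```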

Configuration of ExecuteStreamCommand (note that the Argument Delimiter is set to a single space):

Snippet of the flow:

I hope this helps anyone running into a similar problem. If you have any suggestions on how to improve the script, the flow, or anything else, please don't hesitate to let me know!
