MySqlToGoogleCloudStorageOperator fails unexpectedly

Date: 2018-10-08 09:18:37

Tags: airflow

I have the following code:

# The {} is required by the operator: if the result is big, it splits the
# output into multiple files, e.g. 1.json, 2.json, etc.
file_name = gcs_export_uri_template + '/' + TABLE_PREFIX + '_' + TABLE_NAME + '{}.json'
import_orders_op = MySqlToGoogleCloudStorageOperator(
    task_id='import_orders',
    mysql_conn_id='sqlcon',
    google_cloud_storage_conn_id='gcpcon',
    provide_context=True,
    sql=""" SELECT * FROM {{ params.table_name }} WHERE orders_id > {{ params.last_imported_id }} AND orders_id < {{ ti.xcom_pull('get_max_order_id') }} limit 10 """,
    params={'last_imported_id': LAST_IMPORTED_ORDER_ID, 'table_name' :  TABLE_NAME},
    bucket=GCS_BUCKET_ID,
    filename=file_name,
    dag=dag) 
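
For context, the {} placeholder in file_name is filled in with a running chunk counter when the operator splits a large result set across several files. A minimal sketch of the naming scheme, assuming the operator simply substitutes the chunk number via str.format (the template value here is hypothetical):

# Hypothetical template, mirroring the file_name built above.
file_name = 'orders_export/prefix_orders{}.json'

# Assuming a zero-based chunk counter substituted with str.format:
for file_no in range(3):
    print(file_name.format(file_no))
# orders_export/prefix_orders0.json
# orders_export/prefix_orders1.json
# orders_export/prefix_orders2.json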

This works fine. Note, however, that the query ends with limit 10. When I remove it, leaving:

sql=""" SELECT * FROM {{ params.table_name }} WHERE orders_id > {{ params.last_imported_id }} AND orders_id < {{ ti.xcom_pull('get_max_order_id') }} """,

it fails with:

[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask: Traceback (most recent call last):
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/bin/airflow", line 27, in <module>
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask:     args.func(args)
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 392, in run
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask:     pool=args.pool,
[2018-10-08 09:09:38,830] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/lib/python2.7/dist-packages/airflow/utils/db.py", line 50, in wrapper
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:     result = func(*args, **kwargs)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 1493, in _run_raw_task
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:     result = task_copy.execute(context=context)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/mysql_to_gcs.py", line 89, in execute
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:     files_to_upload = self._write_local_data_files(cursor)
[2018-10-08 09:09:38,831] {base_task_runner.py:98} INFO - Subtask:   File "/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/mysql_to_gcs.py", line 134, in _write_local_data_files
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:     json.dump(row_dict, tmp_file_handle)
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:   File "/usr/lib/python2.7/json/__init__.py", line 189, in dump
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:     for chunk in iterable:
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:   File "/usr/lib/python2.7/json/encoder.py", line 434, in _iterencode
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:     for chunk in _iterencode_dict(o, _current_indent_level):
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:   File "/usr/lib/python2.7/json/encoder.py", line 390, in _iterencode_dict
[2018-10-08 09:09:38,832] {base_task_runner.py:98} INFO - Subtask:     yield _encoder(value)
[2018-10-08 09:09:38,833] {base_task_runner.py:98} INFO - Subtask: UnicodeDecodeError: 'utf8' codec can't decode byte 0xa0 in position 5: invalid start byte

I can only assume the cause is the file_name ending in {}.json: could it be that the result has too many records, so the operator needs to split the file and can't?

I am running Airflow 1.9.0.

What is the problem here?

1 Answer:

Answer 0: (score: 1)

Your limit 10 happens to return 10 clean rows of plain ASCII. The larger select, however, returns content that cannot be decoded as UTF-8. I hit the same thing when my MySQL Connection had nothing set in its Extras.

If there are no extras at all, edit the connection and add {"charset": "utf8"} to the Extras field. If you already have extras, just add that key/value pair to the existing set.
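
If you prefer scripting the change to clicking through the Admin UI, here is a minimal sketch that updates the extras of the question's sqlcon connection through the Airflow metadata DB (assuming a standard Airflow 1.9 setup; run it wherever airflow is configured):

import json

from airflow import settings
from airflow.models import Connection

session = settings.Session()
conn = session.query(Connection).filter(Connection.conn_id == 'sqlcon').one()

# Merge the charset into whatever extras already exist on the connection.
extra = json.loads(conn.extra) if conn.extra else {}
extra['charset'] = 'utf8'
conn.extra = json.dumps(extra)

session.commit()
session.close()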

This should establish the encoding for the MySQL client that the hook uses to retrieve the records, and things should start decoding correctly. Whether they then get written to GCS correctly is left to you as an exercise.
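
Roughly speaking (paraphrased from the Airflow 1.9 MySqlHook behavior, not a verbatim copy of its source), the charset extra ends up in the keyword arguments passed to the MySQLdb client, and utf8 also enables use_unicode so rows come back as unicode strings instead of raw bytes:

import MySQLdb

# Hypothetical connection details for illustration; charset and use_unicode
# are the parts driven by the {"charset": "utf8"} extra.
conn = MySQLdb.connect(
    user='mysql_user',
    passwd='mysql_password',
    host='mysql-host',
    db='orders_db',
    charset='utf8',
    use_unicode=True,
)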
