I am running a Dataflow pipeline from Composer -> Airflow. When I run the following code:
(table_output | beam.io.Write(
    beam.io.BigQuerySink(
        <dataset.table>,
        <schema>,
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)))
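For reference, a minimal self-contained version of this write step looks like the sketch below; the table string, schema, and input rows are placeholders, not my real pipeline values:

import apache_beam as beam

# Minimal sketch of the same write step; the table, schema, and input
# rows are placeholder assumptions.
with beam.Pipeline() as p:
    table_output = p | beam.Create([{'name': 'a', 'value': 1}])
    (table_output | beam.io.Write(
        beam.io.BigQuerySink(
            'my-project:my_dataset.my_table',
            schema='name:STRING,value:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)))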
it fails with the following error: AttributeError: 'module' object has no attribute 'storage'
What am I missing? Here is the stack trace:
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 766, in run
self._load_main_session(self.local_staging_directory)
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 482, in _load_main_session
pickler.load_session(session_file)
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 254, in load_session
return dill.load_session(file_path)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
module = unpickler.load()
File "/usr/lib/python2.7/pickle.py", line 864, in load
dispatch[key](self)
File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
value = func(*args)
File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
return getattr(__import__(module, None, None, [obj]), obj)
AttributeError: 'module' object has no attribute 'storage'
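Reading the last two frames, dill's _import_module is effectively doing the following when it fails; the 'google.cloud' / 'storage' names in this sketch are my guess from the missing attribute, not something the trace confirms:

# Equivalent of the failing call in dill/dill.py line 767 above;
# the module/attribute names are an assumption based on the error message.
module, obj = 'google.cloud', 'storage'
pkg = __import__(module, None, None, [obj])  # import the parent package
getattr(pkg, obj)  # raises AttributeError: 'module' object has no attribute
                   # 'storage' if that attribute cannot be resolved on the worker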