<p><code>GoogleCloudStorageToGoogleCloudStorageOperator</code> is not available in v1.9.0, so you will have to copy the operator file from <a href="https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/operators/gcs_to_gcs.py" rel="nofollow noreferrer">here</a>, copy the related hook from <a href="https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/hooks/gcs_hook.py" rel="nofollow noreferrer">here</a>, and paste them into the corresponding locations inside the Airflow folder of your Python environment. Follow the steps below:</p>
<p>Run the following command to find where Apache Airflow is installed on your machine:</p>
<pre><code>pip show apache-airflow
</code></pre>
<p>It will produce output like the following in your terminal:</p>
<pre><code>Name: apache-airflow
Version: 1.9.0
...
Location: /Users/kaxil/anaconda2/lib/python2.7/site-packages
</code></pre>
<p>The path after <strong>Location:</strong> is your <strong>Apache Airflow</strong> directory.</p>
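<p>If you prefer, you can look up the same directory from Python itself. This is a small generic sketch using only the standard library (a cross-check I am adding, not part of the original answer); it prints the <code>site-packages</code> path that <code>pip show</code> reports as <strong>Location:</strong></p>
<pre><code># Print the site-packages directory of the current Python environment
import sysconfig

site_packages = sysconfig.get_paths()["purelib"]
print(site_packages)
</code></pre>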
<p>Now clone the git repo to get these two files:</p>
<pre><code># Clone the git repo to `airflow-temp` folder
git clone https://github.com/apache/incubator-airflow airflow-temp
# Copy the hook from the cloned repo to where Apache Airflow is located
# Replace LINK_TO_SITE_PACKAGES_DIR with the path you found above
cp airflow-temp/airflow/contrib/hooks/gcs_hook.py LINK_TO_SITE_PACKAGES_DIR/airflow/contrib/hooks/
# For example: for me, it would be
cp airflow-temp/airflow/contrib/hooks/gcs_hook.py /Users/kaxil/anaconda2/lib/python2.7/site-packages/airflow/contrib/hooks/
# Do the same with operator file
cp airflow-temp/airflow/contrib/operators/gcs_to_gcs.py LINK_TO_SITE_PACKAGES_DIR/airflow/contrib/operators/
# For example: for me, it would be
cp airflow-temp/airflow/contrib/operators/gcs_to_gcs.py /Users/kaxil/anaconda2/lib/python2.7/site-packages/airflow/contrib/operators/
</code></pre>
<p>Restart the Airflow <code>webserver</code> and <code>scheduler</code>, and it should now work.</p>
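<p>To confirm the copied files landed in the right place, you can check that both modules can be found on the Python path before restarting anything. The helper below is a hypothetical sketch (the function name <code>is_importable</code> is mine, not part of Airflow):</p>
<pre><code># Check whether a module can be located on the current Python path
import importlib.util

def is_importable(module_name):
    """Return True if the named module can be found, False otherwise."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. airflow itself) also means not importable
        return False

# Module paths taken from the copy steps above
for mod in ["airflow.contrib.hooks.gcs_hook",
            "airflow.contrib.operators.gcs_to_gcs"]:
    print(mod, "->", "OK" if is_importable(mod) else "MISSING")
</code></pre>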