#. By default, Application Default Credentials are used to obtain credentials. You can also
   set the ``google_key_path`` option in the ``[logging]`` section if you want to use your own
   service account; a configuration sketch follows this list.
#. Make sure the Google Cloud Platform account has read and write access to the Google Cloud Storage bucket defined above in ``remote_base_log_folder``.
#. Install the ``google`` package, like so: ``pip install 'apache-airflow[google]'``.
#. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
#. Verify that logs are showing up for newly executed tasks in the bucket you've defined.
#. Verify that the Google Cloud Storage viewer is working in the UI. Pull up a newly executed task, and verify that you see something like:
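
Putting the steps together, the relevant ``[logging]`` settings in ``airflow.cfg`` might look
like the sketch below. The bucket name and key file path are placeholders rather than values
from this guide, and ``google_key_path`` can be omitted to fall back to Application Default
Credentials.

.. code-block:: ini

    [logging]
    # Ship task logs to a remote store instead of keeping them only on local disk.
    remote_logging = True
    # Placeholder bucket; use the GCS path you created for task logs.
    remote_base_log_folder = gs://my-airflow-logs/task-logs
    # Optional: a service account key file. Omit this line to use
    # Application Default Credentials instead.
    google_key_path = /path/to/service-account-key.json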
For integration with Stackdriver, this option should start with ``stackdriver://``.
The path section of the URL specifies the name of the log, e.g. ``stackdriver://airflow-tasks`` writes
logs under the name ``airflow-tasks``.
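
A minimal ``[logging]`` sketch for the Stackdriver case, assuming the ``airflow-tasks`` log
name from the example above and a placeholder key file path:

.. code-block:: ini

    [logging]
    remote_logging = True
    # Write task logs to Stackdriver under the log name "airflow-tasks".
    remote_base_log_folder = stackdriver://airflow-tasks
    # Optional: a service account key file. Omit this line to use
    # Application Default Credentials instead.
    google_key_path = /path/to/service-account-key.json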
You can set the ``google_key_path`` option in the ``[logging]`` section to specify the path to `the service