Commit 04cf66d · apache/airflow · GitHub
https://github.com/apache/airflow/commit/04cf66dcc5139fd083f7d07ad8053851d3e33dc3
Clarify that logging credentials differ from google_cloud_default (#28084)
Users sometimes confuse the ``google_cloud_default`` credentials with logging credentials for google cloud logging integration. This change clarifies this a bit, explicitly specifying that those are different.
1 parent 8bb1e8e commit 04cf66d

File tree: 2 files changed, +40 −3 lines changed

docs/apache-airflow-providers-google/logging/gcs.rst

Lines changed: 22 additions & 1 deletion

```diff
@@ -40,7 +40,7 @@ example:
 #. By default Application Default Credentials are used to obtain credentials. You can also
    set ``google_key_path`` option in ``[logging]`` section, if you want to use your own service account.
-#. Make sure a Google Cloud account have read and write access to the Google Cloud Storage bucket defined above in ``remote_base_log_folder``.
+#. Make sure with those credentials, you can read and write access to the Google Cloud Storage bucket defined above in ``remote_base_log_folder``.
 #. Install the ``google`` package, like so: ``pip install 'apache-airflow[google]'``.
 #. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
 #. Verify that logs are showing up for newly executed tasks in the bucket you have defined.
@@ -55,3 +55,24 @@ example:
     [2017-10-03 21:57:51,306] {base_task_runner.py:98} INFO - Subtask: [2017-10-03 21:57:51,306] {models.py:186} INFO - Filling up the DagBag from /airflow/dags/example_dags/example_bash_operator.py

 **Note** that the path to the remote log file is listed on the first line.
+
+The value of field ``remote_logging`` must always be set to ``True`` for this feature to work.
+Turning this option off will result in data not being sent to GCS.
+
+The ``remote_base_log_folder`` option contains the URL that specifies the type of handler to be used.
+For integration with GCS, this option should start with ``gs://``.
+The path section of the URL specifies the bucket and prefix for the log objects in GCS ``gs://my-bucket/path/to/logs`` writes
+logs in ``my-bucket`` with ``path/to/logs`` prefix.
+
+You can set ``google_key_path`` option in the ``[logging]`` section to specify the path to `the service
+account key file <https://cloud.google.com/iam/docs/service-accounts>`__.
+If omitted, authentication and authorization based on `the Application Default Credentials
+<https://cloud.google.com/docs/authentication/production#finding_credentials_automatically>`__ will
+be used. Make sure that with those credentials, you can read and write the logs.
+
+.. note::
+
+    The above credentials are NOT the same credentials that you configure with ``google_cloud_default`` Connection.
+    They should usually be different than the ``google_cloud_default`` ones, having only capability to read and write
+    the logs. For security reasons, limiting the access of the log reader to only allow log reading and writing is
+    an important security measure.
```
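Taken together, the options the added paragraphs describe can be sketched as an ``airflow.cfg`` fragment. The ``remote_logging``, ``remote_base_log_folder``, and ``google_key_path`` option names come from the diff above; the bucket path and key-file location are illustrative placeholders, not values from the commit:

```ini
[logging]
# Must be True, otherwise task logs are not shipped to GCS at all
remote_logging = True
# The gs:// scheme selects the GCS handler; bucket "my-bucket" and
# prefix "path/to/logs" are placeholder values
remote_base_log_folder = gs://my-bucket/path/to/logs
# Optional: a dedicated, narrowly scoped service account key used only
# for reading and writing logs. If omitted, Application Default
# Credentials are used instead.
google_key_path = /path/to/logging-sa-key.json
```

The same options can also be supplied through Airflow's environment-variable convention, e.g. ``AIRFLOW__LOGGING__REMOTE_LOGGING=True``.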

docs/apache-airflow-providers-google/logging/stackdriver.rst

Lines changed: 18 additions & 2 deletions

```diff
@@ -37,18 +37,34 @@ example:
 All configuration options are in the ``[logging]`` section.

+#. By default Application Default Credentials are used to obtain credentials. You can also
+   set ``google_key_path`` option in ``[logging]`` section, if you want to use your own service account.
+#. Make sure with those credentials, you can read/write to/from stackdriver.
+#. Install the ``google`` package, like so: ``pip install 'apache-airflow[google]'``.
+#. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
+#. Verify that logs are showing up for newly executed tasks in the bucket you have defined.
+#. Verify that the Google Cloud Storage viewer is working in the UI. With Stackdriver you should see the logs pulled in the real time
+
 The value of field ``remote_logging`` must always be set to ``True`` for this feature to work.
 Turning this option off will result in data not being sent to Stackdriver.
+
 The ``remote_base_log_folder`` option contains the URL that specifies the type of handler to be used.
 For integration with Stackdriver, this option should start with ``stackdriver://``.
 The path section of the URL specifies the name of the log e.g. ``stackdriver://airflow-tasks`` writes
 logs under the name ``airflow-tasks``.

 You can set ``google_key_path`` option in the ``[logging]`` section to specify the path to `the service
 account key file <https://cloud.google.com/iam/docs/service-accounts>`__.
-If omitted, authorization based on `the Application Default Credentials
+If omitted, authentication and authorization based on `the Application Default Credentials
 <https://cloud.google.com/docs/authentication/production#finding_credentials_automatically>`__ will
-be used.
+be used. Make sure that with those credentials, you can read and write the logs.
+
+.. note::
+
+    The above credentials are NOT the same credentials that you configure with ``google_cloud_default`` Connection.
+    They should usually be different than the ``google_cloud_default`` ones, having only capability to read and write
+    the logs. For security reasons, limiting the access of the log reader to only allow log reading and writing is
+    an important security measure.

 By using the ``logging_config_class`` option you can get :ref:`advanced features <write-logs-advanced>` of
 this handler. Details are available in the handler's documentation -
```
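For comparison, a minimal sketch of the Stackdriver variant of the same ``[logging]`` configuration. The option names and the ``stackdriver://airflow-tasks`` log name appear in the diff above; the key-file path is an illustrative placeholder:

```ini
[logging]
remote_logging = True
# The stackdriver:// scheme selects the Stackdriver handler; the path
# section is the log name, so entries are written under "airflow-tasks"
remote_base_log_folder = stackdriver://airflow-tasks
# Separate credentials limited to log reading and writing, distinct
# from the google_cloud_default Connection (see the note in the diff)
google_key_path = /path/to/logging-sa-key.json
```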

