Commit 9a623e9

migrate system test gcs_to_bigquery into new design (#22753)
1 parent: 49e336a

File tree: 3 files changed (+32 additions, -44 deletions)

docs/apache-airflow-providers-google/operators/cloud/gcs.rst

Lines changed: 1 addition & 1 deletion

@@ -41,7 +41,7 @@ Use the
 :class:`~airflow.providers.google.cloud.transfers.gcs_to_bigquery.GCSToBigQueryOperator`
 to execute a BigQuery load job.
 
-.. exampleinclude:: /../../airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py
+.. exampleinclude:: /../../tests/system/providers/google/gcs/example_gcs_to_bigquery.py
     :language: python
     :dedent: 4
     :start-after: [START howto_operator_gcs_to_bigquery]
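The documentation hunk above only swaps the path given to the ``exampleinclude`` directive; the rendered snippet is still delimited by the ``[START ...]``/``[END ...]`` comment markers inside the example file, which is why the docs keep working after the move. A minimal sketch of that marker-based extraction, in plain Python (an illustration of the idea, not Sphinx's actual implementation):

```python
# Illustrative sketch, not Sphinx's real code: select only the region
# between a start-after marker and an end-before marker, the way the
# exampleinclude directive's :start-after:/:end-before: options do.
def extract_snippet(text: str, start_marker: str, end_marker: str) -> str:
    lines = text.splitlines()
    start = next(i for i, line in enumerate(lines) if start_marker in line) + 1
    end = next(i for i, line in enumerate(lines) if end_marker in line)
    return "\n".join(lines[start:end])


source = """\
setup_tasks()
# [START howto_operator_gcs_to_bigquery]
load_csv = GCSToBigQueryOperator(task_id='gcs_to_bigquery_example')
# [END howto_operator_gcs_to_bigquery]
teardown_tasks()
"""

snippet = extract_snippet(
    source,
    "[START howto_operator_gcs_to_bigquery]",
    "[END howto_operator_gcs_to_bigquery]",
)
print(snippet)  # only the operator line between the two markers
```

Because the snippet is addressed by marker rather than by line number, renaming the file is a one-line docs change and edits elsewhere in the example do not break the rendered docs.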

tests/providers/google/cloud/transfers/test_gcs_to_bigquery_system.py

Lines changed: 0 additions & 36 deletions
This file was deleted.

airflow/providers/google/cloud/example_dags/example_gcs_to_bigquery.py renamed to tests/system/providers/google/gcs/example_gcs_to_bigquery.py

Lines changed: 31 additions & 7 deletions

@@ -29,19 +29,24 @@
     BigQueryDeleteDatasetOperator,
 )
 from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
+from airflow.utils.trigger_rule import TriggerRule
 
-DATASET_NAME = os.environ.get("GCP_DATASET_NAME", 'airflow_test')
-TABLE_NAME = os.environ.get("GCP_TABLE_NAME", 'gcs_to_bq_table')
+ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
+DAG_ID = "gcs_to_bigquery_operator"
+
+DATASET_NAME = f"dataset_{DAG_ID}_{ENV_ID}"
+TABLE_NAME = "test"
+PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT")
 
 with models.DAG(
-    dag_id='example_gcs_to_bigquery_operator',
+    dag_id=DAG_ID,
+    schedule_interval='@once',
     start_date=datetime(2021, 1, 1),
     catchup=False,
-    schedule_interval='@once',
-    tags=['example'],
+    tags=['example', "gcs"],
 ) as dag:
     create_test_dataset = BigQueryCreateEmptyDatasetOperator(
-        task_id='create_airflow_test_dataset', dataset_id=DATASET_NAME
+        task_id='create_airflow_test_dataset', dataset_id=DATASET_NAME, project_id=PROJECT_ID
     )
 
     # [START howto_operator_gcs_to_bigquery]
@@ -62,6 +67,25 @@
         task_id='delete_airflow_test_dataset',
         dataset_id=DATASET_NAME,
         delete_contents=True,
+        trigger_rule=TriggerRule.ALL_DONE,
+    )
+
+    (
+        # TEST SETUP
+        create_test_dataset
+        # TEST BODY
+        >> load_csv
+        # TEST TEARDOWN
+        >> delete_test_dataset
     )
 
-    create_test_dataset >> load_csv >> delete_test_dataset
+    from tests.system.utils.watcher import watcher
+
+    # This test needs watcher in order to properly mark success/failure
+    # when "tearDown" task with trigger rule is part of the DAG
+    list(dag.tasks) >> watcher()
+
+from tests.system.utils import get_test_run  # noqa: E402
+
+# Needed to run the example DAG with pytest (see: tests/system/README.md#run_via_pytest)
+test_run = get_test_run(dag)
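A key part of the new design visible in this hunk is that resource names are derived from ``SYSTEM_TESTS_ENV_ID``, so concurrent test environments create distinct BigQuery datasets instead of colliding on a shared ``airflow_test`` name. A minimal sketch of that naming scheme (the ``"demo"`` fallback is an assumption added for illustration; the real file calls ``os.environ.get`` with no default, so ``ENV_ID`` is ``None`` when the variable is unset):

```python
import os

# Sketch of the migrated test's naming scheme. The "demo" fallback is an
# assumption for illustration only; the real test reads SYSTEM_TESTS_ENV_ID
# from the CI environment without a default.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID", "demo")
DAG_ID = "gcs_to_bigquery_operator"

# Each environment gets its own dataset, e.g. dataset_gcs_to_bigquery_operator_demo
DATASET_NAME = f"dataset_{DAG_ID}_{ENV_ID}"
print(DATASET_NAME)
```

Per-environment naming is what makes the ``TriggerRule.ALL_DONE`` teardown safe: ``delete_airflow_test_dataset`` runs whether or not ``load_csv`` succeeded, and it can delete the whole dataset with ``delete_contents=True`` because no other test run shares it.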

0 commit comments