Using dynamic task mapping in TriggerDagRunOperator may generate the same run_id #28868
Comments
I guess that we can check if the task is dynamic mapping (maybe by checking …)
The real problem is not actually the run ID, but the logical date (execution date), which also has a unique constraint. Even if the run ID duplication is “fixed”, you still can’t schedule multiple runs under the same logical date. So that’s the thing you must explicitly pass to TriggerDagRunOperator.
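The workaround the comment above points at can be sketched without Airflow at all: give each mapped trigger its own logical date, and the derived run IDs stay distinct. A minimal stdlib sketch follows; the `manual__<ISO timestamp>` format below mirrors how Airflow derives manual run IDs but is an illustration, not the exact implementation:

```python
from datetime import datetime, timedelta, timezone

# Assumption for illustration: a manual run's ID is derived from
# its logical date, so distinct dates yield distinct run IDs.
base = datetime(2023, 1, 11, tzinfo=timezone.utc)

# Offset each of the 100 triggers by one second so every
# (dag_id, logical_date) pair -- and thus every run_id -- is unique.
logical_dates = [base + timedelta(seconds=i) for i in range(100)]
run_ids = [f"manual__{d.isoformat()}" for d in logical_dates]

print(len(set(run_ids)))  # 100 distinct run IDs
```

In a real DAG, such dates could be fed to the mapped operator, e.g. `TriggerDagRunOperator.partial(...).expand(execution_date=[...])` (parameter name as of Airflow 2.x; verify against your version's documentation).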
Hi @yenchenLiu, wondering if you were able to get TriggerDagRunOperator to run with dynamic task mapping while also passing different parameters that will be used in the child DAG that is executed.
One possible approach would be to use Dataset to trigger the DAG instead. This would allow the scheduler to decide what the logical date (and run ID) is, and deduplicate the date appropriately.
Apache Airflow version
2.5.0
What happened
I use TriggerDagRunOperator to generate a large number of DAG run instances (>= 100), and it sometimes triggers a bug where multiple runs are generated with the same run_id.
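A hedged sketch of why the collision can occur: Airflow builds a manual run's ID from its logical date, so two mapped tasks that fire in the same instant derive identical IDs. The function below is a simplified stand-in for that scheme, not the actual Airflow code:

```python
from datetime import datetime, timezone

def generate_run_id(logical_date: datetime) -> str:
    # Simplified stand-in for Airflow's manual run_id scheme:
    # the run type plus the logical date's ISO timestamp.
    return f"manual__{logical_date.isoformat()}"

# Two mapped trigger tasks firing at the same timestamp get the
# same logical date, hence the same run_id -> unique-constraint error.
ts = datetime(2023, 1, 11, 12, 0, 0, tzinfo=timezone.utc)
print(generate_run_id(ts) == generate_run_id(ts))  # True: a collision
```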
What you think should happen instead
I expect the run_id to always be unique when I use dynamic task mapping to generate instances.
How to reproduce
1. Create a DAG named trigger_dag.
2. Create test_dag, which runs TriggerDagRunOperator with a dynamic (mapped) task to trigger a large number of trigger_dag runs.
3. Observe that some of the triggered runs receive the same run_id.
Operating System
Ubuntu 20.04.4 LTS
Versions of Apache Airflow Providers
No response
Deployment
Virtualenv installation
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
Code of Conduct