http://github.com/apache/airflow/pull/50691/commits/b9f9aa5a8f935d08cd6a244a6b7879a84e900b5d

[v3-0-test] Fix Pydantic ``ForwardRef`` error by reordering discriminated union definitions (#50688) by github-actions[bot] · Pull Request #50691 · apache/airflow
Merged 1 commit on May 20, 2025
[v3-0-test] Fix Pydantic ForwardRef error by reordering discriminated union definitions (#50688)

In some rare cases, the DAG parse subprocess (`DagFileProcessorProcess`) could raise a `PydanticUserError` caused by unresolved `ForwardRef`s in the discriminated union `ToDagProcessor`.

This was caused by `TypeAdapter[ToDagProcessor]` being instantiated before `DagFileParseRequest` was defined. While this is not always reproducible, it can happen in forked subprocesses depending on import order.

This change moves the union definitions (`ToDagProcessor`, `ToManager`) after the relevant Pydantic models are declared, ensuring all references are fully resolved at definition time.
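The ordering rule behind the fix can be illustrated with a minimal, self-contained sketch. The model and union names below (`ParseRequest`, `ParseResult`, `ToProcessor`) are simplified stand-ins, not Airflow's actual classes:

```python
# Sketch of the ordering fix: a discriminated union passed to TypeAdapter
# should only reference models that are already defined. If the union is
# declared first (leaving string ForwardRefs unresolved), Pydantic can fail
# with a PydanticUserError when the schema is built.
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Field, TypeAdapter


class ParseRequest(BaseModel):
    file: str
    type: Literal["ParseRequest"] = "ParseRequest"


class ParseResult(BaseModel):
    fileloc: str
    type: Literal["ParseResult"] = "ParseResult"


# Defined *after* the models it references, so no ForwardRef is left
# unresolved when the TypeAdapter builds its core schema.
ToProcessor = Annotated[
    Union[ParseRequest, ParseResult],
    Field(discriminator="type"),
]

adapter = TypeAdapter(ToProcessor)
msg = adapter.validate_python({"file": "dags/example.py", "type": "ParseRequest"})
print(type(msg).__name__)  # ParseRequest
```

Because the `type` field acts as the discriminator, `validate_python` dispatches directly to the right model; the same pattern works in a forked subprocess regardless of import order once the union is declared last.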

closes #50530
(cherry picked from commit 672ec99)

Co-authored-by: Kaxil Naik <kaxilnaik@gmail.com>
kaxil committed May 19, 2025
commit b9f9aa5a8f935d08cd6a244a6b7879a84e900b5d
71 changes: 36 additions & 35 deletions airflow-core/src/airflow/dag_processing/processor.py
@@ -54,13 +54,47 @@
 from airflow.sdk.definitions.context import Context
 from airflow.typing_compat import Self


+class DagFileParseRequest(BaseModel):
+    """
+    Request for DAG File Parsing.
+
+    This is the request that the manager will send to the DAG parser with the dag file and
+    any other necessary metadata.
+    """
+
+    file: str
+
+    bundle_path: Path
+    """Passing bundle path around lets us figure out relative file path."""
+
+    requests_fd: int
+    callback_requests: list[CallbackRequest] = Field(default_factory=list)
+    type: Literal["DagFileParseRequest"] = "DagFileParseRequest"
+
+
+class DagFileParsingResult(BaseModel):
+    """
+    Result of DAG File Parsing.
+
+    This is the result of a successful DAG parse, in this class, we gather all serialized DAGs,
+    import errors and warnings to send back to the scheduler to store in the DB.
+    """
+
+    fileloc: str
+    serialized_dags: list[LazyDeserializedDAG]
+    warnings: list | None = None
+    import_errors: dict[str, str] | None = None
+    type: Literal["DagFileParsingResult"] = "DagFileParsingResult"
+
+
 ToManager = Annotated[
-    Union["DagFileParsingResult", GetConnection, GetVariable, PutVariable, DeleteVariable],
+    Union[DagFileParsingResult, GetConnection, GetVariable, PutVariable, DeleteVariable],
     Field(discriminator="type"),
 ]

 ToDagProcessor = Annotated[
-    Union["DagFileParseRequest", ConnectionResult, VariableResult, ErrorResponse, OKResponse],
+    Union[DagFileParseRequest, ConnectionResult, VariableResult, ErrorResponse, OKResponse],
     Field(discriminator="type"),
 ]

@@ -182,39 +216,6 @@ def _execute_dag_callbacks(dagbag: DagBag, request: DagCallbackRequest, log: Fil
         Stats.incr("dag.callback_exceptions", tags={"dag_id": request.dag_id})


-class DagFileParseRequest(BaseModel):
-    """
-    Request for DAG File Parsing.
-
-    This is the request that the manager will send to the DAG parser with the dag file and
-    any other necessary metadata.
-    """
-
-    file: str
-
-    bundle_path: Path
-    """Passing bundle path around lets us figure out relative file path."""
-
-    requests_fd: int
-    callback_requests: list[CallbackRequest] = Field(default_factory=list)
-    type: Literal["DagFileParseRequest"] = "DagFileParseRequest"
-
-
-class DagFileParsingResult(BaseModel):
-    """
-    Result of DAG File Parsing.
-
-    This is the result of a successful DAG parse, in this class, we gather all serialized DAGs,
-    import errors and warnings to send back to the scheduler to store in the DB.
-    """
-
-    fileloc: str
-    serialized_dags: list[LazyDeserializedDAG]
-    warnings: list | None = None
-    import_errors: dict[str, str] | None = None
-    type: Literal["DagFileParsingResult"] = "DagFileParsingResult"
-
-
 def in_process_api_server() -> InProcessExecutionAPI:
     from airflow.api_fastapi.execution_api.app import InProcessExecutionAPI