Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
# Migrate from Pydantic v1 to Pydantic v2 { #migrate-from-pydantic-v1-to-pydantic-v2 }

If you have an old FastAPI app, you might be using Pydantic version 1.

FastAPI has had support for either Pydantic v1 or v2 since version 0.100.0: if you had Pydantic v2 installed, it would use it; if instead you had Pydantic v1, it would use that.

Pydantic v1 is now deprecated and support for it will be removed in the next versions of FastAPI. You should **migrate to Pydantic v2** to get the latest features, improvements, and fixes.

/// warning

Also, the Pydantic team stopped supporting Pydantic v1 on the latest versions of Python, starting with **Python 3.14**.

If you want to use the latest features of Python, you will need to make sure you use Pydantic v2.

///

If you have an old FastAPI app with Pydantic v1, here I'll show you how to migrate it to Pydantic v2, and the **new features in FastAPI 0.119.0** that help with a gradual migration.

## Official Guide { #official-guide }

Pydantic has an official <a href="https://docs.pydantic.dev/latest/migration/" class="external-link" target="_blank">Migration Guide</a> from v1 to v2.

It also covers what has changed, how validations are now more correct and strict, possible caveats, etc.

You can read it to understand better what has changed.
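
As a small taste of the kind of changes the guide covers (a sketch assuming Pydantic v2 is installed), the v1 inner `class Config` and the `.dict()` method become `model_config` and `.model_dump()` in v2:

```python
from pydantic import BaseModel, ConfigDict


class Item(BaseModel):
    # Pydantic v1 used an inner `class Config`; v2 uses `model_config`
    # (v1's `anystr_strip_whitespace` is now `str_strip_whitespace`)
    model_config = ConfigDict(str_strip_whitespace=True)

    name: str
    size: float


item = Item(name="  Plumbus ", size=3.2)
# v1's `item.dict()` is now `item.model_dump()`
print(item.model_dump())  # {'name': 'Plumbus', 'size': 3.2}
```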

## Tests { #tests }

Make sure you have [tests](../tutorial/testing.md){.internal-link target=_blank} for your app and that you run them on continuous integration (CI).

This way, you can do the upgrade and make sure everything is still working as expected.

## `bump-pydantic` { #bump-pydantic }

In many cases, when you use regular Pydantic models without customizations, you will be able to automate most of the migration from Pydantic v1 to Pydantic v2.

You can use <a href="https://github.com/pydantic/bump-pydantic" class="external-link" target="_blank">`bump-pydantic`</a>, from the same Pydantic team.

This tool will automatically change most of the code that needs to be changed.

After this, you can run the tests and check if everything works. If it does, you are done. 😎
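
For example, following the command-line usage from the `bump-pydantic` README (here `my_app` is a placeholder for your own package directory):

```console
$ pip install bump-pydantic

$ bump-pydantic my_app
```

Then review the changes it made (e.g. with `git diff`) and run your tests.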

## Pydantic v1 in v2 { #pydantic-v1-in-v2 }

Pydantic v2 includes everything from Pydantic v1 as a submodule, `pydantic.v1`.

This means that you can install the latest version of Pydantic v2 and import and use the old Pydantic v1 components from this submodule, as if you had the old Pydantic v1 installed.

{* ../../docs_src/pydantic_v1_in_v2/tutorial001_an_py310.py hl[1,4] *}

### FastAPI support for Pydantic v1 in v2 { #fastapi-support-for-pydantic-v1-in-v2 }

Since FastAPI 0.119.0, there's also partial support for Pydantic v1 from inside of Pydantic v2, to facilitate the migration to v2.

So, you could upgrade Pydantic to the latest version 2 and change the imports to use the `pydantic.v1` submodule, and in many cases it would just work.

{* ../../docs_src/pydantic_v1_in_v2/tutorial002_an_py310.py hl[2,5,15] *}

/// warning

Keep in mind that, as the Pydantic team no longer supports Pydantic v1 on recent versions of Python, using `pydantic.v1` is also not supported on Python 3.14 and above.

///

### Pydantic v1 and v2 on the same app { #pydantic-v1-and-v2-on-the-same-app }

Pydantic does **not support** having a Pydantic v2 model with its own fields defined as Pydantic v1 models, or vice versa.

```mermaid
graph TB
    subgraph "❌ Not Supported"
        direction TB
        subgraph V2["Pydantic v2 Model"]
            V1Field["Pydantic v1 Model"]
        end
        subgraph V1["Pydantic v1 Model"]
            V2Field["Pydantic v2 Model"]
        end
    end

    style V2 fill:#f9fff3
    style V1 fill:#fff6f0
    style V1Field fill:#fff6f0
    style V2Field fill:#f9fff3
```

...but you can have separate models using Pydantic v1 and v2 in the same app.

```mermaid
graph TB
    subgraph "✅ Supported"
        direction TB
        subgraph V2["Pydantic v2 Model"]
            V2Field["Pydantic v2 Model"]
        end
        subgraph V1["Pydantic v1 Model"]
            V1Field["Pydantic v1 Model"]
        end
    end

    style V2 fill:#f9fff3
    style V1 fill:#fff6f0
    style V1Field fill:#fff6f0
    style V2Field fill:#f9fff3
```

In some cases, it's even possible to have both Pydantic v1 and v2 models in the same **path operation** in your FastAPI app:

{* ../../docs_src/pydantic_v1_in_v2/tutorial003_an_py310.py hl[2:3,6,12,21:22] *}

In the example above, the input model is a Pydantic v1 model, and the output model (defined in `response_model=ItemV2`) is a Pydantic v2 model.

### Pydantic v1 parameters { #pydantic-v1-parameters }

If you need to use some of the FastAPI-specific tools for parameters like `Body`, `Query`, `Form`, etc. with Pydantic v1 models, you can import them from `fastapi.temp_pydantic_v1_params` while you finish the migration to Pydantic v2:

{* ../../docs_src/pydantic_v1_in_v2/tutorial004_an_py310.py hl[4,18] *}

### Migrate in steps { #migrate-in-steps }

/// tip

First try `bump-pydantic`: if it works and your tests pass, then you're done in one command. ✨

///

If `bump-pydantic` doesn't work for your use case, you can use the support for both Pydantic v1 and v2 models in the same app to migrate to Pydantic v2 gradually.

You could first upgrade Pydantic to the latest version 2 and change the imports to use `pydantic.v1` for all your models.

Then, you can start migrating your models from Pydantic v1 to v2 in groups, in gradual steps. 🚶
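
When you migrate each group, the change per model is often just the import, plus any renamed APIs, for example `@validator` becomes `@field_validator` (a sketch assuming Pydantic v2 is installed; the model and validator are illustrative):

```python
# Before, while still on the compatibility submodule:
# from pydantic.v1 import BaseModel, validator

# After migrating this model to Pydantic v2:
from pydantic import BaseModel, field_validator


class Item(BaseModel):
    name: str
    size: float

    # v1's `@validator("name")` becomes `@field_validator("name")`
    @field_validator("name")
    @classmethod
    def name_must_not_be_empty(cls, value: str) -> str:
        if not value.strip():
            raise ValueError("name must not be empty")
        return value


print(Item(name="Plumbus", size=3.2).name)  # Plumbus
```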

```python
from typing import Union

from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    size: float
```

```python
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: str | None = None
    size: float
```

```python
from typing import Union

from fastapi import FastAPI
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    size: float


app = FastAPI()


@app.post("/items/")
async def create_item(item: Item) -> Item:
    return item
```

```python
from fastapi import FastAPI
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: str | None = None
    size: float


app = FastAPI()


@app.post("/items/")
async def create_item(item: Item) -> Item:
    return item
```

```python
from typing import Union

from fastapi import FastAPI
from pydantic import BaseModel as BaseModelV2
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    size: float


class ItemV2(BaseModelV2):
    name: str
    description: Union[str, None] = None
    size: float


app = FastAPI()


@app.post("/items/", response_model=ItemV2)
async def create_item(item: Item):
    return item
```

```python
from fastapi import FastAPI
from pydantic import BaseModel as BaseModelV2
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: str | None = None
    size: float


class ItemV2(BaseModelV2):
    name: str
    description: str | None = None
    size: float


app = FastAPI()


@app.post("/items/", response_model=ItemV2)
async def create_item(item: Item):
    return item
```

```python
from typing import Union

from fastapi import FastAPI
from fastapi.temp_pydantic_v1_params import Body
from pydantic.v1 import BaseModel
from typing_extensions import Annotated


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    size: float


app = FastAPI()


@app.post("/items/")
async def create_item(item: Annotated[Item, Body(embed=True)]) -> Item:
    return item
```

```python
from typing import Annotated

from fastapi import FastAPI
from fastapi.temp_pydantic_v1_params import Body
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: str | None = None
    size: float


app = FastAPI()


@app.post("/items/")
async def create_item(item: Annotated[Item, Body(embed=True)]) -> Item:
    return item
```

```python
from typing import Annotated, Union

from fastapi import FastAPI
from fastapi.temp_pydantic_v1_params import Body
from pydantic.v1 import BaseModel


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    size: float


app = FastAPI()


@app.post("/items/")
async def create_item(item: Annotated[Item, Body(embed=True)]) -> Item:
    return item
```

(Also removed in this diff: FastAPI's internal 680-line Pydantic compatibility module, which detected the installed Pydantic major version via `PYDANTIC_V2` and provided matching v1 and v2 implementations of internal helpers such as `ModelField`, `get_definitions`, and `create_body_model`.)
|||
for arg in get_args(annotation): |
|||
if is_uploadfile_sequence_annotation(arg): |
|||
at_least_one = True |
|||
continue |
|||
return at_least_one |
|||
return field_annotation_is_sequence(annotation) and all( |
|||
is_uploadfile_or_nonable_uploadfile_annotation(sub_annotation) |
|||
for sub_annotation in get_args(annotation) |
|||
) |
|||
|
|||
|
|||
@lru_cache |
|||
def get_cached_model_fields(model: Type[BaseModel]) -> List[ModelField]: |
|||
return get_model_fields(model) |
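The annotation helpers above share one idea: recurse into `Union` members, and otherwise check both the annotation itself and its generic origin (e.g. `list` for `List[int]`). A minimal standalone sketch of that idea, using only the standard library (`annotation_is_seq` and `_is_seq` are simplified stand-ins for illustration, not the FastAPI helpers themselves):

```python
from typing import List, Optional, Union, get_args, get_origin

# Simplified stand-in for sequence_types (no Deque/typing aliases here).
SEQUENCE_TYPES = (list, tuple, set, frozenset)


def _is_seq(annotation) -> bool:
    # str/bytes are technically sequences but are treated as scalars.
    if isinstance(annotation, type) and issubclass(annotation, (str, bytes)):
        return False
    return isinstance(annotation, type) and issubclass(annotation, SEQUENCE_TYPES)


def annotation_is_seq(annotation) -> bool:
    # A Union is a sequence annotation if any member is; otherwise check
    # the annotation itself and its generic origin (list for List[int]).
    if get_origin(annotation) is Union:
        return any(annotation_is_seq(arg) for arg in get_args(annotation))
    return _is_seq(annotation) or _is_seq(get_origin(annotation))


assert annotation_is_seq(List[int])            # sequence via generic origin
assert annotation_is_seq(Optional[List[int]])  # sequence found inside a Union
assert not annotation_is_seq(str)              # str is explicitly excluded
```

The same recursion pattern repeats in `is_bytes_sequence_annotation` and `is_uploadfile_sequence_annotation` above, just with a different per-element predicate.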
@@ -0,0 +1,50 @@
from .main import BaseConfig as BaseConfig
from .main import PydanticSchemaGenerationError as PydanticSchemaGenerationError
from .main import RequiredParam as RequiredParam
from .main import Undefined as Undefined
from .main import UndefinedType as UndefinedType
from .main import Url as Url
from .main import Validator as Validator
from .main import _get_model_config as _get_model_config
from .main import _is_error_wrapper as _is_error_wrapper
from .main import _is_model_class as _is_model_class
from .main import _is_model_field as _is_model_field
from .main import _is_undefined as _is_undefined
from .main import _model_dump as _model_dump
from .main import _model_rebuild as _model_rebuild
from .main import copy_field_info as copy_field_info
from .main import create_body_model as create_body_model
from .main import evaluate_forwardref as evaluate_forwardref
from .main import get_annotation_from_field_info as get_annotation_from_field_info
from .main import get_cached_model_fields as get_cached_model_fields
from .main import get_compat_model_name_map as get_compat_model_name_map
from .main import get_definitions as get_definitions
from .main import get_missing_field_error as get_missing_field_error
from .main import get_schema_from_model_field as get_schema_from_model_field
from .main import is_bytes_field as is_bytes_field
from .main import is_bytes_sequence_field as is_bytes_sequence_field
from .main import is_scalar_field as is_scalar_field
from .main import is_scalar_sequence_field as is_scalar_sequence_field
from .main import is_sequence_field as is_sequence_field
from .main import serialize_sequence_value as serialize_sequence_value
from .main import (
    with_info_plain_validator_function as with_info_plain_validator_function,
)
from .model_field import ModelField as ModelField
from .shared import PYDANTIC_V2 as PYDANTIC_V2
from .shared import PYDANTIC_VERSION_MINOR_TUPLE as PYDANTIC_VERSION_MINOR_TUPLE
from .shared import annotation_is_pydantic_v1 as annotation_is_pydantic_v1
from .shared import field_annotation_is_scalar as field_annotation_is_scalar
from .shared import (
    is_uploadfile_or_nonable_uploadfile_annotation as is_uploadfile_or_nonable_uploadfile_annotation,
)
from .shared import (
    is_uploadfile_sequence_annotation as is_uploadfile_sequence_annotation,
)
from .shared import lenient_issubclass as lenient_issubclass
from .shared import sequence_types as sequence_types
from .shared import value_is_sequence as value_is_sequence
from .v1 import CoreSchema as CoreSchema
from .v1 import GetJsonSchemaHandler as GetJsonSchemaHandler
from .v1 import JsonSchemaValue as JsonSchemaValue
from .v1 import _normalize_errors as _normalize_errors
@@ -0,0 +1,305 @@
from functools import lru_cache
from typing import (
    Any,
    Dict,
    List,
    Sequence,
    Tuple,
    Type,
)

from fastapi._compat import v1
from fastapi._compat.shared import PYDANTIC_V2, lenient_issubclass
from fastapi.types import ModelNameMap
from pydantic import BaseModel
from typing_extensions import Literal

from .model_field import ModelField

if PYDANTIC_V2:
    from .v2 import BaseConfig as BaseConfig
    from .v2 import FieldInfo as FieldInfo
    from .v2 import PydanticSchemaGenerationError as PydanticSchemaGenerationError
    from .v2 import RequiredParam as RequiredParam
    from .v2 import Undefined as Undefined
    from .v2 import UndefinedType as UndefinedType
    from .v2 import Url as Url
    from .v2 import Validator as Validator
    from .v2 import evaluate_forwardref as evaluate_forwardref
    from .v2 import get_missing_field_error as get_missing_field_error
    from .v2 import (
        with_info_plain_validator_function as with_info_plain_validator_function,
    )
else:
    from .v1 import BaseConfig as BaseConfig  # type: ignore[assignment]
    from .v1 import FieldInfo as FieldInfo
    from .v1 import (  # type: ignore[assignment]
        PydanticSchemaGenerationError as PydanticSchemaGenerationError,
    )
    from .v1 import RequiredParam as RequiredParam
    from .v1 import Undefined as Undefined
    from .v1 import UndefinedType as UndefinedType
    from .v1 import Url as Url  # type: ignore[assignment]
    from .v1 import Validator as Validator
    from .v1 import evaluate_forwardref as evaluate_forwardref
    from .v1 import get_missing_field_error as get_missing_field_error
    from .v1 import (  # type: ignore[assignment]
        with_info_plain_validator_function as with_info_plain_validator_function,
    )


@lru_cache
def get_cached_model_fields(model: Type[BaseModel]) -> List[ModelField]:
    if lenient_issubclass(model, v1.BaseModel):
        return v1.get_model_fields(model)
    else:
        from . import v2

        return v2.get_model_fields(model)  # type: ignore[return-value]


def _is_undefined(value: object) -> bool:
    if isinstance(value, v1.UndefinedType):
        return True
    elif PYDANTIC_V2:
        from . import v2

        return isinstance(value, v2.UndefinedType)
    return False


def _get_model_config(model: BaseModel) -> Any:
    if isinstance(model, v1.BaseModel):
        return v1._get_model_config(model)
    elif PYDANTIC_V2:
        from . import v2

        return v2._get_model_config(model)


def _model_dump(
    model: BaseModel, mode: Literal["json", "python"] = "json", **kwargs: Any
) -> Any:
    if isinstance(model, v1.BaseModel):
        return v1._model_dump(model, mode=mode, **kwargs)
    elif PYDANTIC_V2:
        from . import v2

        return v2._model_dump(model, mode=mode, **kwargs)


def _is_error_wrapper(exc: Exception) -> bool:
    if isinstance(exc, v1.ErrorWrapper):
        return True
    elif PYDANTIC_V2:
        from . import v2

        return isinstance(exc, v2.ErrorWrapper)
    return False


def copy_field_info(*, field_info: FieldInfo, annotation: Any) -> FieldInfo:
    if isinstance(field_info, v1.FieldInfo):
        return v1.copy_field_info(field_info=field_info, annotation=annotation)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.copy_field_info(field_info=field_info, annotation=annotation)


def create_body_model(
    *, fields: Sequence[ModelField], model_name: str
) -> Type[BaseModel]:
    if fields and isinstance(fields[0], v1.ModelField):
        return v1.create_body_model(fields=fields, model_name=model_name)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.create_body_model(fields=fields, model_name=model_name)  # type: ignore[arg-type]


def get_annotation_from_field_info(
    annotation: Any, field_info: FieldInfo, field_name: str
) -> Any:
    if isinstance(field_info, v1.FieldInfo):
        return v1.get_annotation_from_field_info(
            annotation=annotation, field_info=field_info, field_name=field_name
        )
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.get_annotation_from_field_info(
            annotation=annotation, field_info=field_info, field_name=field_name
        )


def is_bytes_field(field: ModelField) -> bool:
    if isinstance(field, v1.ModelField):
        return v1.is_bytes_field(field)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.is_bytes_field(field)  # type: ignore[arg-type]


def is_bytes_sequence_field(field: ModelField) -> bool:
    if isinstance(field, v1.ModelField):
        return v1.is_bytes_sequence_field(field)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.is_bytes_sequence_field(field)  # type: ignore[arg-type]


def is_scalar_field(field: ModelField) -> bool:
    if isinstance(field, v1.ModelField):
        return v1.is_scalar_field(field)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.is_scalar_field(field)  # type: ignore[arg-type]


def is_scalar_sequence_field(field: ModelField) -> bool:
    if isinstance(field, v1.ModelField):
        return v1.is_scalar_sequence_field(field)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.is_scalar_sequence_field(field)  # type: ignore[arg-type]


def is_sequence_field(field: ModelField) -> bool:
    if isinstance(field, v1.ModelField):
        return v1.is_sequence_field(field)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.is_sequence_field(field)  # type: ignore[arg-type]


def serialize_sequence_value(*, field: ModelField, value: Any) -> Sequence[Any]:
    if isinstance(field, v1.ModelField):
        return v1.serialize_sequence_value(field=field, value=value)
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.serialize_sequence_value(field=field, value=value)  # type: ignore[arg-type]


def _model_rebuild(model: Type[BaseModel]) -> None:
    if lenient_issubclass(model, v1.BaseModel):
        v1._model_rebuild(model)
    elif PYDANTIC_V2:
        from . import v2

        v2._model_rebuild(model)


def get_compat_model_name_map(fields: List[ModelField]) -> ModelNameMap:
    v1_model_fields = [field for field in fields if isinstance(field, v1.ModelField)]
    v1_flat_models = v1.get_flat_models_from_fields(v1_model_fields, known_models=set())  # type: ignore[attr-defined]
    all_flat_models = v1_flat_models
    if PYDANTIC_V2:
        from . import v2

        v2_model_fields = [
            field for field in fields if isinstance(field, v2.ModelField)
        ]
        v2_flat_models = v2.get_flat_models_from_fields(
            v2_model_fields, known_models=set()
        )
        all_flat_models = all_flat_models.union(v2_flat_models)

        model_name_map = v2.get_model_name_map(all_flat_models)
        return model_name_map
    model_name_map = v1.get_model_name_map(all_flat_models)
    return model_name_map


def get_definitions(
    *,
    fields: List[ModelField],
    model_name_map: ModelNameMap,
    separate_input_output_schemas: bool = True,
) -> Tuple[
    Dict[Tuple[ModelField, Literal["validation", "serialization"]], v1.JsonSchemaValue],
    Dict[str, Dict[str, Any]],
]:
    v1_fields = [field for field in fields if isinstance(field, v1.ModelField)]
    v1_field_maps, v1_definitions = v1.get_definitions(
        fields=v1_fields,
        model_name_map=model_name_map,
        separate_input_output_schemas=separate_input_output_schemas,
    )
    if not PYDANTIC_V2:
        return v1_field_maps, v1_definitions
    else:
        from . import v2

        v2_fields = [field for field in fields if isinstance(field, v2.ModelField)]
        v2_field_maps, v2_definitions = v2.get_definitions(
            fields=v2_fields,
            model_name_map=model_name_map,
            separate_input_output_schemas=separate_input_output_schemas,
        )
        all_definitions = {**v1_definitions, **v2_definitions}
        all_field_maps = {**v1_field_maps, **v2_field_maps}
        return all_field_maps, all_definitions


def get_schema_from_model_field(
    *,
    field: ModelField,
    model_name_map: ModelNameMap,
    field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], v1.JsonSchemaValue
    ],
    separate_input_output_schemas: bool = True,
) -> Dict[str, Any]:
    if isinstance(field, v1.ModelField):
        return v1.get_schema_from_model_field(
            field=field,
            model_name_map=model_name_map,
            field_mapping=field_mapping,
            separate_input_output_schemas=separate_input_output_schemas,
        )
    else:
        assert PYDANTIC_V2
        from . import v2

        return v2.get_schema_from_model_field(
            field=field,  # type: ignore[arg-type]
            model_name_map=model_name_map,
            field_mapping=field_mapping,  # type: ignore[arg-type]
            separate_input_output_schemas=separate_input_output_schemas,
        )


def _is_model_field(value: Any) -> bool:
    if isinstance(value, v1.ModelField):
        return True
    elif PYDANTIC_V2:
        from . import v2

        return isinstance(value, v2.ModelField)
    return False


def _is_model_class(value: Any) -> bool:
    if lenient_issubclass(value, v1.BaseModel):
        return True
    elif PYDANTIC_V2:
        from . import v2

        return lenient_issubclass(value, v2.BaseModel)  # type: ignore[attr-defined]
    return False
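Nearly every wrapper in the file above follows the same shape: check for the Pydantic v1 type first (via `isinstance` or `lenient_issubclass`), and otherwise fall through to the lazily imported v2 implementation, so both model generations can coexist in one app during a gradual migration. A toy standalone sketch of that dispatch shape (`V1Field` and `V2Field` are hypothetical stand-ins, not FastAPI or Pydantic classes):

```python
class V1Field:
    """Stand-in for a Pydantic-v1-style field object."""


class V2Field:
    """Stand-in for a Pydantic-v2-style field object."""


def describe_field(field: object) -> str:
    # Same shape as is_scalar_field() above: test for the v1 type first,
    # everything else falls through to the v2 code path.
    if isinstance(field, V1Field):
        return "handled by v1 compat layer"
    return "handled by v2 compat layer"


assert describe_field(V1Field()) == "handled by v1 compat layer"
assert describe_field(V2Field()) == "handled by v2 compat layer"
```

The v2 imports are deferred into the function bodies (`from . import v2`) so that the v2 module is only loaded when a v2 object actually shows up.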
@@ -0,0 +1,53 @@
from typing import (
    Any,
    Dict,
    List,
    Tuple,
    Union,
)

from fastapi.types import IncEx
from pydantic.fields import FieldInfo
from typing_extensions import Literal, Protocol


class ModelField(Protocol):
    field_info: "FieldInfo"
    name: str
    mode: Literal["validation", "serialization"] = "validation"
    _version: Literal["v1", "v2"] = "v1"

    @property
    def alias(self) -> str: ...

    @property
    def required(self) -> bool: ...

    @property
    def default(self) -> Any: ...

    @property
    def type_(self) -> Any: ...

    def get_default(self) -> Any: ...

    def validate(
        self,
        value: Any,
        values: Dict[str, Any] = {},  # noqa: B006
        *,
        loc: Tuple[Union[int, str], ...] = (),
    ) -> Tuple[Any, Union[List[Dict[str, Any]], None]]: ...

    def serialize(
        self,
        value: Any,
        *,
        mode: Literal["json", "python"] = "json",
        include: Union[IncEx, None] = None,
        exclude: Union[IncEx, None] = None,
        by_alias: bool = True,
        exclude_unset: bool = False,
        exclude_defaults: bool = False,
        exclude_none: bool = False,
    ) -> Any: ...
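`ModelField` above is a `typing.Protocol`: the v1 and v2 field implementations satisfy it structurally, with no shared base class. A minimal sketch of how structural typing like this works (`HasAlias` and `FieldStub` are hypothetical illustrations; note the real `ModelField` is not `runtime_checkable`, it is used for static checks only):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class HasAlias(Protocol):
    name: str

    @property
    def alias(self) -> str: ...


class FieldStub:
    # No inheritance from HasAlias: matching the members is enough.
    def __init__(self, name: str) -> None:
        self.name = name

    @property
    def alias(self) -> str:
        return self.name.replace("_", "-")


field: HasAlias = FieldStub("user_id")   # accepted structurally by type checkers
assert isinstance(field, HasAlias)       # runtime check via @runtime_checkable
assert field.alias == "user-id"
```

This is why `main.py` can pass either a v1 or a v2 field wherever a `ModelField` is expected.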
@@ -0,0 +1,209 @@
import sys
import types
import typing
from collections import deque
from dataclasses import is_dataclass
from typing import (
    Any,
    Deque,
    FrozenSet,
    List,
    Mapping,
    Sequence,
    Set,
    Tuple,
    Type,
    Union,
)

from fastapi._compat import v1
from fastapi.types import UnionType
from pydantic import BaseModel
from pydantic.version import VERSION as PYDANTIC_VERSION
from starlette.datastructures import UploadFile
from typing_extensions import Annotated, get_args, get_origin

# Copy from Pydantic v2, compatible with v1
if sys.version_info < (3, 9):
    # Pydantic no longer supports Python 3.8, this might be incorrect, but the code
    # this is used for is also never reached in this codebase, as it's a copy of
    # Pydantic's lenient_issubclass, just for compatibility with v1
    # TODO: remove when dropping support for Python 3.8
    WithArgsTypes: Tuple[Any, ...] = ()
elif sys.version_info < (3, 10):
    WithArgsTypes: tuple[Any, ...] = (typing._GenericAlias, types.GenericAlias)  # type: ignore[attr-defined]
else:
    WithArgsTypes: tuple[Any, ...] = (
        typing._GenericAlias,  # type: ignore[attr-defined]
        types.GenericAlias,
        types.UnionType,
    )  # pyright: ignore[reportAttributeAccessIssue]

PYDANTIC_VERSION_MINOR_TUPLE = tuple(int(x) for x in PYDANTIC_VERSION.split(".")[:2])
PYDANTIC_V2 = PYDANTIC_VERSION_MINOR_TUPLE[0] == 2


sequence_annotation_to_type = {
    Sequence: list,
    List: list,
    list: list,
    Tuple: tuple,
    tuple: tuple,
    Set: set,
    set: set,
    FrozenSet: frozenset,
    frozenset: frozenset,
    Deque: deque,
    deque: deque,
}

sequence_types = tuple(sequence_annotation_to_type.keys())

Url: Type[Any]


# Copy of Pydantic v2, compatible with v1
def lenient_issubclass(
    cls: Any, class_or_tuple: Union[Type[Any], Tuple[Type[Any], ...], None]
) -> bool:
    try:
        return isinstance(cls, type) and issubclass(cls, class_or_tuple)  # type: ignore[arg-type]
    except TypeError:  # pragma: no cover
        if isinstance(cls, WithArgsTypes):
            return False
        raise  # pragma: no cover


def _annotation_is_sequence(annotation: Union[Type[Any], None]) -> bool:
    if lenient_issubclass(annotation, (str, bytes)):
        return False
    return lenient_issubclass(annotation, sequence_types)  # type: ignore[arg-type]


def field_annotation_is_sequence(annotation: Union[Type[Any], None]) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if field_annotation_is_sequence(arg):
                return True
        return False
    return _annotation_is_sequence(annotation) or _annotation_is_sequence(
        get_origin(annotation)
    )


def value_is_sequence(value: Any) -> bool:
    return isinstance(value, sequence_types) and not isinstance(value, (str, bytes))  # type: ignore[arg-type]


def _annotation_is_complex(annotation: Union[Type[Any], None]) -> bool:
    return (
        lenient_issubclass(annotation, (BaseModel, v1.BaseModel, Mapping, UploadFile))
        or _annotation_is_sequence(annotation)
        or is_dataclass(annotation)
    )


def field_annotation_is_complex(annotation: Union[Type[Any], None]) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        return any(field_annotation_is_complex(arg) for arg in get_args(annotation))

    if origin is Annotated:
        return field_annotation_is_complex(get_args(annotation)[0])

    return (
        _annotation_is_complex(annotation)
        or _annotation_is_complex(origin)
        or hasattr(origin, "__pydantic_core_schema__")
        or hasattr(origin, "__get_pydantic_core_schema__")
    )


def field_annotation_is_scalar(annotation: Any) -> bool:
    # handle Ellipsis here to make tuple[int, ...] work nicely
    return annotation is Ellipsis or not field_annotation_is_complex(annotation)


def field_annotation_is_scalar_sequence(annotation: Union[Type[Any], None]) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one_scalar_sequence = False
        for arg in get_args(annotation):
            if field_annotation_is_scalar_sequence(arg):
                at_least_one_scalar_sequence = True
                continue
            elif not field_annotation_is_scalar(arg):
                return False
        return at_least_one_scalar_sequence
    return field_annotation_is_sequence(annotation) and all(
        field_annotation_is_scalar(sub_annotation)
        for sub_annotation in get_args(annotation)
    )


def is_bytes_or_nonable_bytes_annotation(annotation: Any) -> bool:
    if lenient_issubclass(annotation, bytes):
        return True
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, bytes):
                return True
    return False


def is_uploadfile_or_nonable_uploadfile_annotation(annotation: Any) -> bool:
    if lenient_issubclass(annotation, UploadFile):
        return True
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, UploadFile):
                return True
    return False


def is_bytes_sequence_annotation(annotation: Any) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one = False
        for arg in get_args(annotation):
            if is_bytes_sequence_annotation(arg):
                at_least_one = True
                continue
        return at_least_one
    return field_annotation_is_sequence(annotation) and all(
        is_bytes_or_nonable_bytes_annotation(sub_annotation)
        for sub_annotation in get_args(annotation)
    )


def is_uploadfile_sequence_annotation(annotation: Any) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one = False
        for arg in get_args(annotation):
            if is_uploadfile_sequence_annotation(arg):
                at_least_one = True
                continue
        return at_least_one
    return field_annotation_is_sequence(annotation) and all(
        is_uploadfile_or_nonable_uploadfile_annotation(sub_annotation)
        for sub_annotation in get_args(annotation)
    )


def annotation_is_pydantic_v1(annotation: Any) -> bool:
    if lenient_issubclass(annotation, v1.BaseModel):
        return True
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, v1.BaseModel):
                return True
    if field_annotation_is_sequence(annotation):
        for sub_annotation in get_args(annotation):
            if annotation_is_pydantic_v1(sub_annotation):
                return True
    return False
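The `lenient_issubclass` copy above exists because plain `issubclass()` raises `TypeError` when handed something that is not a class, such as a generic alias like `List[int]`, which annotations frequently are. A standalone, simplified sketch of the behavior (the real helper additionally re-raises for arguments that are neither classes nor known generic-alias types):

```python
from typing import List


def lenient_issubclass_sketch(cls, class_or_tuple) -> bool:
    # Non-classes (None, instances, generic aliases) short-circuit to False
    # instead of letting issubclass() raise TypeError.
    try:
        return isinstance(cls, type) and issubclass(cls, class_or_tuple)
    except TypeError:
        return False


assert lenient_issubclass_sketch(bool, int)            # normal subclass check
assert not lenient_issubclass_sketch(List[int], list)  # generic alias: no crash
assert not lenient_issubclass_sketch(None, list)       # non-class tolerated
```

Every annotation predicate in this module routes through this helper, so they can be called on arbitrary annotations without pre-filtering.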
@@ -0,0 +1,334 @@
|||
from copy import copy |
|||
from dataclasses import dataclass, is_dataclass |
|||
from enum import Enum |
|||
from typing import ( |
|||
Any, |
|||
Callable, |
|||
Dict, |
|||
List, |
|||
Sequence, |
|||
Set, |
|||
Tuple, |
|||
Type, |
|||
Union, |
|||
) |
|||
|
|||
from fastapi._compat import shared |
|||
from fastapi.openapi.constants import REF_PREFIX as REF_PREFIX |
|||
from fastapi.types import ModelNameMap |
|||
from pydantic.version import VERSION as PYDANTIC_VERSION |
|||
from typing_extensions import Literal |
|||
|
|||
PYDANTIC_VERSION_MINOR_TUPLE = tuple(int(x) for x in PYDANTIC_VERSION.split(".")[:2]) |
|||
PYDANTIC_V2 = PYDANTIC_VERSION_MINOR_TUPLE[0] == 2 |
|||
# Keeping old "Required" functionality from Pydantic V1, without |
|||
# shadowing typing.Required. |
|||
RequiredParam: Any = Ellipsis |
|||
|
|||
if not PYDANTIC_V2: |
|||
from pydantic import BaseConfig as BaseConfig |
|||
from pydantic import BaseModel as BaseModel |
|||
from pydantic import ValidationError as ValidationError |
|||
from pydantic import create_model as create_model |
|||
from pydantic.class_validators import Validator as Validator |
|||
from pydantic.color import Color as Color |
|||
from pydantic.error_wrappers import ErrorWrapper as ErrorWrapper |
|||
from pydantic.errors import MissingError |
|||
from pydantic.fields import ( # type: ignore[attr-defined] |
|||
SHAPE_FROZENSET, |
|||
SHAPE_LIST, |
|||
SHAPE_SEQUENCE, |
|||
SHAPE_SET, |
|||
SHAPE_SINGLETON, |
|||
SHAPE_TUPLE, |
|||
SHAPE_TUPLE_ELLIPSIS, |
|||
) |
|||
from pydantic.fields import FieldInfo as FieldInfo |
|||
from pydantic.fields import ModelField as ModelField # type: ignore[attr-defined] |
|||
from pydantic.fields import Undefined as Undefined # type: ignore[attr-defined] |
|||
from pydantic.fields import ( # type: ignore[attr-defined] |
|||
UndefinedType as UndefinedType, |
|||
) |
|||
from pydantic.networks import AnyUrl as AnyUrl |
|||
from pydantic.networks import NameEmail as NameEmail |
|||
from pydantic.schema import TypeModelSet as TypeModelSet |
|||
from pydantic.schema import ( |
|||
field_schema, |
|||
get_flat_models_from_fields, |
|||
model_process_schema, |
|||
) |
|||
from pydantic.schema import ( |
|||
get_annotation_from_field_info as get_annotation_from_field_info, |
|||
) |
|||
from pydantic.schema import get_flat_models_from_field as get_flat_models_from_field |
|||
from pydantic.schema import get_model_name_map as get_model_name_map |
|||
from pydantic.types import SecretBytes as SecretBytes |
|||
from pydantic.types import SecretStr as SecretStr |
|||
from pydantic.typing import evaluate_forwardref as evaluate_forwardref |
|||
from pydantic.utils import lenient_issubclass as lenient_issubclass |
|||
|
|||
|
|||
else: |
|||
from pydantic.v1 import BaseConfig as BaseConfig # type: ignore[assignment] |
|||
from pydantic.v1 import BaseModel as BaseModel # type: ignore[assignment] |
|||
from pydantic.v1 import ( # type: ignore[assignment] |
|||
ValidationError as ValidationError, |
|||
) |
|||
from pydantic.v1 import create_model as create_model # type: ignore[no-redef] |
|||
from pydantic.v1.class_validators import Validator as Validator |
|||
from pydantic.v1.color import Color as Color # type: ignore[assignment] |
|||
from pydantic.v1.error_wrappers import ErrorWrapper as ErrorWrapper |
|||
from pydantic.v1.errors import MissingError |
|||
from pydantic.v1.fields import ( |
|||
SHAPE_FROZENSET, |
|||
SHAPE_LIST, |
|||
SHAPE_SEQUENCE, |
|||
SHAPE_SET, |
|||
SHAPE_SINGLETON, |
|||
SHAPE_TUPLE, |
|||
SHAPE_TUPLE_ELLIPSIS, |
|||
) |
|||
from pydantic.v1.fields import FieldInfo as FieldInfo # type: ignore[assignment] |
|||
from pydantic.v1.fields import ModelField as ModelField |
|||
from pydantic.v1.fields import Undefined as Undefined |
|||
from pydantic.v1.fields import UndefinedType as UndefinedType |
|||
from pydantic.v1.networks import AnyUrl as AnyUrl |
|||
from pydantic.v1.networks import ( # type: ignore[assignment] |
|||
NameEmail as NameEmail, |
|||
) |
|||
from pydantic.v1.schema import TypeModelSet as TypeModelSet |
|||
from pydantic.v1.schema import ( |
|||
field_schema, |
|||
get_flat_models_from_fields, |
|||
model_process_schema, |
|||
) |
|||
from pydantic.v1.schema import ( |
|||
get_annotation_from_field_info as get_annotation_from_field_info, |
|||
) |
|||
from pydantic.v1.schema import ( |
|||
get_flat_models_from_field as get_flat_models_from_field, |
|||
) |
|||
from pydantic.v1.schema import get_model_name_map as get_model_name_map |
|||
from pydantic.v1.types import ( # type: ignore[assignment] |
|||
SecretBytes as SecretBytes, |
|||
) |
|||
from pydantic.v1.types import ( # type: ignore[assignment] |
|||
SecretStr as SecretStr, |
|||
) |
|||
from pydantic.v1.typing import evaluate_forwardref as evaluate_forwardref |
|||
from pydantic.v1.utils import lenient_issubclass as lenient_issubclass |
|||
|
|||
|
|||
GetJsonSchemaHandler = Any |
|||
JsonSchemaValue = Dict[str, Any] |
|||
CoreSchema = Any |
|||
Url = AnyUrl |
|||
|
|||
sequence_shapes = { |
|||
SHAPE_LIST, |
|||
SHAPE_SET, |
|||
SHAPE_FROZENSET, |
|||
SHAPE_TUPLE, |
|||
SHAPE_SEQUENCE, |
|||
SHAPE_TUPLE_ELLIPSIS, |
|||
} |
|||
sequence_shape_to_type = { |
|||
SHAPE_LIST: list, |
|||
SHAPE_SET: set, |
|||
SHAPE_TUPLE: tuple, |
|||
SHAPE_SEQUENCE: list, |
|||
SHAPE_TUPLE_ELLIPSIS: list, |
|||
} |
|||
|
|||
|
|||
@dataclass |
|||
class GenerateJsonSchema: |
|||
ref_template: str |
|||
|
|||
|
|||
class PydanticSchemaGenerationError(Exception): |
|||
pass |
|||
|
|||
|
|||
RequestErrorModel: Type[BaseModel] = create_model("Request") |
|||
|
|||
|
|||
def with_info_plain_validator_function( |
|||
function: Callable[..., Any], |
|||
*, |
|||
ref: Union[str, None] = None, |
|||
metadata: Any = None, |
|||
serialization: Any = None, |
|||
) -> Any: |
|||
return {} |
|||
|
|||
|
|||
def get_model_definitions( |
|||
*, |
|||
flat_models: Set[Union[Type[BaseModel], Type[Enum]]], |
|||
model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str], |
|||
) -> Dict[str, Any]: |
|||
definitions: Dict[str, Dict[str, Any]] = {} |
|||
for model in flat_models: |
|||
m_schema, m_definitions, m_nested_models = model_process_schema( |
|||
model, model_name_map=model_name_map, ref_prefix=REF_PREFIX |
|||
) |
|||
definitions.update(m_definitions) |
|||
model_name = model_name_map[model] |
|||
definitions[model_name] = m_schema |
|||
for m_schema in definitions.values(): |
|||
if "description" in m_schema: |
|||
m_schema["description"] = m_schema["description"].split("\f")[0] |
|||
return definitions |


def is_pv1_scalar_field(field: ModelField) -> bool:
    from fastapi import params

    field_info = field.field_info
    if not (
        field.shape == SHAPE_SINGLETON
        and not lenient_issubclass(field.type_, BaseModel)
        and not lenient_issubclass(field.type_, dict)
        and not shared.field_annotation_is_sequence(field.type_)
        and not is_dataclass(field.type_)
        and not isinstance(field_info, params.Body)
    ):
        return False
    if field.sub_fields:
        if not all(is_pv1_scalar_field(f) for f in field.sub_fields):
            return False
    return True


def is_pv1_scalar_sequence_field(field: ModelField) -> bool:
    if (field.shape in sequence_shapes) and not lenient_issubclass(
        field.type_, BaseModel
    ):
        if field.sub_fields is not None:
            for sub_field in field.sub_fields:
                if not is_pv1_scalar_field(sub_field):
                    return False
        return True
    if shared._annotation_is_sequence(field.type_):
        return True
    return False


def _normalize_errors(errors: Sequence[Any]) -> List[Dict[str, Any]]:
    use_errors: List[Any] = []
    for error in errors:
        if isinstance(error, ErrorWrapper):
            new_errors = ValidationError(  # type: ignore[call-arg]
                errors=[error], model=RequestErrorModel
            ).errors()
            use_errors.extend(new_errors)
        elif isinstance(error, list):
            use_errors.extend(_normalize_errors(error))
        else:
            use_errors.append(error)
    return use_errors


def _regenerate_error_with_loc(
    *, errors: Sequence[Any], loc_prefix: Tuple[Union[str, int], ...]
) -> List[Dict[str, Any]]:
    updated_loc_errors: List[Any] = [
        {**err, "loc": loc_prefix + err.get("loc", ())}
        for err in _normalize_errors(errors)
    ]

    return updated_loc_errors
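The location-prefixing step above can be sketched in isolation. This hypothetical helper skips the `ErrorWrapper` normalization and only shows how a prefix such as `("body",)` gets prepended to each error's `loc` tuple:

```python
from typing import Any, Dict, List, Sequence, Tuple, Union

def prefix_error_locs(
    errors: Sequence[Dict[str, Any]],
    loc_prefix: Tuple[Union[str, int], ...],
) -> List[Dict[str, Any]]:
    # Prepend the prefix to each error's own location tuple, mirroring the
    # dict-merge in _regenerate_error_with_loc (minus the v1 normalization).
    return [{**err, "loc": loc_prefix + tuple(err.get("loc", ()))} for err in errors]

errors = [{"type": "missing", "loc": ("name",), "msg": "Field required"}]
prefixed = prefix_error_locs(errors, ("body",))
print(prefixed[0]["loc"])  # → ('body', 'name')
```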


def _model_rebuild(model: Type[BaseModel]) -> None:
    model.update_forward_refs()


def _model_dump(
    model: BaseModel, mode: Literal["json", "python"] = "json", **kwargs: Any
) -> Any:
    return model.dict(**kwargs)


def _get_model_config(model: BaseModel) -> Any:
    return model.__config__  # type: ignore[attr-defined]


def get_schema_from_model_field(
    *,
    field: ModelField,
    model_name_map: ModelNameMap,
    field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ],
    separate_input_output_schemas: bool = True,
) -> Dict[str, Any]:
    return field_schema(  # type: ignore[no-any-return]
        field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
    )[0]


# def get_compat_model_name_map(fields: List[ModelField]) -> ModelNameMap:
#     models = get_flat_models_from_fields(fields, known_models=set())
#     return get_model_name_map(models)  # type: ignore[no-any-return]


def get_definitions(
    *,
    fields: List[ModelField],
    model_name_map: ModelNameMap,
    separate_input_output_schemas: bool = True,
) -> Tuple[
    Dict[Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue],
    Dict[str, Dict[str, Any]],
]:
    models = get_flat_models_from_fields(fields, known_models=set())
    return {}, get_model_definitions(flat_models=models, model_name_map=model_name_map)


def is_scalar_field(field: ModelField) -> bool:
    return is_pv1_scalar_field(field)


def is_sequence_field(field: ModelField) -> bool:
    return field.shape in sequence_shapes or shared._annotation_is_sequence(field.type_)


def is_scalar_sequence_field(field: ModelField) -> bool:
    return is_pv1_scalar_sequence_field(field)


def is_bytes_field(field: ModelField) -> bool:
    return lenient_issubclass(field.type_, bytes)  # type: ignore[no-any-return]


def is_bytes_sequence_field(field: ModelField) -> bool:
    return field.shape in sequence_shapes and lenient_issubclass(field.type_, bytes)


def copy_field_info(*, field_info: FieldInfo, annotation: Any) -> FieldInfo:
    return copy(field_info)


def serialize_sequence_value(*, field: ModelField, value: Any) -> Sequence[Any]:
    return sequence_shape_to_type[field.shape](value)  # type: ignore[no-any-return]


def get_missing_field_error(loc: Tuple[str, ...]) -> Dict[str, Any]:
    missing_field_error = ErrorWrapper(MissingError(), loc=loc)
    new_error = ValidationError([missing_field_error], RequestErrorModel)
    return new_error.errors()[0]  # type: ignore[return-value]


def create_body_model(
    *, fields: Sequence[ModelField], model_name: str
) -> Type[BaseModel]:
    BodyModel = create_model(model_name)
    for f in fields:
        BodyModel.__fields__[f.name] = f  # type: ignore[index]
    return BodyModel


def get_model_fields(model: Type[BaseModel]) -> List[ModelField]:
    return list(model.__fields__.values())  # type: ignore[attr-defined]
@@ -0,0 +1,459 @@
import re
import warnings
from copy import copy, deepcopy
from dataclasses import dataclass
from enum import Enum
from typing import (
    Any,
    Dict,
    List,
    Sequence,
    Set,
    Tuple,
    Type,
    Union,
    cast,
)

from fastapi._compat import shared, v1
from fastapi.openapi.constants import REF_TEMPLATE
from fastapi.types import IncEx, ModelNameMap
from pydantic import BaseModel, TypeAdapter, create_model
from pydantic import PydanticSchemaGenerationError as PydanticSchemaGenerationError
from pydantic import PydanticUndefinedAnnotation as PydanticUndefinedAnnotation
from pydantic import ValidationError as ValidationError
from pydantic._internal._schema_generation_shared import (  # type: ignore[attr-defined]
    GetJsonSchemaHandler as GetJsonSchemaHandler,
)
from pydantic._internal._typing_extra import eval_type_lenient
from pydantic._internal._utils import lenient_issubclass as lenient_issubclass
from pydantic.fields import FieldInfo as FieldInfo
from pydantic.json_schema import GenerateJsonSchema as GenerateJsonSchema
from pydantic.json_schema import JsonSchemaValue as JsonSchemaValue
from pydantic_core import CoreSchema as CoreSchema
from pydantic_core import PydanticUndefined, PydanticUndefinedType
from pydantic_core import Url as Url
from typing_extensions import Annotated, Literal, get_args, get_origin

try:
    from pydantic_core.core_schema import (
        with_info_plain_validator_function as with_info_plain_validator_function,
    )
except ImportError:  # pragma: no cover
    from pydantic_core.core_schema import (
        general_plain_validator_function as with_info_plain_validator_function,  # noqa: F401
    )

RequiredParam = PydanticUndefined
Undefined = PydanticUndefined
UndefinedType = PydanticUndefinedType
evaluate_forwardref = eval_type_lenient
Validator = Any


class BaseConfig:
    pass


class ErrorWrapper(Exception):
    pass


@dataclass
class ModelField:
    field_info: FieldInfo
    name: str
    mode: Literal["validation", "serialization"] = "validation"

    @property
    def alias(self) -> str:
        a = self.field_info.alias
        return a if a is not None else self.name

    @property
    def required(self) -> bool:
        return self.field_info.is_required()

    @property
    def default(self) -> Any:
        return self.get_default()

    @property
    def type_(self) -> Any:
        return self.field_info.annotation

    def __post_init__(self) -> None:
        with warnings.catch_warnings():
            # Pydantic >= 2.12.0 warns about field specific metadata that is unused
            # (e.g. `TypeAdapter(Annotated[int, Field(alias='b')])`). In some cases, we
            # end up building the type adapter from a model field annotation so we
            # need to ignore the warning:
            if shared.PYDANTIC_VERSION_MINOR_TUPLE >= (2, 12):
                from pydantic.warnings import UnsupportedFieldAttributeWarning

                warnings.simplefilter(
                    "ignore", category=UnsupportedFieldAttributeWarning
                )
            self._type_adapter: TypeAdapter[Any] = TypeAdapter(
                Annotated[self.field_info.annotation, self.field_info]
            )

    def get_default(self) -> Any:
        if self.field_info.is_required():
            return Undefined
        return self.field_info.get_default(call_default_factory=True)

    def validate(
        self,
        value: Any,
        values: Dict[str, Any] = {},  # noqa: B006
        *,
        loc: Tuple[Union[int, str], ...] = (),
    ) -> Tuple[Any, Union[List[Dict[str, Any]], None]]:
        try:
            return (
                self._type_adapter.validate_python(value, from_attributes=True),
                None,
            )
        except ValidationError as exc:
            return None, v1._regenerate_error_with_loc(
                errors=exc.errors(include_url=False), loc_prefix=loc
            )

    def serialize(
        self,
        value: Any,
        *,
        mode: Literal["json", "python"] = "json",
        include: Union[IncEx, None] = None,
        exclude: Union[IncEx, None] = None,
        by_alias: bool = True,
        exclude_unset: bool = False,
        exclude_defaults: bool = False,
        exclude_none: bool = False,
    ) -> Any:
        # What calls this code passes a value that already called
        # self._type_adapter.validate_python(value)
        return self._type_adapter.dump_python(
            value,
            mode=mode,
            include=include,
            exclude=exclude,
            by_alias=by_alias,
            exclude_unset=exclude_unset,
            exclude_defaults=exclude_defaults,
            exclude_none=exclude_none,
        )

    def __hash__(self) -> int:
        # Each ModelField is unique for our purposes, to allow making a dict from
        # ModelField to its JSON Schema.
        return id(self)


def get_annotation_from_field_info(
    annotation: Any, field_info: FieldInfo, field_name: str
) -> Any:
    return annotation


def _model_rebuild(model: Type[BaseModel]) -> None:
    model.model_rebuild()


def _model_dump(
    model: BaseModel, mode: Literal["json", "python"] = "json", **kwargs: Any
) -> Any:
    return model.model_dump(mode=mode, **kwargs)


def _get_model_config(model: BaseModel) -> Any:
    return model.model_config


def get_schema_from_model_field(
    *,
    field: ModelField,
    model_name_map: ModelNameMap,
    field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ],
    separate_input_output_schemas: bool = True,
) -> Dict[str, Any]:
    override_mode: Union[Literal["validation"], None] = (
        None if separate_input_output_schemas else "validation"
    )
    # This expects that GenerateJsonSchema was already used to generate the definitions
    json_schema = field_mapping[(field, override_mode or field.mode)]
    if "$ref" not in json_schema:
        # TODO remove when deprecating Pydantic v1
        # Ref: https://github.com/pydantic/pydantic/blob/d61792cc42c80b13b23e3ffa74bc37ec7c77f7d1/pydantic/schema.py#L207
        json_schema["title"] = field.field_info.title or field.alias.title().replace(
            "_", " "
        )
    return json_schema


def get_definitions(
    *,
    fields: Sequence[ModelField],
    model_name_map: ModelNameMap,
    separate_input_output_schemas: bool = True,
) -> Tuple[
    Dict[Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue],
    Dict[str, Dict[str, Any]],
]:
    schema_generator = GenerateJsonSchema(ref_template=REF_TEMPLATE)
    override_mode: Union[Literal["validation"], None] = (
        None if separate_input_output_schemas else "validation"
    )
    flat_models = get_flat_models_from_fields(fields, known_models=set())
    flat_model_fields = [
        ModelField(field_info=FieldInfo(annotation=model), name=model.__name__)
        for model in flat_models
    ]
    input_types = {f.type_ for f in fields}
    unique_flat_model_fields = {
        f for f in flat_model_fields if f.type_ not in input_types
    }

    inputs = [
        (field, override_mode or field.mode, field._type_adapter.core_schema)
        for field in list(fields) + list(unique_flat_model_fields)
    ]
    field_mapping, definitions = schema_generator.generate_definitions(inputs=inputs)
    for item_def in cast(Dict[str, Dict[str, Any]], definitions).values():
        if "description" in item_def:
            item_description = cast(str, item_def["description"]).split("\f")[0]
            item_def["description"] = item_description
    new_mapping, new_definitions = _remap_definitions_and_field_mappings(
        model_name_map=model_name_map,
        definitions=definitions,  # type: ignore[arg-type]
        field_mapping=field_mapping,
    )
    return new_mapping, new_definitions


def _replace_refs(
    *,
    schema: Dict[str, Any],
    old_name_to_new_name_map: Dict[str, str],
) -> Dict[str, Any]:
    new_schema = deepcopy(schema)
    for key, value in new_schema.items():
        if key == "$ref":
            ref_name = schema["$ref"].split("/")[-1]
            if ref_name in old_name_to_new_name_map:
                new_name = old_name_to_new_name_map[ref_name]
                new_schema["$ref"] = REF_TEMPLATE.format(model=new_name)
            else:
                new_schema["$ref"] = schema["$ref"]
            continue
        if isinstance(value, dict):
            new_schema[key] = _replace_refs(
                schema=value,
                old_name_to_new_name_map=old_name_to_new_name_map,
            )
        elif isinstance(value, list):
            new_value = []
            for item in value:
                if isinstance(item, dict):
                    new_item = _replace_refs(
                        schema=item,
                        old_name_to_new_name_map=old_name_to_new_name_map,
                    )
                    new_value.append(new_item)
                else:
                    new_value.append(item)
            new_schema[key] = new_value
    return new_schema
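Stripped of the module's keyword-only signature, the recursive `$ref` rewrite above reduces to the following self-contained sketch; the `REF_TEMPLATE` value here is an assumption mirroring `fastapi.openapi.constants`:

```python
from copy import deepcopy
from typing import Any, Dict

# Assumed value, mirrors fastapi.openapi.constants.REF_TEMPLATE.
REF_TEMPLATE = "#/components/schemas/{model}"

def replace_refs(schema: Dict[str, Any], name_map: Dict[str, str]) -> Dict[str, Any]:
    # Walk the schema, renaming any "$ref" whose target appears in name_map,
    # without mutating the input (deepcopy, as in the original).
    new_schema = deepcopy(schema)
    for key, value in new_schema.items():
        if key == "$ref":
            ref_name = value.split("/")[-1]
            if ref_name in name_map:
                new_schema["$ref"] = REF_TEMPLATE.format(model=name_map[ref_name])
            continue
        if isinstance(value, dict):
            new_schema[key] = replace_refs(value, name_map)
        elif isinstance(value, list):
            new_schema[key] = [
                replace_refs(item, name_map) if isinstance(item, dict) else item
                for item in value
            ]
    return new_schema

schema = {"properties": {"owner": {"$ref": "#/components/schemas/User-Input"}}}
renamed = replace_refs(schema, {"User-Input": "User"})
print(renamed["properties"]["owner"]["$ref"])  # → #/components/schemas/User
```

This is what lets the module collapse Pydantic's separate `-Input`/`-Output` schema names back to a single model name when input and output schemas are not kept separate.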


def _remap_definitions_and_field_mappings(
    *,
    model_name_map: ModelNameMap,
    definitions: Dict[str, Any],
    field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ],
) -> Tuple[
    Dict[Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue],
    Dict[str, Any],
]:
    old_name_to_new_name_map = {}
    for field_key, schema in field_mapping.items():
        model = field_key[0].type_
        if model not in model_name_map:
            continue
        new_name = model_name_map[model]
        old_name = schema["$ref"].split("/")[-1]
        if old_name in {f"{new_name}-Input", f"{new_name}-Output"}:
            continue
        old_name_to_new_name_map[old_name] = new_name

    new_field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ] = {}
    for field_key, schema in field_mapping.items():
        new_schema = _replace_refs(
            schema=schema,
            old_name_to_new_name_map=old_name_to_new_name_map,
        )
        new_field_mapping[field_key] = new_schema

    new_definitions = {}
    for key, value in definitions.items():
        if key in old_name_to_new_name_map:
            new_key = old_name_to_new_name_map[key]
        else:
            new_key = key
        new_value = _replace_refs(
            schema=value,
            old_name_to_new_name_map=old_name_to_new_name_map,
        )
        new_definitions[new_key] = new_value
    return new_field_mapping, new_definitions


def is_scalar_field(field: ModelField) -> bool:
    from fastapi import params

    return shared.field_annotation_is_scalar(
        field.field_info.annotation
    ) and not isinstance(field.field_info, params.Body)


def is_sequence_field(field: ModelField) -> bool:
    return shared.field_annotation_is_sequence(field.field_info.annotation)


def is_scalar_sequence_field(field: ModelField) -> bool:
    return shared.field_annotation_is_scalar_sequence(field.field_info.annotation)


def is_bytes_field(field: ModelField) -> bool:
    return shared.is_bytes_or_nonable_bytes_annotation(field.type_)


def is_bytes_sequence_field(field: ModelField) -> bool:
    return shared.is_bytes_sequence_annotation(field.type_)


def copy_field_info(*, field_info: FieldInfo, annotation: Any) -> FieldInfo:
    cls = type(field_info)
    merged_field_info = cls.from_annotation(annotation)
    new_field_info = copy(field_info)
    new_field_info.metadata = merged_field_info.metadata
    new_field_info.annotation = merged_field_info.annotation
    return new_field_info


def serialize_sequence_value(*, field: ModelField, value: Any) -> Sequence[Any]:
    origin_type = get_origin(field.field_info.annotation) or field.field_info.annotation
    assert issubclass(origin_type, shared.sequence_types)  # type: ignore[arg-type]
    return shared.sequence_annotation_to_type[origin_type](value)  # type: ignore[no-any-return]


def get_missing_field_error(loc: Tuple[str, ...]) -> Dict[str, Any]:
    error = ValidationError.from_exception_data(
        "Field required", [{"type": "missing", "loc": loc, "input": {}}]
    ).errors(include_url=False)[0]
    error["input"] = None
    return error  # type: ignore[return-value]


def create_body_model(
    *, fields: Sequence[ModelField], model_name: str
) -> Type[BaseModel]:
    field_params = {f.name: (f.field_info.annotation, f.field_info) for f in fields}
    BodyModel: Type[BaseModel] = create_model(model_name, **field_params)  # type: ignore[call-overload]
    return BodyModel


def get_model_fields(model: Type[BaseModel]) -> List[ModelField]:
    return [
        ModelField(field_info=field_info, name=name)
        for name, field_info in model.model_fields.items()
    ]


# Duplicate of several schema functions from Pydantic v1 to make them compatible with
# Pydantic v2 and allow mixing the models

TypeModelOrEnum = Union[Type["BaseModel"], Type[Enum]]
TypeModelSet = Set[TypeModelOrEnum]


def normalize_name(name: str) -> str:
    return re.sub(r"[^a-zA-Z0-9.\-_]", "_", name)


def get_model_name_map(unique_models: TypeModelSet) -> Dict[TypeModelOrEnum, str]:
    name_model_map = {}
    conflicting_names: Set[str] = set()
    for model in unique_models:
        model_name = normalize_name(model.__name__)
        if model_name in conflicting_names:
            model_name = get_long_model_name(model)
            name_model_map[model_name] = model
        elif model_name in name_model_map:
            conflicting_names.add(model_name)
            conflicting_model = name_model_map.pop(model_name)
            name_model_map[get_long_model_name(conflicting_model)] = conflicting_model
            name_model_map[get_long_model_name(model)] = model
        else:
            name_model_map[model_name] = model
    return {v: k for k, v in name_model_map.items()}
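The conflict handling above can be exercised with plain classes; the two `Item` classes below are hypothetical stand-ins for models defined in different modules, and the helper replicates the same strategy: short names win until a second model with the same class name shows up, then every conflicting model falls back to a long, module-qualified name.

```python
import re
from typing import Dict, Set

def normalize_name(name: str) -> str:
    return re.sub(r"[^a-zA-Z0-9.\-_]", "_", name)

def get_long_model_name(model: type) -> str:
    return f"{model.__module__}__{model.__qualname__}".replace(".", "__")

def get_model_name_map(unique_models) -> Dict[type, str]:
    name_model_map: Dict[str, type] = {}
    conflicting_names: Set[str] = set()
    for model in unique_models:
        model_name = normalize_name(model.__name__)
        if model_name in conflicting_names:
            # A known conflict: this model goes straight to the long name.
            name_model_map[get_long_model_name(model)] = model
        elif model_name in name_model_map:
            # First collision: evict the earlier model's short name and
            # give both models long names.
            conflicting_names.add(model_name)
            conflicting_model = name_model_map.pop(model_name)
            name_model_map[get_long_model_name(conflicting_model)] = conflicting_model
            name_model_map[get_long_model_name(model)] = model
        else:
            name_model_map[model_name] = model
    return {v: k for k, v in name_model_map.items()}

class ItemA:  # hypothetical stand-in for app_a.Item
    pass

class ItemB:  # hypothetical stand-in for app_b.Item
    pass

for cls, module in ((ItemA, "app_a"), (ItemB, "app_b")):
    cls.__name__ = "Item"
    cls.__qualname__ = "Item"
    cls.__module__ = module

name_map = get_model_name_map([ItemA, ItemB])
```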


def get_flat_models_from_model(
    model: Type["BaseModel"], known_models: Union[TypeModelSet, None] = None
) -> TypeModelSet:
    known_models = known_models or set()
    fields = get_model_fields(model)
    get_flat_models_from_fields(fields, known_models=known_models)
    return known_models


def get_flat_models_from_annotation(
    annotation: Any, known_models: TypeModelSet
) -> TypeModelSet:
    origin = get_origin(annotation)
    if origin is not None:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, (BaseModel, Enum)) and arg not in known_models:
                known_models.add(arg)
                if lenient_issubclass(arg, BaseModel):
                    get_flat_models_from_model(arg, known_models=known_models)
            else:
                get_flat_models_from_annotation(arg, known_models=known_models)
    return known_models


def get_flat_models_from_field(
    field: ModelField, known_models: TypeModelSet
) -> TypeModelSet:
    field_type = field.type_
    if lenient_issubclass(field_type, BaseModel):
        if field_type in known_models:
            return known_models
        known_models.add(field_type)
        get_flat_models_from_model(field_type, known_models=known_models)
    elif lenient_issubclass(field_type, Enum):
        known_models.add(field_type)
    else:
        get_flat_models_from_annotation(field_type, known_models=known_models)
    return known_models


def get_flat_models_from_fields(
    fields: Sequence[ModelField], known_models: TypeModelSet
) -> TypeModelSet:
    for field in fields:
        get_flat_models_from_field(field, known_models=known_models)
    return known_models


def get_long_model_name(model: TypeModelOrEnum) -> str:
    return f"{model.__module__}__{model.__qualname__}".replace(".", "__")
@@ -0,0 +1,724 @@
import warnings
from typing import Any, Callable, Dict, List, Optional, Union

from fastapi.openapi.models import Example
from fastapi.params import ParamTypes
from typing_extensions import Annotated, deprecated

from ._compat.shared import PYDANTIC_VERSION_MINOR_TUPLE
from ._compat.v1 import FieldInfo, Undefined

_Unset: Any = Undefined


class Param(FieldInfo):  # type: ignore[misc]
    in_: ParamTypes

    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        if example is not _Unset:
            warnings.warn(
                "`example` has been deprecated, please use `examples` instead",
                category=DeprecationWarning,
                stacklevel=4,
            )
        self.example = example
        self.include_in_schema = include_in_schema
        self.openapi_examples = openapi_examples
        kwargs = dict(
            default=default,
            default_factory=default_factory,
            alias=alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            discriminator=discriminator,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            **extra,
        )
        if examples is not None:
            kwargs["examples"] = examples
        if regex is not None:
            warnings.warn(
                "`regex` has been deprecated, please use `pattern` instead",
                category=DeprecationWarning,
                stacklevel=4,
            )
        current_json_schema_extra = json_schema_extra or extra
        if PYDANTIC_VERSION_MINOR_TUPLE < (2, 7):
            self.deprecated = deprecated
        else:
            kwargs["deprecated"] = deprecated
        kwargs["regex"] = pattern or regex
        kwargs.update(**current_json_schema_extra)
        use_kwargs = {k: v for k, v in kwargs.items() if v is not _Unset}

        super().__init__(**use_kwargs)

    def __repr__(self) -> str:
        return f"{self.__class__.__name__}({self.default})"


class Path(Param):  # type: ignore[misc]
    in_ = ParamTypes.path

    def __init__(
        self,
        default: Any = ...,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        assert default is ..., "Path parameters cannot have a default value"
        self.in_ = self.in_
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )


class Query(Param):  # type: ignore[misc]
    in_ = ParamTypes.query

    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )


class Header(Param):  # type: ignore[misc]
    in_ = ParamTypes.header

    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        convert_underscores: bool = True,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        self.convert_underscores = convert_underscores
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )


class Cookie(Param):  # type: ignore[misc]
    in_ = ParamTypes.cookie

    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )


class Body(FieldInfo):  # type: ignore[misc]
    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        embed: Union[bool, None] = None,
        media_type: str = "application/json",
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        self.embed = embed
        self.media_type = media_type
        if example is not _Unset:
            warnings.warn(
                "`example` has been deprecated, please use `examples` instead",
                category=DeprecationWarning,
                stacklevel=4,
            )
        self.example = example
        self.include_in_schema = include_in_schema
        self.openapi_examples = openapi_examples
        kwargs = dict(
            default=default,
            default_factory=default_factory,
            alias=alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            discriminator=discriminator,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            **extra,
        )
        if examples is not None:
            kwargs["examples"] = examples
        if regex is not None:
            warnings.warn(
                "`regex` has been deprecated, please use `pattern` instead",
                category=DeprecationWarning,
                stacklevel=4,
            )
        current_json_schema_extra = json_schema_extra or extra
        if PYDANTIC_VERSION_MINOR_TUPLE < (2, 7):
            self.deprecated = deprecated
        else:
            kwargs["deprecated"] = deprecated
        kwargs["regex"] = pattern or regex
        kwargs.update(**current_json_schema_extra)

        use_kwargs = {k: v for k, v in kwargs.items() if v is not _Unset}

        super().__init__(**use_kwargs)

    def __repr__(self) -> str:
        return f"{self.__class__.__name__}({self.default})"


class Form(Body):  # type: ignore[misc]
    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        media_type: str = "application/x-www-form-urlencoded",
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            media_type=media_type,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )


class File(Form):  # type: ignore[misc]
    def __init__(
        self,
        default: Any = Undefined,
        *,
        default_factory: Union[Callable[[], Any], None] = _Unset,
        annotation: Optional[Any] = None,
        media_type: str = "multipart/form-data",
        alias: Optional[str] = None,
        alias_priority: Union[int, None] = _Unset,
        # TODO: update when deprecating Pydantic v1, import these types
        # validation_alias: str | AliasPath | AliasChoices | None
        validation_alias: Union[str, None] = None,
        serialization_alias: Union[str, None] = None,
        title: Optional[str] = None,
        description: Optional[str] = None,
        gt: Optional[float] = None,
        ge: Optional[float] = None,
        lt: Optional[float] = None,
        le: Optional[float] = None,
        min_length: Optional[int] = None,
        max_length: Optional[int] = None,
        pattern: Optional[str] = None,
        regex: Annotated[
            Optional[str],
            deprecated(
                "Deprecated in FastAPI 0.100.0 and Pydantic v2, use `pattern` instead."
            ),
        ] = None,
        discriminator: Union[str, None] = None,
        strict: Union[bool, None] = _Unset,
        multiple_of: Union[float, None] = _Unset,
        allow_inf_nan: Union[bool, None] = _Unset,
        max_digits: Union[int, None] = _Unset,
        decimal_places: Union[int, None] = _Unset,
        examples: Optional[List[Any]] = None,
        example: Annotated[
            Optional[Any],
            deprecated(
                "Deprecated in OpenAPI 3.1.0 that now uses JSON Schema 2020-12, "
                "although still supported. Use examples instead."
            ),
        ] = _Unset,
        openapi_examples: Optional[Dict[str, Example]] = None,
        deprecated: Union[deprecated, str, bool, None] = None,
        include_in_schema: bool = True,
        json_schema_extra: Union[Dict[str, Any], None] = None,
        **extra: Any,
    ):
        super().__init__(
            default=default,
            default_factory=default_factory,
            annotation=annotation,
            media_type=media_type,
            alias=alias,
            alias_priority=alias_priority,
            validation_alias=validation_alias,
            serialization_alias=serialization_alias,
            title=title,
            description=description,
            gt=gt,
            ge=ge,
            lt=lt,
            le=le,
            min_length=min_length,
            max_length=max_length,
            pattern=pattern,
            regex=regex,
            discriminator=discriminator,
            strict=strict,
            multiple_of=multiple_of,
            allow_inf_nan=allow_inf_nan,
            max_digits=max_digits,
            decimal_places=decimal_places,
            deprecated=deprecated,
            example=example,
            examples=examples,
            openapi_examples=openapi_examples,
            include_in_schema=include_in_schema,
            json_schema_extra=json_schema_extra,
            **extra,
        )
@@ -0,0 +1,475 @@
import sys
from typing import Any, List, Union

from tests.utils import pydantic_snapshot, skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()

from fastapi import FastAPI
from fastapi._compat.v1 import BaseModel
from fastapi.testclient import TestClient
from inline_snapshot import snapshot


class SubItem(BaseModel):
    name: str


class Item(BaseModel):
    title: str
    size: int
    description: Union[str, None] = None
    sub: SubItem
    multi: List[SubItem] = []


app = FastAPI()


@app.post("/simple-model")
def handle_simple_model(data: SubItem) -> SubItem:
    return data


@app.post("/simple-model-filter", response_model=SubItem)
def handle_simple_model_filter(data: SubItem) -> Any:
    extended_data = data.dict()
    extended_data.update({"secret_price": 42})
    return extended_data


@app.post("/item")
def handle_item(data: Item) -> Item:
    return data


@app.post("/item-filter", response_model=Item)
def handle_item_filter(data: Item) -> Any:
    extended_data = data.dict()
    extended_data.update({"secret_data": "classified", "internal_id": 12345})
    extended_data["sub"].update({"internal_id": 67890})
    return extended_data


client = TestClient(app)


def test_old_simple_model():
    response = client.post(
        "/simple-model",
        json={"name": "Foo"},
    )
    assert response.status_code == 200, response.text
    assert response.json() == {"name": "Foo"}


def test_old_simple_model_validation_error():
    response = client.post(
        "/simple-model",
        json={"wrong_name": "Foo"},
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", "name"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_old_simple_model_filter():
    response = client.post(
        "/simple-model-filter",
        json={"name": "Foo"},
    )
    assert response.status_code == 200, response.text
    assert response.json() == {"name": "Foo"}


def test_item_model():
    response = client.post(
        "/item",
        json={
            "title": "Test Item",
            "size": 100,
            "description": "This is a test item",
            "sub": {"name": "SubItem1"},
            "multi": [{"name": "Multi1"}, {"name": "Multi2"}],
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "Test Item",
        "size": 100,
        "description": "This is a test item",
        "sub": {"name": "SubItem1"},
        "multi": [{"name": "Multi1"}, {"name": "Multi2"}],
    }


def test_item_model_minimal():
    response = client.post(
        "/item",
        json={"title": "Minimal Item", "size": 50, "sub": {"name": "SubMin"}},
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "Minimal Item",
        "size": 50,
        "description": None,
        "sub": {"name": "SubMin"},
        "multi": [],
    }


def test_item_model_validation_errors():
    response = client.post(
        "/item",
        json={"title": "Missing fields"},
    )
    assert response.status_code == 422, response.text
    error_detail = response.json()["detail"]
    assert len(error_detail) == 2
    assert {
        "loc": ["body", "size"],
        "msg": "field required",
        "type": "value_error.missing",
    } in error_detail
    assert {
        "loc": ["body", "sub"],
        "msg": "field required",
        "type": "value_error.missing",
    } in error_detail


def test_item_model_nested_validation_error():
    response = client.post(
        "/item",
        json={"title": "Test Item", "size": 100, "sub": {"wrong_field": "test"}},
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", "sub", "name"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_item_model_invalid_type():
    response = client.post(
        "/item",
        json={"title": "Test Item", "size": "not_a_number", "sub": {"name": "SubItem"}},
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", "size"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )


def test_item_filter():
    response = client.post(
        "/item-filter",
        json={
            "title": "Filtered Item",
            "size": 200,
            "description": "Test filtering",
            "sub": {"name": "SubFiltered"},
            "multi": [],
        },
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert result == {
        "title": "Filtered Item",
        "size": 200,
        "description": "Test filtering",
        "sub": {"name": "SubFiltered"},
        "multi": [],
    }
    assert "secret_data" not in result
    assert "internal_id" not in result


def test_openapi_schema():
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/simple-model": {
                    "post": {
                        "summary": "Handle Simple Model",
                        "operationId": "handle_simple_model_simple_model_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/SubItem"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/SubItem"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/SubItem"
                                        }
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/simple-model-filter": {
                    "post": {
                        "summary": "Handle Simple Model Filter",
                        "operationId": "handle_simple_model_filter_simple_model_filter_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/SubItem"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/SubItem"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/SubItem"
                                        }
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/item": {
                    "post": {
                        "summary": "Handle Item",
                        "operationId": "handle_item_item_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/Item"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/Item"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/item-filter": {
                    "post": {
                        "summary": "Handle Item Filter",
                        "operationId": "handle_item_filter_item_filter_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/Item"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/Item"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
            },
            "components": {
                "schemas": {
                    "HTTPValidationError": {
                        "properties": {
                            "detail": {
                                "items": {
                                    "$ref": "#/components/schemas/ValidationError"
                                },
                                "type": "array",
                                "title": "Detail",
                            }
                        },
                        "type": "object",
                        "title": "HTTPValidationError",
                    },
                    "Item": {
                        "properties": {
                            "title": {"type": "string", "title": "Title"},
                            "size": {"type": "integer", "title": "Size"},
                            "description": {"type": "string", "title": "Description"},
                            "sub": {"$ref": "#/components/schemas/SubItem"},
                            "multi": {
                                "items": {"$ref": "#/components/schemas/SubItem"},
                                "type": "array",
                                "title": "Multi",
                                "default": [],
                            },
                        },
                        "type": "object",
                        "required": ["title", "size", "sub"],
                        "title": "Item",
                    },
                    "SubItem": {
                        "properties": {"name": {"type": "string", "title": "Name"}},
                        "type": "object",
                        "required": ["name"],
                        "title": "SubItem",
                    },
                    "ValidationError": {
                        "properties": {
                            "loc": {
                                "items": {
                                    "anyOf": [{"type": "string"}, {"type": "integer"}]
                                },
                                "type": "array",
                                "title": "Location",
                            },
                            "msg": {"type": "string", "title": "Message"},
                            "type": {"type": "string", "title": "Error Type"},
                        },
                        "type": "object",
                        "required": ["loc", "msg", "type"],
                        "title": "ValidationError",
                    },
                }
            },
        }
    )
@@ -0,0 +1,701 @@
import sys
from typing import Any, List, Union

from tests.utils import pydantic_snapshot, skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()

from fastapi import FastAPI
from fastapi._compat.v1 import BaseModel
from fastapi.testclient import TestClient
from inline_snapshot import snapshot


class SubItem(BaseModel):
    name: str


class Item(BaseModel):
    title: str
    size: int
    description: Union[str, None] = None
    sub: SubItem
    multi: List[SubItem] = []


app = FastAPI()


@app.post("/item")
def handle_item(data: Item) -> List[Item]:
    return [data, data]


@app.post("/item-filter", response_model=List[Item])
def handle_item_filter(data: Item) -> Any:
    extended_data = data.dict()
    extended_data.update({"secret_data": "classified", "internal_id": 12345})
    extended_data["sub"].update({"internal_id": 67890})
    return [extended_data, extended_data]


@app.post("/item-list")
def handle_item_list(data: List[Item]) -> Item:
    if data:
        return data[0]
    return Item(title="", size=0, sub=SubItem(name=""))


@app.post("/item-list-filter", response_model=Item)
def handle_item_list_filter(data: List[Item]) -> Any:
    if data:
        extended_data = data[0].dict()
        extended_data.update({"secret_data": "classified", "internal_id": 12345})
        extended_data["sub"].update({"internal_id": 67890})
        return extended_data
    return Item(title="", size=0, sub=SubItem(name=""))


@app.post("/item-list-to-list")
def handle_item_list_to_list(data: List[Item]) -> List[Item]:
    return data


@app.post("/item-list-to-list-filter", response_model=List[Item])
def handle_item_list_to_list_filter(data: List[Item]) -> Any:
    if data:
        extended_data = data[0].dict()
        extended_data.update({"secret_data": "classified", "internal_id": 12345})
        extended_data["sub"].update({"internal_id": 67890})
        return [extended_data, extended_data]
    return []


client = TestClient(app)


def test_item_to_list():
    response = client.post(
        "/item",
        json={
            "title": "Test Item",
            "size": 100,
            "description": "This is a test item",
            "sub": {"name": "SubItem1"},
            "multi": [{"name": "Multi1"}, {"name": "Multi2"}],
        },
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert isinstance(result, list)
    assert len(result) == 2
    for item in result:
        assert item == {
            "title": "Test Item",
            "size": 100,
            "description": "This is a test item",
            "sub": {"name": "SubItem1"},
            "multi": [{"name": "Multi1"}, {"name": "Multi2"}],
        }


def test_item_to_list_filter():
    response = client.post(
        "/item-filter",
        json={
            "title": "Filtered Item",
            "size": 200,
            "description": "Test filtering",
            "sub": {"name": "SubFiltered"},
            "multi": [],
        },
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert isinstance(result, list)
    assert len(result) == 2
    for item in result:
        assert item == {
            "title": "Filtered Item",
            "size": 200,
            "description": "Test filtering",
            "sub": {"name": "SubFiltered"},
            "multi": [],
        }
        # Verify secret fields are filtered out
        assert "secret_data" not in item
        assert "internal_id" not in item
        assert "internal_id" not in item["sub"]


def test_list_to_item():
    response = client.post(
        "/item-list",
        json=[
            {"title": "First Item", "size": 50, "sub": {"name": "First Sub"}},
            {"title": "Second Item", "size": 75, "sub": {"name": "Second Sub"}},
        ],
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "First Item",
        "size": 50,
        "description": None,
        "sub": {"name": "First Sub"},
        "multi": [],
    }


def test_list_to_item_empty():
    response = client.post(
        "/item-list",
        json=[],
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "",
        "size": 0,
        "description": None,
        "sub": {"name": ""},
        "multi": [],
    }


def test_list_to_item_filter():
    response = client.post(
        "/item-list-filter",
        json=[
            {
                "title": "First Item",
                "size": 100,
                "sub": {"name": "First Sub"},
                "multi": [{"name": "Multi1"}],
            },
            {"title": "Second Item", "size": 200, "sub": {"name": "Second Sub"}},
        ],
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert result == {
        "title": "First Item",
        "size": 100,
        "description": None,
        "sub": {"name": "First Sub"},
        "multi": [{"name": "Multi1"}],
    }
    # Verify secret fields are filtered out
    assert "secret_data" not in result
    assert "internal_id" not in result


def test_list_to_item_filter_no_data():
    response = client.post("/item-list-filter", json=[])
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "",
        "size": 0,
        "description": None,
        "sub": {"name": ""},
        "multi": [],
    }


def test_list_to_list():
    input_items = [
        {"title": "Item 1", "size": 10, "sub": {"name": "Sub1"}},
        {
            "title": "Item 2",
            "size": 20,
            "description": "Second item",
            "sub": {"name": "Sub2"},
            "multi": [{"name": "M1"}, {"name": "M2"}],
        },
        {"title": "Item 3", "size": 30, "sub": {"name": "Sub3"}},
    ]
    response = client.post(
        "/item-list-to-list",
        json=input_items,
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert isinstance(result, list)
    assert len(result) == 3
    assert result[0] == {
        "title": "Item 1",
        "size": 10,
        "description": None,
        "sub": {"name": "Sub1"},
        "multi": [],
    }
    assert result[1] == {
        "title": "Item 2",
        "size": 20,
        "description": "Second item",
        "sub": {"name": "Sub2"},
        "multi": [{"name": "M1"}, {"name": "M2"}],
    }
    assert result[2] == {
        "title": "Item 3",
        "size": 30,
        "description": None,
        "sub": {"name": "Sub3"},
        "multi": [],
    }


def test_list_to_list_filter():
    response = client.post(
        "/item-list-to-list-filter",
        json=[{"title": "Item 1", "size": 100, "sub": {"name": "Sub1"}}],
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert isinstance(result, list)
    assert len(result) == 2
    for item in result:
        assert item == {
            "title": "Item 1",
            "size": 100,
            "description": None,
            "sub": {"name": "Sub1"},
            "multi": [],
        }
        # Verify secret fields are filtered out
        assert "secret_data" not in item
        assert "internal_id" not in item


def test_list_to_list_filter_no_data():
    response = client.post(
        "/item-list-to-list-filter",
        json=[],
    )
    assert response.status_code == 200, response.text
    assert response.json() == []


def test_list_validation_error():
    response = client.post(
        "/item-list",
        json=[
            {"title": "Valid Item", "size": 100, "sub": {"name": "Sub1"}},
            {
                "title": "Invalid Item"
                # Missing required fields: size and sub
            },
        ],
    )
    assert response.status_code == 422, response.text
    error_detail = response.json()["detail"]
    assert len(error_detail) == 2
    assert {
        "loc": ["body", 1, "size"],
        "msg": "field required",
        "type": "value_error.missing",
    } in error_detail
    assert {
        "loc": ["body", 1, "sub"],
        "msg": "field required",
        "type": "value_error.missing",
    } in error_detail


def test_list_nested_validation_error():
    response = client.post(
        "/item-list",
        json=[
            {"title": "Item with bad sub", "size": 100, "sub": {"wrong_field": "value"}}
        ],
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", 0, "sub", "name"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_list_type_validation_error():
    response = client.post(
        "/item-list",
        json=[{"title": "Item", "size": "not_a_number", "sub": {"name": "Sub"}}],
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", 0, "size"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )


def test_invalid_list_structure():
    response = client.post(
        "/item-list",
        json={"title": "Not a list", "size": 100, "sub": {"name": "Sub"}},
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body"],
                    "msg": "value is not a valid list",
                    "type": "type_error.list",
                }
            ]
        }
    )


def test_openapi_schema():
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/item": {
                    "post": {
                        "summary": "Handle Item",
                        "operationId": "handle_item_item_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/Item"
                                                    }
                                                ],
                                                "title": "Data",
|||
} |
|||
), |
|||
v1=snapshot( |
|||
{"$ref": "#/components/schemas/Item"} |
|||
), |
|||
) |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": { |
|||
"$ref": "#/components/schemas/Item" |
|||
}, |
|||
"type": "array", |
|||
"title": "Response Handle Item Item Post", |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
"/item-filter": { |
|||
"post": { |
|||
"summary": "Handle Item Filter", |
|||
"operationId": "handle_item_filter_item_filter_post", |
|||
"requestBody": { |
|||
"content": { |
|||
"application/json": { |
|||
"schema": pydantic_snapshot( |
|||
v2=snapshot( |
|||
{ |
|||
"allOf": [ |
|||
{ |
|||
"$ref": "#/components/schemas/Item" |
|||
} |
|||
], |
|||
"title": "Data", |
|||
} |
|||
), |
|||
v1=snapshot( |
|||
{"$ref": "#/components/schemas/Item"} |
|||
), |
|||
) |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": { |
|||
"$ref": "#/components/schemas/Item" |
|||
}, |
|||
"type": "array", |
|||
"title": "Response Handle Item Filter Item Filter Post", |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
"/item-list": { |
|||
"post": { |
|||
"summary": "Handle Item List", |
|||
"operationId": "handle_item_list_item_list_post", |
|||
"requestBody": { |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": {"$ref": "#/components/schemas/Item"}, |
|||
"type": "array", |
|||
"title": "Data", |
|||
} |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": {"$ref": "#/components/schemas/Item"} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
"/item-list-filter": { |
|||
"post": { |
|||
"summary": "Handle Item List Filter", |
|||
"operationId": "handle_item_list_filter_item_list_filter_post", |
|||
"requestBody": { |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": {"$ref": "#/components/schemas/Item"}, |
|||
"type": "array", |
|||
"title": "Data", |
|||
} |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": {"$ref": "#/components/schemas/Item"} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
"/item-list-to-list": { |
|||
"post": { |
|||
"summary": "Handle Item List To List", |
|||
"operationId": "handle_item_list_to_list_item_list_to_list_post", |
|||
"requestBody": { |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": {"$ref": "#/components/schemas/Item"}, |
|||
"type": "array", |
|||
"title": "Data", |
|||
} |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": { |
|||
"$ref": "#/components/schemas/Item" |
|||
}, |
|||
"type": "array", |
|||
"title": "Response Handle Item List To List Item List To List Post", |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
"/item-list-to-list-filter": { |
|||
"post": { |
|||
"summary": "Handle Item List To List Filter", |
|||
"operationId": "handle_item_list_to_list_filter_item_list_to_list_filter_post", |
|||
"requestBody": { |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": {"$ref": "#/components/schemas/Item"}, |
|||
"type": "array", |
|||
"title": "Data", |
|||
} |
|||
} |
|||
}, |
|||
"required": True, |
|||
}, |
|||
"responses": { |
|||
"200": { |
|||
"description": "Successful Response", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"items": { |
|||
"$ref": "#/components/schemas/Item" |
|||
}, |
|||
"type": "array", |
|||
"title": "Response Handle Item List To List Filter Item List To List Filter Post", |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
"422": { |
|||
"description": "Validation Error", |
|||
"content": { |
|||
"application/json": { |
|||
"schema": { |
|||
"$ref": "#/components/schemas/HTTPValidationError" |
|||
} |
|||
} |
|||
}, |
|||
}, |
|||
}, |
|||
} |
|||
}, |
|||
}, |
|||
"components": { |
|||
"schemas": { |
|||
"HTTPValidationError": { |
|||
"properties": { |
|||
"detail": { |
|||
"items": { |
|||
"$ref": "#/components/schemas/ValidationError" |
|||
}, |
|||
"type": "array", |
|||
"title": "Detail", |
|||
} |
|||
}, |
|||
"type": "object", |
|||
"title": "HTTPValidationError", |
|||
}, |
|||
"Item": { |
|||
"properties": { |
|||
"title": {"type": "string", "title": "Title"}, |
|||
"size": {"type": "integer", "title": "Size"}, |
|||
"description": {"type": "string", "title": "Description"}, |
|||
"sub": {"$ref": "#/components/schemas/SubItem"}, |
|||
"multi": { |
|||
"items": {"$ref": "#/components/schemas/SubItem"}, |
|||
"type": "array", |
|||
"title": "Multi", |
|||
"default": [], |
|||
}, |
|||
}, |
|||
"type": "object", |
|||
"required": ["title", "size", "sub"], |
|||
"title": "Item", |
|||
}, |
|||
"SubItem": { |
|||
"properties": {"name": {"type": "string", "title": "Name"}}, |
|||
"type": "object", |
|||
"required": ["name"], |
|||
"title": "SubItem", |
|||
}, |
|||
"ValidationError": { |
|||
"properties": { |
|||
"loc": { |
|||
"items": { |
|||
"anyOf": [{"type": "string"}, {"type": "integer"}] |
|||
}, |
|||
"type": "array", |
|||
"title": "Location", |
|||
}, |
|||
"msg": {"type": "string", "title": "Message"}, |
|||
"type": {"type": "string", "title": "Error Type"}, |
|||
}, |
|||
"type": "object", |
|||
"required": ["loc", "msg", "type"], |
|||
"title": "ValidationError", |
|||
}, |
|||
} |
|||
}, |
|||
} |
|||
) |
File diff suppressed because it is too large
@@ -0,0 +1,142 @@
from typing import List

from fastapi import FastAPI

from . import modelsv1, modelsv2, modelsv2b

app = FastAPI()


@app.post("/v1-to-v2/item")
def handle_v1_item_to_v2(data: modelsv1.Item) -> modelsv2.Item:
    return modelsv2.Item(
        new_title=data.title,
        new_size=data.size,
        new_description=data.description,
        new_sub=modelsv2.SubItem(new_sub_name=data.sub.name),
        new_multi=[modelsv2.SubItem(new_sub_name=s.name) for s in data.multi],
    )


@app.post("/v2-to-v1/item")
def handle_v2_item_to_v1(data: modelsv2.Item) -> modelsv1.Item:
    return modelsv1.Item(
        title=data.new_title,
        size=data.new_size,
        description=data.new_description,
        sub=modelsv1.SubItem(name=data.new_sub.new_sub_name),
        multi=[modelsv1.SubItem(name=s.new_sub_name) for s in data.new_multi],
    )


@app.post("/v1-to-v2/item-to-list")
def handle_v1_item_to_v2_list(data: modelsv1.Item) -> List[modelsv2.Item]:
    converted = modelsv2.Item(
        new_title=data.title,
        new_size=data.size,
        new_description=data.description,
        new_sub=modelsv2.SubItem(new_sub_name=data.sub.name),
        new_multi=[modelsv2.SubItem(new_sub_name=s.name) for s in data.multi],
    )
    return [converted, converted]


@app.post("/v1-to-v2/list-to-list")
def handle_v1_list_to_v2_list(data: List[modelsv1.Item]) -> List[modelsv2.Item]:
    result = []
    for item in data:
        result.append(
            modelsv2.Item(
                new_title=item.title,
                new_size=item.size,
                new_description=item.description,
                new_sub=modelsv2.SubItem(new_sub_name=item.sub.name),
                new_multi=[modelsv2.SubItem(new_sub_name=s.name) for s in item.multi],
            )
        )
    return result


@app.post("/v1-to-v2/list-to-item")
def handle_v1_list_to_v2_item(data: List[modelsv1.Item]) -> modelsv2.Item:
    if data:
        item = data[0]
        return modelsv2.Item(
            new_title=item.title,
            new_size=item.size,
            new_description=item.description,
            new_sub=modelsv2.SubItem(new_sub_name=item.sub.name),
            new_multi=[modelsv2.SubItem(new_sub_name=s.name) for s in item.multi],
        )
    return modelsv2.Item(
        new_title="", new_size=0, new_sub=modelsv2.SubItem(new_sub_name="")
    )


@app.post("/v2-to-v1/item-to-list")
def handle_v2_item_to_v1_list(data: modelsv2.Item) -> List[modelsv1.Item]:
    converted = modelsv1.Item(
        title=data.new_title,
        size=data.new_size,
        description=data.new_description,
        sub=modelsv1.SubItem(name=data.new_sub.new_sub_name),
        multi=[modelsv1.SubItem(name=s.new_sub_name) for s in data.new_multi],
    )
    return [converted, converted]


@app.post("/v2-to-v1/list-to-list")
def handle_v2_list_to_v1_list(data: List[modelsv2.Item]) -> List[modelsv1.Item]:
    result = []
    for item in data:
        result.append(
            modelsv1.Item(
                title=item.new_title,
                size=item.new_size,
                description=item.new_description,
                sub=modelsv1.SubItem(name=item.new_sub.new_sub_name),
                multi=[modelsv1.SubItem(name=s.new_sub_name) for s in item.new_multi],
            )
        )
    return result


@app.post("/v2-to-v1/list-to-item")
def handle_v2_list_to_v1_item(data: List[modelsv2.Item]) -> modelsv1.Item:
    if data:
        item = data[0]
        return modelsv1.Item(
            title=item.new_title,
            size=item.new_size,
            description=item.new_description,
            sub=modelsv1.SubItem(name=item.new_sub.new_sub_name),
            multi=[modelsv1.SubItem(name=s.new_sub_name) for s in item.new_multi],
        )
    return modelsv1.Item(title="", size=0, sub=modelsv1.SubItem(name=""))


@app.post("/v2-to-v1/same-name")
def handle_v2_same_name_to_v1(
    item1: modelsv2.Item, item2: modelsv2b.Item
) -> modelsv1.Item:
    return modelsv1.Item(
        title=item1.new_title,
        size=item2.dup_size,
        description=item1.new_description,
        sub=modelsv1.SubItem(name=item1.new_sub.new_sub_name),
        multi=[modelsv1.SubItem(name=s.dup_sub_name) for s in item2.dup_multi],
    )


@app.post("/v2-to-v1/list-of-items-to-list-of-items")
def handle_v2_items_in_list_to_v1_item_in_list(
    data1: List[modelsv2.ItemInList], data2: List[modelsv2b.ItemInList]
) -> List[modelsv1.ItemInList]:
    item1 = data1[0]
    item2 = data2[0]
    result = [
        modelsv1.ItemInList(name1=item1.name2),
        modelsv1.ItemInList(name1=item2.dup_name2),
    ]
    return result
@@ -0,0 +1,19 @@
from typing import List, Union

from fastapi._compat.v1 import BaseModel


class SubItem(BaseModel):
    name: str


class Item(BaseModel):
    title: str
    size: int
    description: Union[str, None] = None
    sub: SubItem
    multi: List[SubItem] = []


class ItemInList(BaseModel):
    name1: str
@@ -0,0 +1,19 @@
from typing import List, Union

from pydantic import BaseModel


class SubItem(BaseModel):
    new_sub_name: str


class Item(BaseModel):
    new_title: str
    new_size: int
    new_description: Union[str, None] = None
    new_sub: SubItem
    new_multi: List[SubItem] = []


class ItemInList(BaseModel):
    name2: str
@@ -0,0 +1,19 @@
from typing import List, Union

from pydantic import BaseModel


class SubItem(BaseModel):
    dup_sub_name: str


class Item(BaseModel):
    dup_title: str
    dup_size: int
    dup_description: Union[str, None] = None
    dup_sub: SubItem
    dup_multi: List[SubItem] = []


class ItemInList(BaseModel):
    dup_name2: str
File diff suppressed because it is too large
@@ -0,0 +1,766 @@
import sys
from typing import Any, List, Union

from tests.utils import pydantic_snapshot, skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()

from fastapi import FastAPI
from fastapi._compat.v1 import BaseModel
from fastapi.testclient import TestClient
from inline_snapshot import snapshot
from pydantic import BaseModel as NewBaseModel


class SubItem(BaseModel):
    name: str


class Item(BaseModel):
    title: str
    size: int
    description: Union[str, None] = None
    sub: SubItem
    multi: List[SubItem] = []


class NewSubItem(NewBaseModel):
    new_sub_name: str


class NewItem(NewBaseModel):
    new_title: str
    new_size: int
    new_description: Union[str, None] = None
    new_sub: NewSubItem
    new_multi: List[NewSubItem] = []


app = FastAPI()


@app.post("/v1-to-v2/")
def handle_v1_item_to_v2(data: Item) -> Union[NewItem, None]:
    if data.size < 0:
        return None
    return NewItem(
        new_title=data.title,
        new_size=data.size,
        new_description=data.description,
        new_sub=NewSubItem(new_sub_name=data.sub.name),
        new_multi=[NewSubItem(new_sub_name=s.name) for s in data.multi],
    )


@app.post("/v1-to-v2/item-filter", response_model=Union[NewItem, None])
def handle_v1_item_to_v2_filter(data: Item) -> Any:
    if data.size < 0:
        return None
    result = {
        "new_title": data.title,
        "new_size": data.size,
        "new_description": data.description,
        "new_sub": {"new_sub_name": data.sub.name, "new_sub_secret": "sub_hidden"},
        "new_multi": [
            {"new_sub_name": s.name, "new_sub_secret": "sub_hidden"} for s in data.multi
        ],
        "secret": "hidden_v1_to_v2",
    }
    return result


@app.post("/v2-to-v1/item")
def handle_v2_item_to_v1(data: NewItem) -> Union[Item, None]:
    if data.new_size < 0:
        return None
    return Item(
        title=data.new_title,
        size=data.new_size,
        description=data.new_description,
        sub=SubItem(name=data.new_sub.new_sub_name),
        multi=[SubItem(name=s.new_sub_name) for s in data.new_multi],
    )


@app.post("/v2-to-v1/item-filter", response_model=Union[Item, None])
def handle_v2_item_to_v1_filter(data: NewItem) -> Any:
    if data.new_size < 0:
        return None
    result = {
        "title": data.new_title,
        "size": data.new_size,
        "description": data.new_description,
        "sub": {"name": data.new_sub.new_sub_name, "sub_secret": "sub_hidden"},
        "multi": [
            {"name": s.new_sub_name, "sub_secret": "sub_hidden"} for s in data.new_multi
        ],
        "secret": "hidden_v2_to_v1",
    }
    return result


client = TestClient(app)


def test_v1_to_v2_item_success():
    response = client.post(
        "/v1-to-v2/",
        json={
            "title": "Old Item",
            "size": 100,
            "description": "V1 description",
            "sub": {"name": "V1 Sub"},
            "multi": [{"name": "M1"}, {"name": "M2"}],
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "new_title": "Old Item",
        "new_size": 100,
        "new_description": "V1 description",
        "new_sub": {"new_sub_name": "V1 Sub"},
        "new_multi": [{"new_sub_name": "M1"}, {"new_sub_name": "M2"}],
    }


def test_v1_to_v2_item_returns_none():
    response = client.post(
        "/v1-to-v2/",
        json={"title": "Invalid Item", "size": -10, "sub": {"name": "Sub"}},
    )
    assert response.status_code == 200, response.text
    assert response.json() is None


def test_v1_to_v2_item_minimal():
    response = client.post(
        "/v1-to-v2/", json={"title": "Minimal", "size": 50, "sub": {"name": "MinSub"}}
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "new_title": "Minimal",
        "new_size": 50,
        "new_description": None,
        "new_sub": {"new_sub_name": "MinSub"},
        "new_multi": [],
    }


def test_v1_to_v2_item_filter_success():
    response = client.post(
        "/v1-to-v2/item-filter",
        json={
            "title": "Filtered Item",
            "size": 50,
            "sub": {"name": "Sub"},
            "multi": [{"name": "Multi1"}],
        },
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert result["new_title"] == "Filtered Item"
    assert result["new_size"] == 50
    assert result["new_sub"]["new_sub_name"] == "Sub"
    assert result["new_multi"][0]["new_sub_name"] == "Multi1"
    # Verify secret fields are filtered out
    assert "secret" not in result
    assert "new_sub_secret" not in result["new_sub"]
    assert "new_sub_secret" not in result["new_multi"][0]


def test_v1_to_v2_item_filter_returns_none():
    response = client.post(
        "/v1-to-v2/item-filter",
        json={"title": "Invalid", "size": -1, "sub": {"name": "Sub"}},
    )
    assert response.status_code == 200, response.text
    assert response.json() is None


def test_v2_to_v1_item_success():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "New Item",
            "new_size": 200,
            "new_description": "V2 description",
            "new_sub": {"new_sub_name": "V2 Sub"},
            "new_multi": [{"new_sub_name": "N1"}, {"new_sub_name": "N2"}],
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "New Item",
        "size": 200,
        "description": "V2 description",
        "sub": {"name": "V2 Sub"},
        "multi": [{"name": "N1"}, {"name": "N2"}],
    }


def test_v2_to_v1_item_returns_none():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "Invalid New",
            "new_size": -5,
            "new_sub": {"new_sub_name": "NewSub"},
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() is None


def test_v2_to_v1_item_minimal():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "MinimalNew",
            "new_size": 75,
            "new_sub": {"new_sub_name": "MinNewSub"},
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == {
        "title": "MinimalNew",
        "size": 75,
        "description": None,
        "sub": {"name": "MinNewSub"},
        "multi": [],
    }


def test_v2_to_v1_item_filter_success():
    response = client.post(
        "/v2-to-v1/item-filter",
        json={
            "new_title": "Filtered New",
            "new_size": 75,
            "new_sub": {"new_sub_name": "NewSub"},
            "new_multi": [],
        },
    )
    assert response.status_code == 200, response.text
    result = response.json()
    assert result["title"] == "Filtered New"
    assert result["size"] == 75
    assert result["sub"]["name"] == "NewSub"
    # Verify secret fields are filtered out
    assert "secret" not in result
    assert "sub_secret" not in result["sub"]


def test_v2_to_v1_item_filter_returns_none():
    response = client.post(
        "/v2-to-v1/item-filter",
        json={
            "new_title": "Invalid Filtered",
            "new_size": -100,
            "new_sub": {"new_sub_name": "Sub"},
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() is None


def test_v1_to_v2_validation_error():
    response = client.post("/v1-to-v2/", json={"title": "Missing fields"})
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                {
                    "loc": ["body", "size"],
                    "msg": "field required",
                    "type": "value_error.missing",
                },
                {
                    "loc": ["body", "sub"],
                    "msg": "field required",
                    "type": "value_error.missing",
                },
            ]
        }
    )


def test_v1_to_v2_nested_validation_error():
    response = client.post(
        "/v1-to-v2/",
        json={"title": "Bad sub", "size": 100, "sub": {"wrong_field": "value"}},
    )
    assert response.status_code == 422, response.text
    error_detail = response.json()["detail"]
    assert len(error_detail) == 1
    assert error_detail[0]["loc"] == ["body", "sub", "name"]


def test_v1_to_v2_type_validation_error():
    response = client.post(
        "/v1-to-v2/",
        json={"title": "Bad type", "size": "not_a_number", "sub": {"name": "Sub"}},
    )
    assert response.status_code == 422, response.text
    error_detail = response.json()["detail"]
    assert len(error_detail) == 1
    assert error_detail[0]["loc"] == ["body", "size"]


def test_v2_to_v1_validation_error():
    response = client.post("/v2-to-v1/item", json={"new_title": "Missing fields"})
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": pydantic_snapshot(
                v2=snapshot(
                    [
                        {
                            "type": "missing",
                            "loc": ["body", "new_size"],
                            "msg": "Field required",
                            "input": {"new_title": "Missing fields"},
                        },
                        {
                            "type": "missing",
                            "loc": ["body", "new_sub"],
                            "msg": "Field required",
                            "input": {"new_title": "Missing fields"},
                        },
                    ]
                ),
                v1=snapshot(
                    [
                        {
                            "loc": ["body", "new_size"],
                            "msg": "field required",
                            "type": "value_error.missing",
                        },
                        {
                            "loc": ["body", "new_sub"],
                            "msg": "field required",
                            "type": "value_error.missing",
                        },
                    ]
                ),
            )
        }
    )


def test_v2_to_v1_nested_validation_error():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "Bad sub",
            "new_size": 200,
            "new_sub": {"wrong_field": "value"},
        },
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                pydantic_snapshot(
                    v2=snapshot(
                        {
                            "type": "missing",
                            "loc": ["body", "new_sub", "new_sub_name"],
                            "msg": "Field required",
                            "input": {"wrong_field": "value"},
                        }
                    ),
                    v1=snapshot(
                        {
                            "loc": ["body", "new_sub", "new_sub_name"],
                            "msg": "field required",
                            "type": "value_error.missing",
                        }
                    ),
                )
            ]
        }
    )


def test_v2_to_v1_type_validation_error():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "Bad type",
            "new_size": "not_a_number",
            "new_sub": {"new_sub_name": "Sub"},
        },
    )
    assert response.status_code == 422, response.text
    assert response.json() == snapshot(
        {
            "detail": [
                pydantic_snapshot(
                    v2=snapshot(
                        {
                            "type": "int_parsing",
                            "loc": ["body", "new_size"],
                            "msg": "Input should be a valid integer, unable to parse string as an integer",
                            "input": "not_a_number",
                        }
                    ),
                    v1=snapshot(
                        {
                            "loc": ["body", "new_size"],
                            "msg": "value is not a valid integer",
                            "type": "type_error.integer",
                        }
                    ),
                )
            ]
        }
    )


def test_v1_to_v2_with_multi_items():
    response = client.post(
        "/v1-to-v2/",
        json={
            "title": "Complex Item",
            "size": 300,
            "description": "Item with multiple sub-items",
            "sub": {"name": "Main Sub"},
            "multi": [{"name": "Sub1"}, {"name": "Sub2"}, {"name": "Sub3"}],
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "new_title": "Complex Item",
            "new_size": 300,
            "new_description": "Item with multiple sub-items",
            "new_sub": {"new_sub_name": "Main Sub"},
            "new_multi": [
                {"new_sub_name": "Sub1"},
                {"new_sub_name": "Sub2"},
                {"new_sub_name": "Sub3"},
            ],
        }
    )


def test_v2_to_v1_with_multi_items():
    response = client.post(
        "/v2-to-v1/item",
        json={
            "new_title": "Complex New Item",
            "new_size": 400,
            "new_description": "New item with multiple sub-items",
            "new_sub": {"new_sub_name": "Main New Sub"},
            "new_multi": [{"new_sub_name": "NewSub1"}, {"new_sub_name": "NewSub2"}],
        },
    )
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "title": "Complex New Item",
            "size": 400,
            "description": "New item with multiple sub-items",
            "sub": {"name": "Main New Sub"},
            "multi": [{"name": "NewSub1"}, {"name": "NewSub2"}],
        }
    )


def test_openapi_schema():
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/v1-to-v2/": {
                    "post": {
                        "summary": "Handle V1 Item To V2",
                        "operationId": "handle_v1_item_to_v2_v1_to_v2__post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/Item"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/Item"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": pydantic_snapshot(
                                            v2=snapshot(
                                                {
                                                    "anyOf": [
                                                        {
                                                            "$ref": "#/components/schemas/NewItem"
                                                        },
                                                        {"type": "null"},
                                                    ],
                                                    "title": "Response Handle V1 Item To V2 V1 To V2 Post",
                                                }
                                            ),
                                            v1=snapshot(
                                                {"$ref": "#/components/schemas/NewItem"}
                                            ),
                                        )
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/v1-to-v2/item-filter": {
                    "post": {
                        "summary": "Handle V1 Item To V2 Filter",
                        "operationId": "handle_v1_item_to_v2_filter_v1_to_v2_item_filter_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": pydantic_snapshot(
                                        v2=snapshot(
                                            {
                                                "allOf": [
                                                    {
                                                        "$ref": "#/components/schemas/Item"
                                                    }
                                                ],
                                                "title": "Data",
                                            }
                                        ),
                                        v1=snapshot(
                                            {"$ref": "#/components/schemas/Item"}
                                        ),
                                    )
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": pydantic_snapshot(
                                            v2=snapshot(
                                                {
                                                    "anyOf": [
                                                        {
                                                            "$ref": "#/components/schemas/NewItem"
                                                        },
                                                        {"type": "null"},
                                                    ],
                                                    "title": "Response Handle V1 Item To V2 Filter V1 To V2 Item Filter Post",
                                                }
                                            ),
                                            v1=snapshot(
                                                {"$ref": "#/components/schemas/NewItem"}
                                            ),
                                        )
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/v2-to-v1/item": {
                    "post": {
                        "summary": "Handle V2 Item To V1",
                        "operationId": "handle_v2_item_to_v1_v2_to_v1_item_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": {"$ref": "#/components/schemas/NewItem"}
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
                "/v2-to-v1/item-filter": {
                    "post": {
                        "summary": "Handle V2 Item To V1 Filter",
                        "operationId": "handle_v2_item_to_v1_filter_v2_to_v1_item_filter_post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": {"$ref": "#/components/schemas/NewItem"}
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                },
            },
            "components": {
                "schemas": {
                    "HTTPValidationError": {
                        "properties": {
                            "detail": {
                                "items": {
                                    "$ref": "#/components/schemas/ValidationError"
                                },
                                "type": "array",
                                "title": "Detail",
                            }
                        },
                        "type": "object",
                        "title": "HTTPValidationError",
                    },
                    "Item": {
                        "properties": {
                            "title": {"type": "string", "title": "Title"},
                            "size": {"type": "integer", "title": "Size"},
                            "description": {"type": "string", "title": "Description"},
                            "sub": {"$ref": "#/components/schemas/SubItem"},
                            "multi": {
                                "items": {"$ref": "#/components/schemas/SubItem"},
                                "type": "array",
                                "title": "Multi",
                                "default": [],
                            },
                        },
                        "type": "object",
                        "required": ["title", "size", "sub"],
                        "title": "Item",
                    },
                    "NewItem": {
                        "properties": {
                            "new_title": {"type": "string", "title": "New Title"},
                            "new_size": {"type": "integer", "title": "New Size"},
                            "new_description": pydantic_snapshot(
                                v2=snapshot(
                                    {
                                        "anyOf": [{"type": "string"}, {"type": "null"}],
                                        "title": "New Description",
                                    }
                                ),
                                v1=snapshot(
                                    {"type": "string", "title": "New Description"}
                                ),
                            ),
                            "new_sub": {"$ref": "#/components/schemas/NewSubItem"},
                            "new_multi": {
                                "items": {"$ref": "#/components/schemas/NewSubItem"},
                                "type": "array",
|||
"title": "New Multi", |
|||
"default": [], |
|||
}, |
|||
}, |
|||
"type": "object", |
|||
"required": ["new_title", "new_size", "new_sub"], |
|||
"title": "NewItem", |
|||
}, |
|||
"NewSubItem": { |
|||
"properties": { |
|||
"new_sub_name": {"type": "string", "title": "New Sub Name"} |
|||
}, |
|||
"type": "object", |
|||
"required": ["new_sub_name"], |
|||
"title": "NewSubItem", |
|||
}, |
|||
"SubItem": { |
|||
"properties": {"name": {"type": "string", "title": "Name"}}, |
|||
"type": "object", |
|||
"required": ["name"], |
|||
"title": "SubItem", |
|||
}, |
|||
"ValidationError": { |
|||
"properties": { |
|||
"loc": { |
|||
"items": { |
|||
"anyOf": [{"type": "string"}, {"type": "integer"}] |
|||
}, |
|||
"type": "array", |
|||
"title": "Location", |
|||
}, |
|||
"msg": {"type": "string", "title": "Message"}, |
|||
"type": {"type": "string", "title": "Error Type"}, |
|||
}, |
|||
"type": "object", |
|||
"required": ["loc", "msg", "type"], |
|||
"title": "ValidationError", |
|||
}, |
|||
} |
|||
}, |
|||
} |
|||
) |
@@ -0,0 +1,37 @@
import sys
from typing import Any

import pytest
from fastapi._compat import PYDANTIC_V2

from tests.utils import skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()


if not PYDANTIC_V2:
    pytest.skip("This test is only for Pydantic v2", allow_module_level=True)

import importlib

import pytest

from ...utils import needs_py310


@pytest.fixture(
    name="mod",
    params=[
        "tutorial001_an",
        pytest.param("tutorial001_an_py310", marks=needs_py310),
    ],
)
def get_mod(request: pytest.FixtureRequest):
    mod = importlib.import_module(f"docs_src.pydantic_v1_in_v2.{request.param}")
    return mod


def test_model(mod: Any):
    item = mod.Item(name="Foo", size=3.4)
    assert item.dict() == {"name": "Foo", "description": None, "size": 3.4}
@@ -0,0 +1,140 @@
import sys

import pytest
from fastapi._compat import PYDANTIC_V2
from inline_snapshot import snapshot

from tests.utils import skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()


if not PYDANTIC_V2:
    pytest.skip("This test is only for Pydantic v2", allow_module_level=True)

import importlib

import pytest
from fastapi.testclient import TestClient

from ...utils import needs_py310


@pytest.fixture(
    name="client",
    params=[
        "tutorial002_an",
        pytest.param("tutorial002_an_py310", marks=needs_py310),
    ],
)
def get_client(request: pytest.FixtureRequest):
    mod = importlib.import_module(f"docs_src.pydantic_v1_in_v2.{request.param}")

    c = TestClient(mod.app)
    return c


def test_call(client: TestClient):
    response = client.post("/items/", json={"name": "Foo", "size": 3.4})
    assert response.status_code == 200, response.text
    assert response.json() == {
        "name": "Foo",
        "description": None,
        "size": 3.4,
    }


def test_openapi_schema(client: TestClient):
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/items/": {
                    "post": {
                        "summary": "Create Item",
                        "operationId": "create_item_items__post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "allOf": [
                                            {"$ref": "#/components/schemas/Item"}
                                        ],
                                        "title": "Item",
                                    }
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                }
            },
            "components": {
                "schemas": {
                    "HTTPValidationError": {
                        "properties": {
                            "detail": {
                                "items": {
                                    "$ref": "#/components/schemas/ValidationError"
                                },
                                "type": "array",
                                "title": "Detail",
                            }
                        },
                        "type": "object",
                        "title": "HTTPValidationError",
                    },
                    "Item": {
                        "properties": {
                            "name": {"type": "string", "title": "Name"},
                            "description": {"type": "string", "title": "Description"},
                            "size": {"type": "number", "title": "Size"},
                        },
                        "type": "object",
                        "required": ["name", "size"],
                        "title": "Item",
                    },
                    "ValidationError": {
                        "properties": {
                            "loc": {
                                "items": {
                                    "anyOf": [{"type": "string"}, {"type": "integer"}]
                                },
                                "type": "array",
                                "title": "Location",
                            },
                            "msg": {"type": "string", "title": "Message"},
                            "type": {"type": "string", "title": "Error Type"},
                        },
                        "type": "object",
                        "required": ["loc", "msg", "type"],
                        "title": "ValidationError",
                    },
                }
            },
        }
    )
@@ -0,0 +1,154 @@
import sys

import pytest
from fastapi._compat import PYDANTIC_V2
from inline_snapshot import snapshot

from tests.utils import skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()

if not PYDANTIC_V2:
    pytest.skip("This test is only for Pydantic v2", allow_module_level=True)


import importlib

from fastapi.testclient import TestClient

from ...utils import needs_py310


@pytest.fixture(
    name="client",
    params=[
        "tutorial003_an",
        pytest.param("tutorial003_an_py310", marks=needs_py310),
    ],
)
def get_client(request: pytest.FixtureRequest):
    mod = importlib.import_module(f"docs_src.pydantic_v1_in_v2.{request.param}")

    c = TestClient(mod.app)
    return c


def test_call(client: TestClient):
    response = client.post("/items/", json={"name": "Foo", "size": 3.4})
    assert response.status_code == 200, response.text
    assert response.json() == {
        "name": "Foo",
        "description": None,
        "size": 3.4,
    }


def test_openapi_schema(client: TestClient):
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/items/": {
                    "post": {
                        "summary": "Create Item",
                        "operationId": "create_item_items__post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "allOf": [
                                            {"$ref": "#/components/schemas/Item"}
                                        ],
                                        "title": "Item",
                                    }
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/ItemV2"
                                        }
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                }
            },
            "components": {
                "schemas": {
                    "HTTPValidationError": {
                        "properties": {
                            "detail": {
                                "items": {
                                    "$ref": "#/components/schemas/ValidationError"
                                },
                                "type": "array",
                                "title": "Detail",
                            }
                        },
                        "type": "object",
                        "title": "HTTPValidationError",
                    },
                    "Item": {
                        "properties": {
                            "name": {"type": "string", "title": "Name"},
                            "description": {"type": "string", "title": "Description"},
                            "size": {"type": "number", "title": "Size"},
                        },
                        "type": "object",
                        "required": ["name", "size"],
                        "title": "Item",
                    },
                    "ItemV2": {
                        "properties": {
                            "name": {"type": "string", "title": "Name"},
                            "description": {
                                "anyOf": [{"type": "string"}, {"type": "null"}],
                                "title": "Description",
                            },
                            "size": {"type": "number", "title": "Size"},
                        },
                        "type": "object",
                        "required": ["name", "size"],
                        "title": "ItemV2",
                    },
                    "ValidationError": {
                        "properties": {
                            "loc": {
                                "items": {
                                    "anyOf": [{"type": "string"}, {"type": "integer"}]
                                },
                                "type": "array",
                                "title": "Location",
                            },
                            "msg": {"type": "string", "title": "Message"},
                            "type": {"type": "string", "title": "Error Type"},
                        },
                        "type": "object",
                        "required": ["loc", "msg", "type"],
                        "title": "ValidationError",
                    },
                }
            },
        }
    )
@@ -0,0 +1,153 @@
import sys

import pytest
from fastapi._compat import PYDANTIC_V2
from inline_snapshot import snapshot

from tests.utils import skip_module_if_py_gte_314

if sys.version_info >= (3, 14):
    skip_module_if_py_gte_314()

if not PYDANTIC_V2:
    pytest.skip("This test is only for Pydantic v2", allow_module_level=True)


import importlib

from fastapi.testclient import TestClient

from ...utils import needs_py39, needs_py310


@pytest.fixture(
    name="client",
    params=[
        "tutorial004_an",
        pytest.param("tutorial004_an_py39", marks=needs_py39),
        pytest.param("tutorial004_an_py310", marks=needs_py310),
    ],
)
def get_client(request: pytest.FixtureRequest):
    mod = importlib.import_module(f"docs_src.pydantic_v1_in_v2.{request.param}")

    c = TestClient(mod.app)
    return c


def test_call(client: TestClient):
    response = client.post("/items/", json={"item": {"name": "Foo", "size": 3.4}})
    assert response.status_code == 200, response.text
    assert response.json() == {
        "name": "Foo",
        "description": None,
        "size": 3.4,
    }


def test_openapi_schema(client: TestClient):
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == snapshot(
        {
            "openapi": "3.1.0",
            "info": {"title": "FastAPI", "version": "0.1.0"},
            "paths": {
                "/items/": {
                    "post": {
                        "summary": "Create Item",
                        "operationId": "create_item_items__post",
                        "requestBody": {
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "allOf": [
                                            {
                                                "$ref": "#/components/schemas/Body_create_item_items__post"
                                            }
                                        ],
                                        "title": "Body",
                                    }
                                }
                            },
                            "required": True,
                        },
                        "responses": {
                            "200": {
                                "description": "Successful Response",
                                "content": {
                                    "application/json": {
                                        "schema": {"$ref": "#/components/schemas/Item"}
                                    }
                                },
                            },
                            "422": {
                                "description": "Validation Error",
                                "content": {
                                    "application/json": {
                                        "schema": {
                                            "$ref": "#/components/schemas/HTTPValidationError"
                                        }
                                    }
                                },
                            },
                        },
                    }
                }
            },
            "components": {
                "schemas": {
                    "Body_create_item_items__post": {
                        "properties": {
                            "item": {
                                "allOf": [{"$ref": "#/components/schemas/Item"}],
                                "title": "Item",
                            }
                        },
                        "type": "object",
                        "required": ["item"],
                        "title": "Body_create_item_items__post",
                    },
                    "HTTPValidationError": {
                        "properties": {
                            "detail": {
                                "items": {
                                    "$ref": "#/components/schemas/ValidationError"
                                },
                                "type": "array",
                                "title": "Detail",
                            }
                        },
                        "type": "object",
                        "title": "HTTPValidationError",
                    },
                    "Item": {
                        "properties": {
                            "name": {"type": "string", "title": "Name"},
                            "description": {"type": "string", "title": "Description"},
                            "size": {"type": "number", "title": "Size"},
                        },
                        "type": "object",
                        "required": ["name", "size"],
                        "title": "Item",
                    },
                    "ValidationError": {
                        "properties": {
                            "loc": {
                                "items": {
                                    "anyOf": [{"type": "string"}, {"type": "integer"}]
                                },
                                "type": "array",
                                "title": "Location",
                            },
                            "msg": {"type": "string", "title": "Message"},
                            "type": {"type": "string", "title": "Error Type"},
                        },
                        "type": "object",
                        "required": ["loc", "msg", "type"],
                        "title": "ValidationError",
                    },
                }
            },
        }
    )