* ✨ Pydantic v2 migration, initial implementation (#9500)
* ✨ Add compat layer, for Pydantic v1 and v2
* ✨ Re-export Pydantic needed internals from compat, to later patch them for v1
* ♻️ Refactor internals to use new compatibility layers and run with Pydantic v2
* 📝 Update examples to run with Pydantic v2
* ✅ Update tests to use Pydantic v2
* 🎨 [pre-commit.ci] Auto format from pre-commit.com hooks
* ✅ Temporarily disable Peewee tests, afterwards I'll enable them only for Pydantic v1
* 🐛 Fix JSON Schema generation and OpenAPI ref template
* 🐛 Fix model field creation with defaults from Pydantic v2
* 🐛 Fix body field creation, with new FieldInfo
* ✨ Use and check new ResponseValidationError for server validation errors
* ✅ Fix test_schema_extra_examples tests with ResponseValidationError
* ✅ Add dirty-equals to tests for compatibility with Pydantic v1 and v2
* ✨ Add util to regenerate errors with custom loc
* ✨ Generate validation errors with loc
* ✅ Update tests for compatibility with Pydantic v1 and v2
* ✅ Update tests for Pydantic v2 in tests/test_filter_pydantic_sub_model.py
* ✅ Refactor tests in tests/test_dependency_overrides.py for Pydantic v2, separate parameterized into independent tests to use insert_assert
* ✅ Refactor OpenAPI test for tests/test_infer_param_optionality.py for consistency, and make it compatible with Pydantic v1 and v2
* ✅ Update tests for tests/test_multi_query_errors.py for Pydantic v1 and v2
* ✅ Update tests for tests/test_multi_body_errors.py for Pydantic v1 and v2
* ✅ Update tests for tests/test_multi_body_errors.py for Pydantic v1 and v2
* 🎨 [pre-commit.ci] Auto format from pre-commit.com hooks
* ♻️ Refactor tests for tests/test_path.py to inline pytest parameters, to make it easier to make them compatible with Pydantic v2
* ✅ Refactor and update tests for tests/test_path.py for Pydantic v1 and v2
* ♻️ Refactor and update tests for tests/test_query.py with compatibility for Pydantic v1 and v2
* ✅ Fix test with optional field without default None
* ✅ Update tests for compatibility with Pydantic v2
* ✅ Update tutorial tests for Pydantic v2
* ♻️ Update OAuth2 dependencies for Pydantic v2
* ♻️ Refactor str check when checking for sequence types
* ♻️ Rename regex to pattern to keep in sync with Pydantic v2
* ♻️ Refactor _compat.py, start moving conditional imports and declarations to specifics of Pydantic v1 or v2
* ✅ Update tests for OAuth2 security optional
* ✅ Refactor tests for OAuth2 optional for Pydantic v2
* ✅ Refactor tests for OAuth2 security for compatibility with Pydantic v2
* 🐛 Fix location in compat layer for Pydantic v2 ModelField
* ✅ Refactor tests for Pydantic v2 in tests/test_tutorial/test_bigger_applications/test_main_an_py39.py
* 🐛 Add missing markers in Python 3.9 tests
* ✅ Refactor tests for bigger apps for consistency with annotated ones and with support for Pydantic v2
* 🐛 Fix jsonable_encoder with new Pydantic v2 data types and Url
* 🐛 Fix invalid JSON error for compatibility with Pydantic v2
* ✅ Update tests for behind_a_proxy for Pydantic v2
* ✅ Update tests for tests/test_tutorial/test_body/test_tutorial001_py310.py for Pydantic v2
* ✅ Update tests for tests/test_tutorial/test_body/test_tutorial001.py with Pydantic v2 and consistency with Python 3.10 tests
* ✅ Fix tests for tutorial/body_fields for Pydantic v2
* ✅ Refactor tests for tutorial/body_multiple_params with Pydantic v2
* ✅ Update tests for tutorial/body_nested_models for Pydantic v2
* ✅ Update tests for tutorial/body_updates for Pydantic v2
* ✅ Update test for tutorial/cookie_params for Pydantic v2
* ✅ Fix tests for tests/test_tutorial/test_custom_request_and_route/test_tutorial002.py for Pydantic v2
* ✅ Update tests for tutorial/dataclasses for Pydantic v2
* ✅ Update tests for tutorial/dependencies for Pydantic v2
* ✅ Update tests for tutorial/extra_data_types for Pydantic v2
* ✅ Update tests for tutorial/handling_errors for Pydantic v2
* ✅ Fix test markers for Python 3.9
* ✅ Update tests for tutorial/header_params for Pydantic v2
* ✅ Update tests for Pydantic v2 in tests/test_tutorial/test_openapi_callbacks/test_tutorial001.py
* ✅ Fix extra tests for Pydantic v2
* ✅ Refactor test for parameters, to later fix Pydantic v2
* ✅ Update tests for tutorial/query_params for Pydantic v2
* ♻️ Update examples in docs to use new pattern instead of the old regex
* ✅ Fix several tests for Pydantic v2
* ✅ Update and fix test for ResponseValidationError
* 🐛 Fix check for sequences vs scalars, include bytes as scalar
* 🐛 Fix check for complex data types, include UploadFile
* 🐛 Add list to sequence annotation types
* 🐛 Fix checks for uploads and add utils to find if an annotation is an upload (or bytes)
* ✨ Add UnionType and NoneType to compat layer
* ✅ Update tests for request_files for compatibility with Pydantic v2 and consistency with other tests
* ✅ Fix tests for request_forms for Pydantic v2
* ✅ Fix tests for request_forms_and_files for Pydantic v2
* ✅ Fix tests in tutorial/security for compatibility with Pydantic v2
* ⬆️ Upgrade required version of email_validator
* ✅ Fix tests for params repr
* ✅ Add Pydantic v2 pytest markers
* Use match_pydantic_error_url
* 🎨 [pre-commit.ci] Auto format from pre-commit.com hooks
* Use field_serializer instead of encoders in some tests
* Show Undefined as ... in repr
* Mark custom encoders test with xfail
* Update test to reflect new serialization of Decimal as str
* Use `model_validate` instead of `from_orm`
* Update JSON schema to reflect required nullable
* Add dirty-equals to pyproject.toml
* Fix locs and error creation for use with pydantic 2.0a4
* Use the type adapter for serialization. This is hacky.
* 🎨 [pre-commit.ci] Auto format from pre-commit.com hooks
* ✅ Refactor test_multi_body_errors for compatibility with Pydantic v1 and v2
* ✅ Refactor test_custom_encoder for Pydantic v1 and v2
* ✅ Set input to None for now, for compatibility with current tests
* 🐛 Fix passing serialization params to model field when handling the response
* ♻️ Refactor exceptions to not depend on Pydantic ValidationError class
* ♻️ Revert/refactor params to simplify repr
* ✅ Tweak tests for custom class encoders for Pydantic v1 and v2
* ✅ Tweak tests for jsonable_encoder for Pydantic v1 and v2
* ✅ Tweak test for compatibility with Pydantic v1 and v2
* 🐛 Fix filtering data with subclasses
* 🐛 Workaround examples in OpenAPI schema
* ✅ Add skip marker for SQL tutorial, needs to be updated either way
* ✅ Update test for broken JSON
* ✅ Fix test for broken JSON
* ✅ Update tests for timedeltas
* ✅ Fix test for plain text validation errors
* ✅ Add markers for Pydantic v1 exclusive tests (for now)
* ✅ Update test for path_params with enums for compatibility with Pydantic v1 and v2
* ✅ Update tests for extra examples in OpenAPI
* ✅ Fix tests for response_model with compatibility with Pydantic v1 and v2
* 🐛 Fix required double serialization for different types of models
* ✅ Fix tests for response model with compatibility with new Pydantic v2
* 🐛 Import Undefined from compat layer
* ✅ Fix tests for response_model for Pydantic v2
* ✅ Fix tests for schema_extra for Pydantic v2
* ✅ Add markers and update tests for Pydantic v2
* 💡 Comment out logic for double encoding that breaks other use cases
* ✅ Update errors for int parsing
* ♻️ Refactor re-enabling compatibility for Pydantic v1
* ♻️ Refactor OpenAPI utils to re-enable support for Pydantic v1
* ♻️ Refactor dependencies/utils and _compat for compatibility with Pydantic v1
* 🐛 Fix and tweak compatibility with Pydantic v1 and v2 in dependencies/utils
* ✅ Tweak tests and examples for Pydantic v1
* ♻️ Tweak call to ModelField.validate for compatibility with Pydantic v1
* ✨ Use new global override TypeAdapter from_attributes
* ✅ Update tests after updating from_attributes
* 🔧 Update pytest config to avoid collecting tests from docs, useful for editor-integrated tests
* ✅ Add test for data filtering, including inheritance and models in fields or lists of models
* ♻️ Make OpenAPI models compatible with both Pydantic v1 and v2
* ♻️ Fix compatibility for Pydantic v1 and v2 in jsonable_encoder
* ♻️ Fix compatibility in params with Pydantic v1 and v2
* ♻️ Fix compatibility when creating a FieldInfo in Pydantic v1 and v2 in utils.py
* ♻️ Fix generation of flat_models and JSON Schema definitions in _compat.py for Pydantic v1 and v2
* ♻️ Update handling of ErrorWrappers for Pydantic v1
* ♻️ Refactor checks and handling of types and sequences
* ♻️ Refactor and clean up comments with compatibility for Pydantic v1 and v2
* ♻️ Update UploadFile for compatibility with both Pydantic v1 and v2
* 🔥 Remove commented out unneeded code
* 🐛 Fix mock of get_annotation_from_field_info for Pydantic v1 and v2
* 🐛 Fix params with compatibility for Pydantic v1 and v2, with schemas and new pattern vs regex
* 🐛 Fix check if field is sequence for Pydantic v1
* ✅ Fix tests for custom_schema_fields, for compatibility with Pydantic v1 and v2
* ✅ Simplify and fix tests for jsonable_encoder with compatibility for Pydantic v1 and v2
* ✅ Fix tests for orm_mode with Pydantic v1 and compatibility with Pydantic v2
* ♻️ Refactor logic for normalizing Pydantic v1 ErrorWrappers
* ♻️ Workaround for params with examples, before defining what to deprecate in Pydantic v1 and v2 for examples with JSON Schema vs OpenAPI
* ✅ Fix tests for Pydantic v1 and v2 for response_by_alias
* ✅ Fix test for schema_extra with compatibility with Pydantic v1 and v2
* ♻️ Tweak error regeneration with loc
* ♻️ Update error handling and serialization with compatibility for Pydantic v1 and v2
* ♻️ Re-enable custom encoders for Pydantic v1
* ♻️ Update ErrorWrapper reserialization in Pydantic v1, do it outside of FastAPI ValidationExceptions
* ✅ Update test for filter_submodel, re-structure to simplify testing while keeping division of Pydantic v1 and v2
* ✅ Refactor Pydantic v1 only test that requires modifying environment variables
* 🔥 Update test for plaintext error responses, for Pydantic v1 and v2
* ⏪️ Revert changes in DB tutorial to use Pydantic v1 (the new guide will have SQLModel)
* ✅ Mark current SQL DB tutorial tests as Pydantic v1 only
* ♻️ Update datastructures for compatibility with Pydantic v1, not requiring pydantic-core
* ♻️ Update encoders.py for compatibility with Pydantic v1
* ⏪️ Revert changes to Peewee, the docs for that are gonna live in a new HowTo section, not in the main tutorials
* ♻️ Simplify response body kwargs generation
* 🔥 Clean up comments
* 🔥 Clean some tests and comments
* ✅ Refactor tests to match new Pydantic error string URLs
* ✅ Refactor tests for recursive models for Pydantic v1 and v2
* ✅ Update tests for Peewee, re-enable, Pydantic-v1-only
* ♻️ Update FastAPI params to take regex and pattern arguments
* ⏪️ Revert tutorial examples for pattern, it will be done in a subsequent PR
* ⏪️ Revert changes in schema extra examples, it will be added later in a docs-specific PR
* 💡 Add TODO comment to document str validations with pattern
* 🔥 Remove unneeded comment
* 📌 Upgrade Pydantic pin dependency
* ⬆️ Upgrade email_validator dependency
* 🐛 Tweak type annotations in _compat.py
* 🔇 Tweak mypy errors for compat, for Pydantic v1 re-imports
* 🐛 Tweak and fix type annotations
* ➕ Update requirements-test.txt, re-add dirty-equals
* 🔥 Remove unnecessary config
* 🐛 Tweak type annotations
* 🔥 Remove unnecessary type in dependencies/utils.py
* 💡 Update comment in routing.py

---------

Co-authored-by: David Montague <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

* 👷 Add CI for both Pydantic v1 and v2 (#9688)
* 👷 Test and install Pydantic v1 and v2 in CI
* 💚 Tweak CI config for Pydantic v1 and v2
* 💚 Fix Pydantic v2 specification in CI
* 🐛 Fix type annotations for compatibility with Python 3.7
* 💚 Install Pydantic v2 for lints
* 🐛 Fix type annotations for Pydantic v2
* 💚 Re-use test cache for lint
* ♻️ Refactor internals for test coverage and performance (#9691)
* ♻️ Tweak import of Annotated from typing_extensions, they are installed anyway
* ♻️ Refactor _compat to define functions for Pydantic v1 or v2 once instead of checking inside
* ✅ Add test for UploadFile for Pydantic v2
* ♻️ Refactor types and remove logic for impossible cases
* ✅ Add missing tests from test refactor for path params
* ✅ Add tests for new decimal encoder
* 💡 Add TODO comment for decimals in encoders
* 🔥 Remove unneeded dummy function
* 🔥 Remove section of code in field_annotation_is_scalar covered by sub-call to field_annotation_is_complex
* ♻️ Refactor and tweak variables and types in _compat
* ✅ Add tests for corner cases and compat with Pydantic v1 and v2
* ♻️ Refactor type annotations
* 🔖 Release version 0.100.0-beta1
* ♻️ Refactor parts that use optional requirements to make them compatible with installations without them (#9707)
* ♻️ Refactor parts that use optional requirements to make them compatible with installations without them
* ♻️ Update JSON Schema for email field without email-validator installed
* 🐛 Fix support for Pydantic v2.0, small changes in their final release (#9771)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Sebastián Ramírez <[email protected]>

* 🔖 Release version 0.100.0-beta2
* ✨ OpenAPI 3.1.0 with Pydantic v2, merge `master` (#9773)
* ➕ Add dirty-equals as a testing dependency (#9778) — it seems it got lost at some point
* 🔀 Merge master, fix valid JSON Schema accepting bools (#9782)
* ⏪️ Revert usage of custom logic for TypeAdapter JSON Schema, solved on the Pydantic side (#9787)
* ♻️ Deprecate parameter `regex`, use `pattern` instead (#9786)
* 📝 Update docs to deprecate regex, recommend pattern
* ♻️ Update examples to use new pattern instead of regex
* 📝 Add new example with deprecated regex
* ♻️ Add deprecation notes and warnings for regex
* ✅ Add tests for regex deprecation
* ✅ Update tests for compatibility with Pydantic v1
* ✨ Update docs to use Pydantic v2 settings and add note and example about v1 (#9788)
* ➕ Add pydantic-settings to all extras
* 📝 Update docs for Pydantic settings
* 📝 Update Settings source examples to use Pydantic v2, and add a Pydantic v1 version
* ✅ Add tests for settings with Pydantic v1 and v2
* 🔥 Remove solved TODO comment
* ♻️ Update conditional OpenAPI to use new Pydantic v2 settings
* ✅ Update tests to import Annotated from typing_extensions for Python < 3.9 (#9795)
* ➕ Add pydantic-extra-types to fastapi[extra]
* ➕ temp: Install Pydantic from source to test JSON Schema metadata fixes (#9777)
* ➕ Install Pydantic from source, from branch for JSON Schema with metadata
* ➕ Update dependencies, install Pydantic main
* ➕ Fix dependency URL for Pydantic from source
* ➕ Add pydantic-settings for test requirements
* 💡 Add TODO comments to re-enable Pydantic main (not from source) (#9796)
* ✨ Add new Pydantic Field param options to Query, Cookie, Body, etc. (#9797)
* 📝 Add docs for Pydantic v2 for `docs/en/docs/advanced/path-operation-advanced-configuration.md` (#9798)
* 📝 Update docs in examples for settings with Pydantic v2 (#9799)
* 📝 Update JSON Schema `examples` docs with Pydantic v2 (#9800)
* ♻️ Use new Pydantic v2 JSON Schema generator (#9813)

Co-authored-by: David Montague <[email protected]>

* ♻️ Tweak type annotations and Pydantic version range (#9801)
* 📌 Re-enable GA Pydantic, for v2, require minimum 2.0.2 (#9814)
* 🔖 Release version 0.100.0-beta3
* 🔥 Remove duplicate type declaration from merge conflicts (#9832)
* 👷‍♂️ Run tests with Pydantic v2 GA (#9830)
* 📝 Add notes to docs expecting Pydantic v2 and future updates (#9833)
* 📝 Update index with new extras
* 📝 Update release notes

---------

Co-authored-by: David Montague <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Pastukhov Nikita <[email protected]>
274 changed files with 16751 additions and 4601 deletions
@@ -0,0 +1,34 @@
from typing import List

import yaml
from fastapi import FastAPI, HTTPException, Request
from pydantic import BaseModel, ValidationError

app = FastAPI()


class Item(BaseModel):
    name: str
    tags: List[str]


@app.post(
    "/items/",
    openapi_extra={
        "requestBody": {
            "content": {"application/x-yaml": {"schema": Item.schema()}},
            "required": True,
        },
    },
)
async def create_item(request: Request):
    raw_body = await request.body()
    try:
        data = yaml.safe_load(raw_body)
    except yaml.YAMLError:
        raise HTTPException(status_code=422, detail="Invalid YAML")
    try:
        item = Item.parse_obj(data)
    except ValidationError as e:
        raise HTTPException(status_code=422, detail=e.errors())
    return item
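The handler above still uses the Pydantic v1 methods `Item.schema()` and `Item.parse_obj()`. Under Pydantic v2, which this PR targets, those calls are renamed to `model_json_schema()` and `model_validate()`. A minimal sketch of the renamed calls, assuming Pydantic v2 is installed (YAML handling omitted to keep it self-contained):

```python
from typing import List

from pydantic import BaseModel, ValidationError


class Item(BaseModel):
    name: str
    tags: List[str]


# Pydantic v2 renames: .schema() -> .model_json_schema(), .parse_obj() -> .model_validate()
schema = Item.model_json_schema()
item = Item.model_validate({"name": "Foo", "tags": ["a", "b"]})

# Validation errors still surface as ValidationError, with .errors() dicts
try:
    Item.model_validate({"name": "Foo"})  # missing "tags"
    errors = []
except ValidationError as e:
    errors = e.errors()
```

The v1 names keep working in v2 with deprecation warnings, which is what lets the compat layer support both.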
@@ -0,0 +1,17 @@
from typing import Annotated

from fastapi import FastAPI, Query

app = FastAPI()


@app.get("/items/")
async def read_items(
    q: Annotated[
        str | None, Query(min_length=3, max_length=50, regex="^fixedquery$")
    ] = None
):
    results = {"items": [{"item_id": "Foo"}, {"item_id": "Bar"}]}
    if q:
        results.update({"q": q})
    return results
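This example uses the `regex` keyword, which the PR deprecates in favor of `pattern` (#9786) to match Pydantic v2 naming. A pydantic-level sketch of the renamed constraint, assuming Pydantic v2 is installed; FastAPI's `Query`/`Path`/etc. gained the same `pattern` parameter:

```python
from typing import Annotated

from pydantic import BaseModel, Field, ValidationError


class Params(BaseModel):
    # Pydantic v2 uses `pattern` where v1 used `regex`
    q: Annotated[
        str, Field(min_length=3, max_length=50, pattern="^fixedquery$")
    ] = "fixedquery"


ok = Params(q="fixedquery")

try:
    Params(q="nonmatching")
    rejected = False
except ValidationError:
    rejected = True
```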
@@ -0,0 +1,31 @@
from typing import Union

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    description: Union[str, None] = None
    price: float
    tax: Union[float, None] = None

    class Config:
        schema_extra = {
            "examples": [
                {
                    "name": "Foo",
                    "description": "A very nice Item",
                    "price": 35.4,
                    "tax": 3.2,
                }
            ]
        }


@app.put("/items/{item_id}")
async def update_item(item_id: int, item: Item):
    results = {"item_id": item_id, "item": item}
    return results
@@ -0,0 +1,29 @@
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    description: str | None = None
    price: float
    tax: float | None = None

    class Config:
        schema_extra = {
            "examples": [
                {
                    "name": "Foo",
                    "description": "A very nice Item",
                    "price": 35.4,
                    "tax": 3.2,
                }
            ]
        }


@app.put("/items/{item_id}")
async def update_item(item_id: int, item: Item):
    results = {"item_id": item_id, "item": item}
    return results
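The two examples above use the Pydantic v1 `class Config` / `schema_extra` style. In Pydantic v2 the equivalent is `model_config = ConfigDict(json_schema_extra=...)`, and the extra keys are merged into the generated JSON Schema. A sketch, assuming Pydantic v2 is installed:

```python
from typing import Union

from pydantic import BaseModel, ConfigDict


class Item(BaseModel):
    # Pydantic v2 replacement for `class Config: schema_extra = {...}`
    model_config = ConfigDict(
        json_schema_extra={
            "examples": [
                {
                    "name": "Foo",
                    "description": "A very nice Item",
                    "price": 35.4,
                    "tax": 3.2,
                }
            ]
        }
    )

    name: str
    description: Union[str, None] = None
    price: float
    tax: Union[float, None] = None


schema = Item.model_json_schema()
```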
@@ -0,0 +1,10 @@
from pydantic import BaseSettings


class Settings(BaseSettings):
    app_name: str = "Awesome API"
    admin_email: str
    items_per_user: int = 50

    class Config:
        env_file = ".env"
@@ -0,0 +1,21 @@
from fastapi import FastAPI
from pydantic import BaseSettings


class Settings(BaseSettings):
    app_name: str = "Awesome API"
    admin_email: str
    items_per_user: int = 50


settings = Settings()
app = FastAPI()


@app.get("/info")
async def info():
    return {
        "app_name": settings.app_name,
        "admin_email": settings.admin_email,
        "items_per_user": settings.items_per_user,
    }
@ -0,0 +1,616 @@ |
|||||
|
from collections import deque |
||||
|
from copy import copy |
||||
|
from dataclasses import dataclass, is_dataclass |
||||
|
from enum import Enum |
||||
|
from typing import ( |
||||
|
Any, |
||||
|
Callable, |
||||
|
Deque, |
||||
|
Dict, |
||||
|
FrozenSet, |
||||
|
List, |
||||
|
Mapping, |
||||
|
Sequence, |
||||
|
Set, |
||||
|
Tuple, |
||||
|
Type, |
||||
|
Union, |
||||
|
) |
||||
|
|
||||
|
from fastapi.exceptions import RequestErrorModel |
||||
|
from fastapi.types import IncEx, ModelNameMap, UnionType |
||||
|
from pydantic import BaseModel, create_model |
||||
|
from pydantic.version import VERSION as PYDANTIC_VERSION |
||||
|
from starlette.datastructures import UploadFile |
||||
|
from typing_extensions import Annotated, Literal, get_args, get_origin |
||||
|
|
||||
|
PYDANTIC_V2 = PYDANTIC_VERSION.startswith("2.") |
||||
|
|
||||
|
|
||||
|
sequence_annotation_to_type = { |
||||
|
Sequence: list, |
||||
|
List: list, |
||||
|
list: list, |
||||
|
Tuple: tuple, |
||||
|
tuple: tuple, |
||||
|
Set: set, |
||||
|
set: set, |
||||
|
FrozenSet: frozenset, |
||||
|
frozenset: frozenset, |
||||
|
Deque: deque, |
||||
|
deque: deque, |
||||
|
} |
||||
|
|
||||
|
sequence_types = tuple(sequence_annotation_to_type.keys()) |
||||
|
|
||||
|
if PYDANTIC_V2: |
||||
|
from pydantic import PydanticSchemaGenerationError as PydanticSchemaGenerationError |
||||
|
from pydantic import TypeAdapter |
||||
|
from pydantic import ValidationError as ValidationError |
||||
|
from pydantic._internal._schema_generation_shared import ( # type: ignore[attr-defined] |
||||
|
GetJsonSchemaHandler as GetJsonSchemaHandler, |
||||
|
) |
||||
|
from pydantic._internal._typing_extra import eval_type_lenient |
||||
|
from pydantic._internal._utils import lenient_issubclass as lenient_issubclass |
||||
|
from pydantic.fields import FieldInfo |
||||
|
from pydantic.json_schema import GenerateJsonSchema as GenerateJsonSchema |
||||
|
from pydantic.json_schema import JsonSchemaValue as JsonSchemaValue |
||||
|
from pydantic_core import CoreSchema as CoreSchema |
||||
|
from pydantic_core import MultiHostUrl as MultiHostUrl |
||||
|
from pydantic_core import PydanticUndefined, PydanticUndefinedType |
||||
|
from pydantic_core import Url as Url |
||||
|
from pydantic_core.core_schema import ( |
||||
|
general_plain_validator_function as general_plain_validator_function, |
||||
|
) |
||||
|
|
||||
|
Required = PydanticUndefined |
||||
|
Undefined = PydanticUndefined |
||||
|
UndefinedType = PydanticUndefinedType |
||||
|
evaluate_forwardref = eval_type_lenient |
||||
|
Validator = Any |
||||
|
|
||||
|
class BaseConfig: |
||||
|
pass |
||||
|
|
||||
|
class ErrorWrapper(Exception): |
||||
|
pass |
||||
|
|
||||
|
@dataclass |
||||
|
class ModelField: |
||||
|
field_info: FieldInfo |
||||
|
name: str |
||||
|
mode: Literal["validation", "serialization"] = "validation" |
||||
|
|
||||
|
@property |
||||
|
def alias(self) -> str: |
||||
|
a = self.field_info.alias |
||||
|
return a if a is not None else self.name |
||||
|
|
||||
|
@property |
||||
|
def required(self) -> bool: |
||||
|
return self.field_info.is_required() |
||||
|
|
||||
|
@property |
||||
|
def default(self) -> Any: |
||||
|
return self.get_default() |
||||
|
|
||||
|
@property |
||||
|
def type_(self) -> Any: |
||||
|
return self.field_info.annotation |
||||
|
|
||||
|
def __post_init__(self) -> None: |
||||
|
self._type_adapter: TypeAdapter[Any] = TypeAdapter( |
||||
|
Annotated[self.field_info.annotation, self.field_info] |
||||
|
) |
||||
|
|
||||
|
def get_default(self) -> Any: |
||||
|
if self.field_info.is_required(): |
||||
|
return Undefined |
||||
|
return self.field_info.get_default(call_default_factory=True) |
||||
|
|
||||
|
def validate( |
||||
|
self, |
||||
|
value: Any, |
||||
|
values: Dict[str, Any] = {}, # noqa: B006 |
||||
|
*, |
||||
|
loc: Tuple[Union[int, str], ...] = (), |
||||
|
) -> Tuple[Any, Union[List[Dict[str, Any]], None]]: |
||||
|
try: |
||||
|
return ( |
||||
|
self._type_adapter.validate_python(value, from_attributes=True), |
||||
|
None, |
||||
|
) |
||||
|
except ValidationError as exc: |
||||
|
return None, _regenerate_error_with_loc( |
||||
|
errors=exc.errors(), loc_prefix=loc |
||||
|
) |
||||
|
|
||||
|
def serialize( |
||||
|
self, |
||||
|
value: Any, |
||||
|
*, |
||||
|
mode: Literal["json", "python"] = "json", |
||||
|
include: Union[IncEx, None] = None, |
||||
|
exclude: Union[IncEx, None] = None, |
||||
|
by_alias: bool = True, |
||||
|
exclude_unset: bool = False, |
||||
|
exclude_defaults: bool = False, |
||||
|
exclude_none: bool = False, |
||||
|
) -> Any: |
||||
|
# What calls this code passes a value that already called |
||||
|
# self._type_adapter.validate_python(value) |
||||
|
return self._type_adapter.dump_python( |
||||
|
value, |
||||
|
mode=mode, |
||||
|
include=include, |
||||
|
exclude=exclude, |
||||
|
by_alias=by_alias, |
||||
|
exclude_unset=exclude_unset, |
||||
|
exclude_defaults=exclude_defaults, |
||||
|
exclude_none=exclude_none, |
||||
|
) |
||||
|
|
||||
|
def __hash__(self) -> int: |
||||
|
# Each ModelField is unique for our purposes, to allow making a dict from |
||||
|
# ModelField to its JSON Schema. |
||||
|
return id(self) |
||||
|
|
||||
|
def get_annotation_from_field_info( |
||||
|
annotation: Any, field_info: FieldInfo, field_name: str |
||||
|
) -> Any: |
||||
|
return annotation |
||||
|
|
||||
|
def _normalize_errors(errors: Sequence[Any]) -> List[Dict[str, Any]]: |
||||
|
return errors # type: ignore[return-value] |
||||
|
|
||||
|
def _model_rebuild(model: Type[BaseModel]) -> None: |
||||
|
model.model_rebuild() |
||||
|
|
||||
|
def _model_dump( |
||||
|
model: BaseModel, mode: Literal["json", "python"] = "json", **kwargs: Any |
||||
|
) -> Any: |
||||
|
return model.model_dump(mode=mode, **kwargs) |
||||
|
|
||||
|
def _get_model_config(model: BaseModel) -> Any: |
||||
|
return model.model_config |
||||
|
|
||||
|
def get_schema_from_model_field( |
||||
|
*, |
||||
|
field: ModelField, |
||||
|
schema_generator: GenerateJsonSchema, |
||||
|
model_name_map: ModelNameMap, |
||||
|
field_mapping: Dict[ |
||||
|
Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue |
||||
|
], |
||||
|
) -> Dict[str, Any]: |
||||
|
# This expects that GenerateJsonSchema was already used to generate the definitions |
||||
|
json_schema = field_mapping[(field, field.mode)] |
||||
|
if "$ref" not in json_schema: |
||||
|
# TODO remove when deprecating Pydantic v1 |
||||
|
# Ref: https://github.com/pydantic/pydantic/blob/d61792cc42c80b13b23e3ffa74bc37ec7c77f7d1/pydantic/schema.py#L207 |
||||
|
json_schema[ |
||||
|
"title" |
||||
|
] = field.field_info.title or field.alias.title().replace("_", " ") |
||||
|
return json_schema |
||||
|
|
||||
|
def get_compat_model_name_map(fields: List[ModelField]) -> ModelNameMap: |
||||
|
return {} |
||||
|
|
||||
|
def get_definitions( |
||||
|
*, |
||||
|
fields: List[ModelField], |
||||
|
schema_generator: GenerateJsonSchema, |
||||
|
model_name_map: ModelNameMap, |
||||
|
) -> Tuple[ |
||||
|
Dict[ |
||||
|
Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue |
||||
|
], |
||||
|
Dict[str, Dict[str, Any]], |
||||
|
]: |
||||
|
inputs = [ |
||||
|
(field, field.mode, field._type_adapter.core_schema) for field in fields |
||||
|
] |
||||
|
field_mapping, definitions = schema_generator.generate_definitions( |
||||
|
inputs=inputs |
||||
|
) |
||||
|
return field_mapping, definitions # type: ignore[return-value] |
||||
|
|
||||
|
def is_scalar_field(field: ModelField) -> bool: |
||||
|
from fastapi import params |
||||
|
|
||||
|
return field_annotation_is_scalar( |
||||
|
field.field_info.annotation |
||||
|
) and not isinstance(field.field_info, params.Body) |
||||
|
|
||||
|
def is_sequence_field(field: ModelField) -> bool: |
||||
|
return field_annotation_is_sequence(field.field_info.annotation) |
||||
|
|
||||
|
def is_scalar_sequence_field(field: ModelField) -> bool: |
||||
|
return field_annotation_is_scalar_sequence(field.field_info.annotation) |
||||
|
|
||||
|
def is_bytes_field(field: ModelField) -> bool: |
||||
|
return is_bytes_or_nonable_bytes_annotation(field.type_) |
||||
|
|
||||
|
def is_bytes_sequence_field(field: ModelField) -> bool: |
||||
|
return is_bytes_sequence_annotation(field.type_) |
||||
|
|
||||
|
def copy_field_info(*, field_info: FieldInfo, annotation: Any) -> FieldInfo: |
||||
|
return type(field_info).from_annotation(annotation) |
||||
|
|
||||
|
def serialize_sequence_value(*, field: ModelField, value: Any) -> Sequence[Any]: |
||||
|
origin_type = ( |
||||
|
get_origin(field.field_info.annotation) or field.field_info.annotation |
||||
|
) |
||||
|
assert issubclass(origin_type, sequence_types) # type: ignore[arg-type] |
||||
|
return sequence_annotation_to_type[origin_type](value) # type: ignore[no-any-return] |
||||
|
|
||||
|
def get_missing_field_error(loc: Tuple[str, ...]) -> Dict[str, Any]: |
||||
|
error = ValidationError.from_exception_data( |
||||
|
"Field required", [{"type": "missing", "loc": loc, "input": {}}] |
||||
|
).errors()[0] |
||||
|
error["input"] = None |
||||
|
return error # type: ignore[return-value] |
||||
|
|
||||
|
def create_body_model( |
||||
|
*, fields: Sequence[ModelField], model_name: str |
||||
|
) -> Type[BaseModel]: |
||||
|
field_params = {f.name: (f.field_info.annotation, f.field_info) for f in fields} |
||||
|
BodyModel: Type[BaseModel] = create_model(model_name, **field_params) # type: ignore[call-overload] |
||||
|
return BodyModel |
||||
|
|
||||
|
else: |
||||
|
from fastapi.openapi.constants import REF_PREFIX as REF_PREFIX |
||||
|
from pydantic import AnyUrl as Url # noqa: F401 |
||||
|
from pydantic import ( # type: ignore[assignment] |
||||
|
BaseConfig as BaseConfig, # noqa: F401 |
||||
|
) |
||||
|
from pydantic import ValidationError as ValidationError # noqa: F401 |
||||
|
from pydantic.class_validators import ( # type: ignore[no-redef] |
||||
|
Validator as Validator, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.error_wrappers import ( # type: ignore[no-redef] |
||||
|
ErrorWrapper as ErrorWrapper, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.errors import MissingError |
||||
|
from pydantic.fields import ( # type: ignore[attr-defined] |
||||
|
SHAPE_FROZENSET, |
||||
|
SHAPE_LIST, |
||||
|
SHAPE_SEQUENCE, |
||||
|
SHAPE_SET, |
||||
|
SHAPE_SINGLETON, |
||||
|
SHAPE_TUPLE, |
||||
|
SHAPE_TUPLE_ELLIPSIS, |
||||
|
) |
||||
|
from pydantic.fields import FieldInfo as FieldInfo |
||||
|
from pydantic.fields import ( # type: ignore[no-redef,attr-defined] |
||||
|
ModelField as ModelField, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.fields import ( # type: ignore[no-redef,attr-defined] |
||||
|
Required as Required, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.fields import ( # type: ignore[no-redef,attr-defined] |
||||
|
Undefined as Undefined, |
||||
|
) |
||||
|
from pydantic.fields import ( # type: ignore[no-redef, attr-defined] |
||||
|
UndefinedType as UndefinedType, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.networks import ( # type: ignore[no-redef] |
||||
|
MultiHostDsn as MultiHostUrl, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.schema import ( |
||||
|
field_schema, |
||||
|
get_flat_models_from_fields, |
||||
|
get_model_name_map, |
||||
|
model_process_schema, |
||||
|
) |
||||
|
from pydantic.schema import ( # type: ignore[no-redef] # noqa: F401 |
||||
|
get_annotation_from_field_info as get_annotation_from_field_info, |
||||
|
) |
||||
|
from pydantic.typing import ( # type: ignore[no-redef] |
||||
|
evaluate_forwardref as evaluate_forwardref, # noqa: F401 |
||||
|
) |
||||
|
from pydantic.utils import ( # type: ignore[no-redef] |
||||
|
lenient_issubclass as lenient_issubclass, # noqa: F401 |
||||
|
) |
||||
|
|

GetJsonSchemaHandler = Any  # type: ignore[assignment,misc]
JsonSchemaValue = Dict[str, Any]  # type: ignore[misc]
CoreSchema = Any  # type: ignore[assignment,misc]

sequence_shapes = {
    SHAPE_LIST,
    SHAPE_SET,
    SHAPE_FROZENSET,
    SHAPE_TUPLE,
    SHAPE_SEQUENCE,
    SHAPE_TUPLE_ELLIPSIS,
}
sequence_shape_to_type = {
    SHAPE_LIST: list,
    SHAPE_SET: set,
    SHAPE_TUPLE: tuple,
    SHAPE_SEQUENCE: list,
    SHAPE_TUPLE_ELLIPSIS: list,
}
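In Pydantic v1, `field.shape` encodes which container the annotation declared, and `sequence_shape_to_type` maps that shape back to a concrete constructor so a validated sequence can be re-wrapped (this is what `serialize_sequence_value` below relies on). A minimal standalone sketch — the integer `SHAPE_*` values here are illustrative stand-ins, not the real `pydantic.fields` constants:

```python
from typing import Any, Sequence

# Illustrative stand-ins for Pydantic v1's SHAPE_* constants (assumption:
# the real values are opaque ints defined in pydantic.fields).
SHAPE_LIST, SHAPE_SET, SHAPE_TUPLE = 2, 4, 5

sequence_shape_to_type = {SHAPE_LIST: list, SHAPE_SET: set, SHAPE_TUPLE: tuple}


def serialize_sequence(shape: int, value: Sequence[Any]) -> Any:
    # Re-wrap the parsed value in the container the annotation declared.
    return sequence_shape_to_type[shape](value)


print(serialize_sequence(SHAPE_SET, [1, 1, 2]))  # {1, 2}
print(serialize_sequence(SHAPE_TUPLE, [1, 2]))   # (1, 2)
```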


@dataclass
class GenerateJsonSchema:  # type: ignore[no-redef]
    ref_template: str


class PydanticSchemaGenerationError(Exception):  # type: ignore[no-redef]
    pass


def general_plain_validator_function(  # type: ignore[misc]
    function: Callable[..., Any],
    *,
    ref: Union[str, None] = None,
    metadata: Any = None,
    serialization: Any = None,
) -> Any:
    return {}


def get_model_definitions(
    *,
    flat_models: Set[Union[Type[BaseModel], Type[Enum]]],
    model_name_map: Dict[Union[Type[BaseModel], Type[Enum]], str],
) -> Dict[str, Any]:
    definitions: Dict[str, Dict[str, Any]] = {}
    for model in flat_models:
        m_schema, m_definitions, m_nested_models = model_process_schema(
            model, model_name_map=model_name_map, ref_prefix=REF_PREFIX
        )
        definitions.update(m_definitions)
        model_name = model_name_map[model]
        if "description" in m_schema:
            m_schema["description"] = m_schema["description"].split("\f")[0]
        definitions[model_name] = m_schema
    return definitions


def is_pv1_scalar_field(field: ModelField) -> bool:
    from fastapi import params

    field_info = field.field_info
    if not (
        field.shape == SHAPE_SINGLETON  # type: ignore[attr-defined]
        and not lenient_issubclass(field.type_, BaseModel)
        and not lenient_issubclass(field.type_, dict)
        and not field_annotation_is_sequence(field.type_)
        and not is_dataclass(field.type_)
        and not isinstance(field_info, params.Body)
    ):
        return False
    if field.sub_fields:  # type: ignore[attr-defined]
        if not all(
            is_pv1_scalar_field(f)
            for f in field.sub_fields  # type: ignore[attr-defined]
        ):
            return False
    return True


def is_pv1_scalar_sequence_field(field: ModelField) -> bool:
    if (field.shape in sequence_shapes) and not lenient_issubclass(  # type: ignore[attr-defined]
        field.type_, BaseModel
    ):
        if field.sub_fields is not None:  # type: ignore[attr-defined]
            for sub_field in field.sub_fields:  # type: ignore[attr-defined]
                if not is_pv1_scalar_field(sub_field):
                    return False
        return True
    if _annotation_is_sequence(field.type_):
        return True
    return False


def _normalize_errors(errors: Sequence[Any]) -> List[Dict[str, Any]]:
    use_errors: List[Any] = []
    for error in errors:
        if isinstance(error, ErrorWrapper):
            new_errors = ValidationError(  # type: ignore[call-arg]
                errors=[error], model=RequestErrorModel
            ).errors()
            use_errors.extend(new_errors)
        elif isinstance(error, list):
            use_errors.extend(_normalize_errors(error))
        else:
            use_errors.append(error)
    return use_errors
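`_normalize_errors` flattens arbitrarily nested error lists (and unwraps `ErrorWrapper` instances by passing them through a throwaway `ValidationError`). The recursive flattening part can be sketched standalone, under the assumption that leaf errors are already plain dicts:

```python
from typing import Any, Dict, List, Sequence


def normalize_errors(errors: Sequence[Any]) -> List[Dict[str, Any]]:
    use_errors: List[Any] = []
    for error in errors:
        if isinstance(error, list):
            # Recurse into nested lists and splice the results in flat.
            use_errors.extend(normalize_errors(error))
        else:
            use_errors.append(error)
    return use_errors


print(normalize_errors([{"a": 1}, [{"b": 2}, [{"c": 3}]]]))
# [{'a': 1}, {'b': 2}, {'c': 3}]
```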


def _model_rebuild(model: Type[BaseModel]) -> None:
    model.update_forward_refs()


def _model_dump(
    model: BaseModel, mode: Literal["json", "python"] = "json", **kwargs: Any
) -> Any:
    return model.dict(**kwargs)


def _get_model_config(model: BaseModel) -> Any:
    return model.__config__  # type: ignore[attr-defined]


def get_schema_from_model_field(
    *,
    field: ModelField,
    schema_generator: GenerateJsonSchema,
    model_name_map: ModelNameMap,
    field_mapping: Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ],
) -> Dict[str, Any]:
    # This expects that GenerateJsonSchema was already used to generate the definitions
    return field_schema(  # type: ignore[no-any-return]
        field, model_name_map=model_name_map, ref_prefix=REF_PREFIX
    )[0]


def get_compat_model_name_map(fields: List[ModelField]) -> ModelNameMap:
    models = get_flat_models_from_fields(fields, known_models=set())
    return get_model_name_map(models)  # type: ignore[no-any-return]


def get_definitions(
    *,
    fields: List[ModelField],
    schema_generator: GenerateJsonSchema,
    model_name_map: ModelNameMap,
) -> Tuple[
    Dict[
        Tuple[ModelField, Literal["validation", "serialization"]], JsonSchemaValue
    ],
    Dict[str, Dict[str, Any]],
]:
    models = get_flat_models_from_fields(fields, known_models=set())
    return {}, get_model_definitions(
        flat_models=models, model_name_map=model_name_map
    )


def is_scalar_field(field: ModelField) -> bool:
    return is_pv1_scalar_field(field)


def is_sequence_field(field: ModelField) -> bool:
    return field.shape in sequence_shapes or _annotation_is_sequence(field.type_)  # type: ignore[attr-defined]


def is_scalar_sequence_field(field: ModelField) -> bool:
    return is_pv1_scalar_sequence_field(field)


def is_bytes_field(field: ModelField) -> bool:
    return lenient_issubclass(field.type_, bytes)


def is_bytes_sequence_field(field: ModelField) -> bool:
    return field.shape in sequence_shapes and lenient_issubclass(field.type_, bytes)  # type: ignore[attr-defined]


def copy_field_info(*, field_info: FieldInfo, annotation: Any) -> FieldInfo:
    return copy(field_info)


def serialize_sequence_value(*, field: ModelField, value: Any) -> Sequence[Any]:
    return sequence_shape_to_type[field.shape](value)  # type: ignore[no-any-return,attr-defined]


def get_missing_field_error(loc: Tuple[str, ...]) -> Dict[str, Any]:
    missing_field_error = ErrorWrapper(MissingError(), loc=loc)  # type: ignore[call-arg]
    new_error = ValidationError([missing_field_error], RequestErrorModel)
    return new_error.errors()[0]  # type: ignore[return-value]


def create_body_model(
    *, fields: Sequence[ModelField], model_name: str
) -> Type[BaseModel]:
    BodyModel = create_model(model_name)
    for f in fields:
        BodyModel.__fields__[f.name] = f  # type: ignore[index]
    return BodyModel


def _regenerate_error_with_loc(
    *, errors: Sequence[Any], loc_prefix: Tuple[Union[str, int], ...]
) -> List[Dict[str, Any]]:
    updated_loc_errors: List[Any] = [
        {**err, "loc": loc_prefix + err.get("loc", ())}
        for err in _normalize_errors(errors)
    ]

    return updated_loc_errors
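The loc-prefixing above lets a caller re-root errors produced for a sub-field (for example, one item of a request-body list) under the full request path. A self-contained sketch, skipping the `_normalize_errors` pass and assuming plain dict errors:

```python
from typing import Any, Dict, List, Sequence, Tuple, Union


def regenerate_error_with_loc(
    *, errors: Sequence[Dict[str, Any]], loc_prefix: Tuple[Union[str, int], ...]
) -> List[Dict[str, Any]]:
    # Prepend the prefix to each error's existing loc tuple.
    return [
        {**err, "loc": loc_prefix + tuple(err.get("loc", ()))} for err in errors
    ]


errs = [{"type": "missing", "loc": ("query",), "msg": "Field required"}]
print(regenerate_error_with_loc(errors=errs, loc_prefix=("body", 0)))
# [{'type': 'missing', 'loc': ('body', 0, 'query'), 'msg': 'Field required'}]
```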


def _annotation_is_sequence(annotation: Union[Type[Any], None]) -> bool:
    if lenient_issubclass(annotation, (str, bytes)):
        return False
    return lenient_issubclass(annotation, sequence_types)


def field_annotation_is_sequence(annotation: Union[Type[Any], None]) -> bool:
    return _annotation_is_sequence(annotation) or _annotation_is_sequence(
        get_origin(annotation)
    )
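`field_annotation_is_sequence` accepts both bare classes (`list`) and parameterized generics (`List[int]`) by also testing `get_origin()`. A minimal re-implementation for illustration — here `sequence_types` is an assumption mirroring the tuple of sequence classes used in `fastapi._compat`, and plain `issubclass` with a type guard stands in for `lenient_issubclass`:

```python
from typing import List, get_origin

sequence_types = (list, set, frozenset, tuple)


def annotation_is_sequence(annotation) -> bool:
    # Guard non-classes, reject str/bytes (sequences, but treated as scalars).
    if not isinstance(annotation, type):
        return False
    if issubclass(annotation, (str, bytes)):
        return False
    return issubclass(annotation, sequence_types)


def field_annotation_is_sequence(annotation) -> bool:
    # Accept bare classes (list) and generics (List[int]) via get_origin().
    return annotation_is_sequence(annotation) or annotation_is_sequence(
        get_origin(annotation)
    )


print(field_annotation_is_sequence(List[int]))  # True
print(field_annotation_is_sequence(list))       # True
print(field_annotation_is_sequence(str))        # False
```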


def value_is_sequence(value: Any) -> bool:
    return isinstance(value, sequence_types) and not isinstance(value, (str, bytes))  # type: ignore[arg-type]


def _annotation_is_complex(annotation: Union[Type[Any], None]) -> bool:
    return (
        lenient_issubclass(annotation, (BaseModel, Mapping, UploadFile))
        or _annotation_is_sequence(annotation)
        or is_dataclass(annotation)
    )


def field_annotation_is_complex(annotation: Union[Type[Any], None]) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        return any(field_annotation_is_complex(arg) for arg in get_args(annotation))

    return (
        _annotation_is_complex(annotation)
        or _annotation_is_complex(origin)
        or hasattr(origin, "__pydantic_core_schema__")
        or hasattr(origin, "__get_pydantic_core_schema__")
    )


def field_annotation_is_scalar(annotation: Any) -> bool:
    # handle Ellipsis here to make tuple[int, ...] work nicely
    return annotation is Ellipsis or not field_annotation_is_complex(annotation)


def field_annotation_is_scalar_sequence(annotation: Union[Type[Any], None]) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one_scalar_sequence = False
        for arg in get_args(annotation):
            if field_annotation_is_scalar_sequence(arg):
                at_least_one_scalar_sequence = True
                continue
            elif not field_annotation_is_scalar(arg):
                return False
        return at_least_one_scalar_sequence
    return field_annotation_is_sequence(annotation) and all(
        field_annotation_is_scalar(sub_annotation)
        for sub_annotation in get_args(annotation)
    )
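The union branch above accepts something like `Union[List[int], int]` (at least one scalar sequence, everything else scalar) but rejects unions containing a non-scalar sequence. A simplified sketch of that rule, where "scalar" is deliberately reduced to "not a list" in place of the full complex-type check:

```python
from typing import List, Union, get_args, get_origin

# Simplified stand-ins: "scalar" here means "not a list"; the real code
# delegates to field_annotation_is_scalar / field_annotation_is_complex.


def is_scalar(ann) -> bool:
    return get_origin(ann) is not list


def is_scalar_sequence(ann) -> bool:
    origin = get_origin(ann)
    if origin is Union:
        at_least_one = False
        for arg in get_args(ann):
            if is_scalar_sequence(arg):
                at_least_one = True
                continue
            elif not is_scalar(arg):
                # A sequence that is not itself a scalar sequence disqualifies it.
                return False
        return at_least_one
    return origin is list and all(is_scalar(sub) for sub in get_args(ann))


print(is_scalar_sequence(Union[List[int], int]))        # True
print(is_scalar_sequence(Union[List[List[int]], int]))  # False
```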


def is_bytes_or_nonable_bytes_annotation(annotation: Any) -> bool:
    if lenient_issubclass(annotation, bytes):
        return True
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, bytes):
                return True
    return False


def is_uploadfile_or_nonable_uploadfile_annotation(annotation: Any) -> bool:
    if lenient_issubclass(annotation, UploadFile):
        return True
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        for arg in get_args(annotation):
            if lenient_issubclass(arg, UploadFile):
                return True
    return False


def is_bytes_sequence_annotation(annotation: Any) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one = False
        for arg in get_args(annotation):
            if is_bytes_sequence_annotation(arg):
                at_least_one = True
                continue
        return at_least_one
    return field_annotation_is_sequence(annotation) and all(
        is_bytes_or_nonable_bytes_annotation(sub_annotation)
        for sub_annotation in get_args(annotation)
    )


def is_uploadfile_sequence_annotation(annotation: Any) -> bool:
    origin = get_origin(annotation)
    if origin is Union or origin is UnionType:
        at_least_one = False
        for arg in get_args(annotation):
            if is_uploadfile_sequence_annotation(arg):
                at_least_one = True
                continue
        return at_least_one
    return field_annotation_is_sequence(annotation) and all(
        is_uploadfile_or_nonable_uploadfile_annotation(sub_annotation)
        for sub_annotation in get_args(annotation)
    )
@@ -1,2 +1,3 @@
METHODS_WITH_BODY = {"GET", "HEAD", "POST", "PUT", "DELETE", "PATCH"}
REF_PREFIX = "#/components/schemas/"
REF_TEMPLATE = "#/components/schemas/{model}"
@@ -1,3 +1,11 @@
import types
from enum import Enum
from typing import Any, Callable, Dict, Set, Type, TypeVar, Union

from pydantic import BaseModel

DecoratedCallable = TypeVar("DecoratedCallable", bound=Callable[..., Any])
UnionType = getattr(types, "UnionType", Union)
NoneType = getattr(types, "UnionType", None)
ModelNameMap = Dict[Union[Type[BaseModel], Type[Enum]], str]
IncEx = Union[Set[int], Set[str], Dict[int, Any], Dict[str, Any]]
@@ -0,0 +1,93 @@
from typing import List, Union

from fastapi import FastAPI, UploadFile
from fastapi._compat import (
    ModelField,
    Undefined,
    _get_model_config,
    is_bytes_sequence_annotation,
    is_uploadfile_sequence_annotation,
)
from fastapi.testclient import TestClient
from pydantic import BaseConfig, BaseModel, ConfigDict
from pydantic.fields import FieldInfo

from .utils import needs_pydanticv1, needs_pydanticv2


@needs_pydanticv2
def test_model_field_default_required():
    # For coverage
    field_info = FieldInfo(annotation=str)
    field = ModelField(name="foo", field_info=field_info)
    assert field.default is Undefined


@needs_pydanticv1
def test_upload_file_dummy_general_plain_validator_function():
    # For coverage
    assert UploadFile.__get_pydantic_core_schema__(str, lambda x: None) == {}


@needs_pydanticv1
def test_union_scalar_list():
    # For coverage
    # TODO: there might not be a current valid code path that uses this, it would
    # potentially enable query parameters defined as both a scalar and a list
    # but that would require more refactors, also not sure it's really useful
    from fastapi._compat import is_pv1_scalar_field

    field_info = FieldInfo()
    field = ModelField(
        name="foo",
        field_info=field_info,
        type_=Union[str, List[int]],
        class_validators={},
        model_config=BaseConfig,
    )
    assert not is_pv1_scalar_field(field)


@needs_pydanticv2
def test_get_model_config():
    # For coverage in Pydantic v2
    class Foo(BaseModel):
        model_config = ConfigDict(from_attributes=True)

    foo = Foo()
    config = _get_model_config(foo)
    assert config == {"from_attributes": True}


def test_complex():
    app = FastAPI()

    @app.post("/")
    def foo(foo: Union[str, List[int]]):
        return foo

    client = TestClient(app)

    response = client.post("/", json="bar")
    assert response.status_code == 200, response.text
    assert response.json() == "bar"

    response2 = client.post("/", json=[1, 2])
    assert response2.status_code == 200, response2.text
    assert response2.json() == [1, 2]


def test_is_bytes_sequence_annotation_union():
    # For coverage
    # TODO: in theory this would allow declaring types that could be lists of bytes
    # to be read from files and other types, but I'm not even sure it's a good idea
    # to support it as a first class "feature"
    assert is_bytes_sequence_annotation(Union[List[str], List[bytes]])


def test_is_uploadfile_sequence_annotation():
    # For coverage
    # TODO: in theory this would allow declaring types that could be lists of UploadFile
    # and other types, but I'm not even sure it's a good idea to support it as a first
    # class "feature"
    assert is_uploadfile_sequence_annotation(Union[List[str], List[UploadFile]])
@@ -0,0 +1,35 @@
from typing import Optional

from fastapi import Depends, FastAPI
from pydantic import BaseModel, validator

app = FastAPI()


class ModelB(BaseModel):
    username: str


class ModelC(ModelB):
    password: str


class ModelA(BaseModel):
    name: str
    description: Optional[str] = None
    model_b: ModelB

    @validator("name")
    def lower_username(cls, name: str, values):
        if not name.endswith("A"):
            raise ValueError("name must end in A")
        return name


async def get_model_c() -> ModelC:
    return ModelC(username="test-user", password="test-password")


@app.get("/model/{name}", response_model=ModelA)
async def get_model_a(name: str, model_c=Depends(get_model_c)):
    return {"name": name, "description": "model-a-desc", "model_b": model_c}
@@ -0,0 +1,182 @@
from typing import Optional

import pytest
from dirty_equals import IsDict
from fastapi import Depends, FastAPI
from fastapi.exceptions import ResponseValidationError
from fastapi.testclient import TestClient
from fastapi.utils import match_pydantic_error_url

from .utils import needs_pydanticv2


@pytest.fixture(name="client")
def get_client():
    from pydantic import BaseModel, FieldValidationInfo, field_validator

    app = FastAPI()

    class ModelB(BaseModel):
        username: str

    class ModelC(ModelB):
        password: str

    class ModelA(BaseModel):
        name: str
        description: Optional[str] = None
        foo: ModelB

        @field_validator("name")
        def lower_username(cls, name: str, info: FieldValidationInfo):
            if not name.endswith("A"):
                raise ValueError("name must end in A")
            return name

    async def get_model_c() -> ModelC:
        return ModelC(username="test-user", password="test-password")

    @app.get("/model/{name}", response_model=ModelA)
    async def get_model_a(name: str, model_c=Depends(get_model_c)):
        return {"name": name, "description": "model-a-desc", "foo": model_c}

    client = TestClient(app)
    return client


@needs_pydanticv2
def test_filter_sub_model(client: TestClient):
    response = client.get("/model/modelA")
    assert response.status_code == 200, response.text
    assert response.json() == {
        "name": "modelA",
        "description": "model-a-desc",
        "foo": {"username": "test-user"},
    }


@needs_pydanticv2
def test_validator_is_cloned(client: TestClient):
    with pytest.raises(ResponseValidationError) as err:
        client.get("/model/modelX")
    assert err.value.errors() == [
        IsDict(
            {
                "type": "value_error",
                "loc": ("response", "name"),
                "msg": "Value error, name must end in A",
                "input": "modelX",
                "ctx": {"error": "name must end in A"},
                "url": match_pydantic_error_url("value_error"),
            }
        )
        | IsDict(
            # TODO remove when deprecating Pydantic v1
            {
                "loc": ("response", "name"),
                "msg": "name must end in A",
                "type": "value_error",
            }
        )
    ]


@needs_pydanticv2
def test_openapi_schema(client: TestClient):
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    assert response.json() == {
        "openapi": "3.1.0",
        "info": {"title": "FastAPI", "version": "0.1.0"},
        "paths": {
            "/model/{name}": {
                "get": {
                    "summary": "Get Model A",
                    "operationId": "get_model_a_model__name__get",
                    "parameters": [
                        {
                            "required": True,
                            "schema": {"title": "Name", "type": "string"},
                            "name": "name",
                            "in": "path",
                        }
                    ],
                    "responses": {
                        "200": {
                            "description": "Successful Response",
                            "content": {
                                "application/json": {
                                    "schema": {"$ref": "#/components/schemas/ModelA"}
                                }
                            },
                        },
                        "422": {
                            "description": "Validation Error",
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "$ref": "#/components/schemas/HTTPValidationError"
                                    }
                                }
                            },
                        },
                    },
                }
            }
        },
        "components": {
            "schemas": {
                "HTTPValidationError": {
                    "title": "HTTPValidationError",
                    "type": "object",
                    "properties": {
                        "detail": {
                            "title": "Detail",
                            "type": "array",
                            "items": {"$ref": "#/components/schemas/ValidationError"},
                        }
                    },
                },
                "ModelA": {
                    "title": "ModelA",
                    "required": ["name", "foo"],
                    "type": "object",
                    "properties": {
                        "name": {"title": "Name", "type": "string"},
                        "description": IsDict(
                            {
                                "title": "Description",
                                "anyOf": [{"type": "string"}, {"type": "null"}],
                            }
                        )
                        # TODO remove when deprecating Pydantic v1
                        | IsDict({"title": "Description", "type": "string"}),
                        "foo": {"$ref": "#/components/schemas/ModelB"},
                    },
                },
                "ModelB": {
                    "title": "ModelB",
                    "required": ["username"],
                    "type": "object",
                    "properties": {"username": {"title": "Username", "type": "string"}},
                },
                "ValidationError": {
                    "title": "ValidationError",
                    "required": ["loc", "msg", "type"],
                    "type": "object",
                    "properties": {
                        "loc": {
                            "title": "Location",
                            "type": "array",
                            "items": {
                                "anyOf": [{"type": "string"}, {"type": "integer"}]
                            },
                        },
                        "msg": {"title": "Message", "type": "string"},
                        "type": {"title": "Error Type", "type": "string"},
                    },
                },
            }
        },
    }
@@ -1,62 +1,410 @@
from dirty_equals import IsDict
from fastapi.testclient import TestClient
from fastapi.utils import match_pydantic_error_url

from .main import app

client = TestClient(app)


def test_query():
    response = client.get("/query")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "missing",
                    "loc": ["query", "query"],
                    "msg": "Field required",
                    "input": None,
                    "url": match_pydantic_error_url("missing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_query_query_baz():
    response = client.get("/query?query=baz")
    assert response.status_code == 200
    assert response.json() == "foo bar baz"


def test_query_not_declared_baz():
    response = client.get("/query?not_declared=baz")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "missing",
                    "loc": ["query", "query"],
                    "msg": "Field required",
                    "input": None,
                    "url": match_pydantic_error_url("missing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_query_optional():
    response = client.get("/query/optional")
    assert response.status_code == 200
    assert response.json() == "foo bar"


def test_query_optional_query_baz():
    response = client.get("/query/optional?query=baz")
    assert response.status_code == 200
    assert response.json() == "foo bar baz"


def test_query_optional_not_declared_baz():
    response = client.get("/query/optional?not_declared=baz")
    assert response.status_code == 200
    assert response.json() == "foo bar"


def test_query_int():
    response = client.get("/query/int")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "missing",
                    "loc": ["query", "query"],
                    "msg": "Field required",
                    "input": None,
                    "url": match_pydantic_error_url("missing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_query_int_query_42():
    response = client.get("/query/int?query=42")
    assert response.status_code == 200
    assert response.json() == "foo bar 42"


def test_query_int_query_42_5():
    response = client.get("/query/int?query=42.5")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "int_parsing",
                    "loc": ["query", "query"],
                    "msg": "Input should be a valid integer, unable to parse string as an integer",
                    "input": "42.5",
                    "url": match_pydantic_error_url("int_parsing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )


def test_query_int_query_baz():
    response = client.get("/query/int?query=baz")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "int_parsing",
                    "loc": ["query", "query"],
                    "msg": "Input should be a valid integer, unable to parse string as an integer",
                    "input": "baz",
                    "url": match_pydantic_error_url("int_parsing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )


def test_query_int_not_declared_baz():
    response = client.get("/query/int?not_declared=baz")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "missing",
                    "loc": ["query", "query"],
                    "msg": "Field required",
                    "input": None,
                    "url": match_pydantic_error_url("missing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "field required",
                    "type": "value_error.missing",
                }
            ]
        }
    )


def test_query_int_optional():
    response = client.get("/query/int/optional")
    assert response.status_code == 200
    assert response.json() == "foo bar"


def test_query_int_optional_query_50():
    response = client.get("/query/int/optional?query=50")
    assert response.status_code == 200
    assert response.json() == "foo bar 50"


def test_query_int_optional_query_foo():
    response = client.get("/query/int/optional?query=foo")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "int_parsing",
                    "loc": ["query", "query"],
                    "msg": "Input should be a valid integer, unable to parse string as an integer",
                    "input": "foo",
                    "url": match_pydantic_error_url("int_parsing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )


def test_query_int_default():
    response = client.get("/query/int/default")
    assert response.status_code == 200
    assert response.json() == "foo bar 10"


def test_query_int_default_query_50():
    response = client.get("/query/int/default?query=50")
    assert response.status_code == 200
    assert response.json() == "foo bar 50"


def test_query_int_default_query_foo():
    response = client.get("/query/int/default?query=foo")
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "int_parsing",
                    "loc": ["query", "query"],
                    "msg": "Input should be a valid integer, unable to parse string as an integer",
                    "input": "foo",
                    "url": match_pydantic_error_url("int_parsing"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "loc": ["query", "query"],
                    "msg": "value is not a valid integer",
                    "type": "type_error.integer",
                }
            ]
        }
    )
def test_query_param(): |
||||
|
response = client.get("/query/param") |
||||
|
assert response.status_code == 200 |
||||
|
assert response.json() == "foo bar" |
||||
|
|
||||
|
|
||||
|
def test_query_param_query_50(): |
||||
|
response = client.get("/query/param?query=50") |
||||
|
assert response.status_code == 200 |
||||
|
assert response.json() == "foo bar 50" |
||||
|
|
||||
|
|
||||
|
def test_query_param_required(): |
||||
|
response = client.get("/query/param-required") |
||||
|
assert response.status_code == 422 |
||||
|
assert response.json() == IsDict( |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"type": "missing", |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "Field required", |
||||
|
"input": None, |
||||
|
"url": match_pydantic_error_url("missing"), |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) | IsDict( |
||||
|
# TODO: remove when deprecating Pydantic v1 |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "field required", |
||||
|
"type": "value_error.missing", |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) |
||||
|
|
||||
|
|
||||
|
def test_query_param_required_query_50(): |
||||
|
response = client.get("/query/param-required?query=50") |
||||
|
assert response.status_code == 200 |
||||
|
assert response.json() == "foo bar 50" |
||||
|
|
||||
|
|
||||
|
def test_query_param_required_int(): |
||||
|
response = client.get("/query/param-required/int") |
||||
|
assert response.status_code == 422 |
||||
|
assert response.json() == IsDict( |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"type": "missing", |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "Field required", |
||||
|
"input": None, |
||||
|
"url": match_pydantic_error_url("missing"), |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) | IsDict( |
||||
|
# TODO: remove when deprecating Pydantic v1 |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "field required", |
||||
|
"type": "value_error.missing", |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) |
||||
|
|
||||
|
|
||||
|
def test_query_param_required_int_query_50(): |
||||
|
response = client.get("/query/param-required/int?query=50") |
||||
|
assert response.status_code == 200 |
||||
|
assert response.json() == "foo bar 50" |
||||
|
|
||||
|
|
||||
|
def test_query_param_required_int_query_foo(): |
||||
|
response = client.get("/query/param-required/int?query=foo") |
||||
|
assert response.status_code == 422 |
||||
|
assert response.json() == IsDict( |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"type": "int_parsing", |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "Input should be a valid integer, unable to parse string as an integer", |
||||
|
"input": "foo", |
||||
|
"url": match_pydantic_error_url("int_parsing"), |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) | IsDict( |
||||
|
# TODO: remove when deprecating Pydantic v1 |
||||
|
{ |
||||
|
"detail": [ |
||||
|
{ |
||||
|
"loc": ["query", "query"], |
||||
|
"msg": "value is not a valid integer", |
||||
|
"type": "type_error.integer", |
||||
|
} |
||||
|
] |
||||
|
} |
||||
|
) |
||||
|
|
||||
|
|
||||
|
def test_query_frozenset_query_1_query_1_query_2(): |
||||
|
response = client.get("/query/frozenset/?query=1&query=1&query=2") |
||||
|
assert response.status_code == 200 |
||||
|
assert response.json() == "1,2" |
||||
|
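Throughout these tests a single assertion accepts either the Pydantic v2 or the Pydantic v1 error payload by combining two dirty_equals matchers with `|`. As a rough, dependency-free illustration of how that idiom behaves (this `EitherDict` class is a hypothetical stand-in, not the dirty_equals implementation):

```python
# Hypothetical stand-in for the `IsDict(...) | IsDict(...)` idiom used in
# these tests: a matcher that compares equal to a dict if the dict matches
# ANY of the expected alternatives.
class EitherDict:
    def __init__(self, *alternatives):
        self.alternatives = list(alternatives)

    def __or__(self, other):
        # `a | b` merges the alternatives of both matchers
        return EitherDict(*self.alternatives, *other.alternatives)

    def __eq__(self, actual):
        # `response.json() == matcher` falls back to this reflected
        # comparison, so either error shape satisfies the assertion
        return any(actual == alt for alt in self.alternatives)


pydantic_v2_error = {"detail": [{"type": "missing", "msg": "Field required"}]}
pydantic_v1_error = {
    "detail": [{"type": "value_error.missing", "msg": "field required"}]
}

matcher = EitherDict(pydantic_v2_error) | EitherDict(pydantic_v1_error)
assert pydantic_v2_error == matcher
assert pydantic_v1_error == matcher
```

The real dirty_equals `IsDict` additionally supports partial matching and readable diffs; the sketch only captures the or-combination that lets one test file run unchanged against both Pydantic major versions.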
@@ -0,0 +1,182 @@
import pytest
from dirty_equals import IsDict
from fastapi import FastAPI, Form
from fastapi.testclient import TestClient
from fastapi.utils import match_pydantic_error_url
from typing_extensions import Annotated

from .utils import needs_py310


def get_client():
    app = FastAPI()
    with pytest.warns(DeprecationWarning):

        @app.post("/items/")
        async def read_items(
            q: Annotated[str | None, Form(regex="^fixedquery$")] = None
        ):
            if q:
                return f"Hello {q}"
            else:
                return "Hello World"

    client = TestClient(app)
    return client


@needs_py310
def test_no_query():
    client = get_client()
    response = client.post("/items/")
    assert response.status_code == 200
    assert response.json() == "Hello World"


@needs_py310
def test_q_fixedquery():
    client = get_client()
    response = client.post("/items/", data={"q": "fixedquery"})
    assert response.status_code == 200
    assert response.json() == "Hello fixedquery"


@needs_py310
def test_query_nonregexquery():
    client = get_client()
    response = client.post("/items/", data={"q": "nonregexquery"})
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "string_pattern_mismatch",
                    "loc": ["body", "q"],
                    "msg": "String should match pattern '^fixedquery$'",
                    "input": "nonregexquery",
                    "ctx": {"pattern": "^fixedquery$"},
                    "url": match_pydantic_error_url("string_pattern_mismatch"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "ctx": {"pattern": "^fixedquery$"},
                    "loc": ["body", "q"],
                    "msg": 'string does not match regex "^fixedquery$"',
                    "type": "value_error.str.regex",
                }
            ]
        }
    )


@needs_py310
def test_openapi_schema():
    client = get_client()
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    # insert_assert(response.json())
    assert response.json() == {
        "openapi": "3.1.0",
        "info": {"title": "FastAPI", "version": "0.1.0"},
        "paths": {
            "/items/": {
                "post": {
                    "summary": "Read Items",
                    "operationId": "read_items_items__post",
                    "requestBody": {
                        "content": {
                            "application/x-www-form-urlencoded": {
                                "schema": IsDict(
                                    {
                                        "allOf": [
                                            {
                                                "$ref": "#/components/schemas/Body_read_items_items__post"
                                            }
                                        ],
                                        "title": "Body",
                                    }
                                )
                                | IsDict(
                                    # TODO: remove when deprecating Pydantic v1
                                    {
                                        "$ref": "#/components/schemas/Body_read_items_items__post"
                                    }
                                )
                            }
                        }
                    },
                    "responses": {
                        "200": {
                            "description": "Successful Response",
                            "content": {"application/json": {"schema": {}}},
                        },
                        "422": {
                            "description": "Validation Error",
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "$ref": "#/components/schemas/HTTPValidationError"
                                    }
                                }
                            },
                        },
                    },
                },
            }
        },
        "components": {
            "schemas": {
                "Body_read_items_items__post": {
                    "properties": {
                        "q": IsDict(
                            {
                                "anyOf": [
                                    {"type": "string", "pattern": "^fixedquery$"},
                                    {"type": "null"},
                                ],
                                "title": "Q",
                            }
                        )
                        | IsDict(
                            # TODO: remove when deprecating Pydantic v1
                            {"type": "string", "pattern": "^fixedquery$", "title": "Q"}
                        )
                    },
                    "type": "object",
                    "title": "Body_read_items_items__post",
                },
                "HTTPValidationError": {
                    "properties": {
                        "detail": {
                            "items": {"$ref": "#/components/schemas/ValidationError"},
                            "type": "array",
                            "title": "Detail",
                        }
                    },
                    "type": "object",
                    "title": "HTTPValidationError",
                },
                "ValidationError": {
                    "properties": {
                        "loc": {
                            "items": {
                                "anyOf": [{"type": "string"}, {"type": "integer"}]
                            },
                            "type": "array",
                            "title": "Location",
                        },
                        "msg": {"type": "string", "title": "Message"},
                        "type": {"type": "string", "title": "Error Type"},
                    },
                    "type": "object",
                    "required": ["loc", "msg", "type"],
                    "title": "ValidationError",
                },
            }
        },
    }
@@ -0,0 +1,165 @@
import pytest
from dirty_equals import IsDict
from fastapi import FastAPI, Query
from fastapi.testclient import TestClient
from fastapi.utils import match_pydantic_error_url
from typing_extensions import Annotated

from .utils import needs_py310


def get_client():
    app = FastAPI()
    with pytest.warns(DeprecationWarning):

        @app.get("/items/")
        async def read_items(
            q: Annotated[str | None, Query(regex="^fixedquery$")] = None
        ):
            if q:
                return f"Hello {q}"
            else:
                return "Hello World"

    client = TestClient(app)
    return client


@needs_py310
def test_query_params_str_validations_no_query():
    client = get_client()
    response = client.get("/items/")
    assert response.status_code == 200
    assert response.json() == "Hello World"


@needs_py310
def test_query_params_str_validations_q_fixedquery():
    client = get_client()
    response = client.get("/items/", params={"q": "fixedquery"})
    assert response.status_code == 200
    assert response.json() == "Hello fixedquery"


@needs_py310
def test_query_params_str_validations_item_query_nonregexquery():
    client = get_client()
    response = client.get("/items/", params={"q": "nonregexquery"})
    assert response.status_code == 422
    assert response.json() == IsDict(
        {
            "detail": [
                {
                    "type": "string_pattern_mismatch",
                    "loc": ["query", "q"],
                    "msg": "String should match pattern '^fixedquery$'",
                    "input": "nonregexquery",
                    "ctx": {"pattern": "^fixedquery$"},
                    "url": match_pydantic_error_url("string_pattern_mismatch"),
                }
            ]
        }
    ) | IsDict(
        # TODO: remove when deprecating Pydantic v1
        {
            "detail": [
                {
                    "ctx": {"pattern": "^fixedquery$"},
                    "loc": ["query", "q"],
                    "msg": 'string does not match regex "^fixedquery$"',
                    "type": "value_error.str.regex",
                }
            ]
        }
    )


@needs_py310
def test_openapi_schema():
    client = get_client()
    response = client.get("/openapi.json")
    assert response.status_code == 200, response.text
    # insert_assert(response.json())
    assert response.json() == {
        "openapi": "3.1.0",
        "info": {"title": "FastAPI", "version": "0.1.0"},
        "paths": {
            "/items/": {
                "get": {
                    "summary": "Read Items",
                    "operationId": "read_items_items__get",
                    "parameters": [
                        {
                            "name": "q",
                            "in": "query",
                            "required": False,
                            "schema": IsDict(
                                {
                                    "anyOf": [
                                        {"type": "string", "pattern": "^fixedquery$"},
                                        {"type": "null"},
                                    ],
                                    "title": "Q",
                                }
                            )
                            | IsDict(
                                # TODO: remove when deprecating Pydantic v1
                                {
                                    "type": "string",
                                    "pattern": "^fixedquery$",
                                    "title": "Q",
                                }
                            ),
                        }
                    ],
                    "responses": {
                        "200": {
                            "description": "Successful Response",
                            "content": {"application/json": {"schema": {}}},
                        },
                        "422": {
                            "description": "Validation Error",
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "$ref": "#/components/schemas/HTTPValidationError"
                                    }
                                }
                            },
                        },
                    },
                }
            }
        },
        "components": {
            "schemas": {
                "HTTPValidationError": {
                    "properties": {
                        "detail": {
                            "items": {"$ref": "#/components/schemas/ValidationError"},
                            "type": "array",
                            "title": "Detail",
                        }
                    },
                    "type": "object",
                    "title": "HTTPValidationError",
                },
                "ValidationError": {
                    "properties": {
                        "loc": {
                            "items": {
                                "anyOf": [{"type": "string"}, {"type": "integer"}]
                            },
                            "type": "array",
                            "title": "Location",
                        },
                        "msg": {"type": "string", "title": "Message"},
                        "type": {"type": "string", "title": "Error Type"},
                    },
                    "type": "object",
                    "required": ["loc", "msg", "type"],
                    "title": "ValidationError",
                },
            }
        },
    }
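Both of the files above build the app inside `pytest.warns(DeprecationWarning)` because declaring `regex=` on `Form()` or `Query()` is deprecated. The warning-then-alias mechanics being exercised can be sketched without any dependencies (the `legacy_param` helper here is hypothetical, not FastAPI code):

```python
import warnings


def legacy_param(regex=None, pattern=None):
    # Hypothetical helper mirroring the behavior under test: the old `regex`
    # keyword still works, but emits a DeprecationWarning steering callers
    # toward the newer `pattern` keyword.
    if regex is not None:
        warnings.warn(
            "`regex` is deprecated, use `pattern` instead", DeprecationWarning
        )
        pattern = regex
    return pattern


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = legacy_param(regex="^fixedquery$")

assert result == "^fixedquery$"
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

`pytest.warns(DeprecationWarning)` in the tests plays the same role as `warnings.catch_warnings(record=True)` here: it both asserts the warning fires and keeps it from polluting the test output.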
@@ -0,0 +1,81 @@
from typing import List

from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class UserBase(BaseModel):
    email: str


class UserCreate(UserBase):
    password: str


class UserDB(UserBase):
    hashed_password: str


class PetDB(BaseModel):
    name: str
    owner: UserDB


class PetOut(BaseModel):
    name: str
    owner: UserBase


@app.post("/users/", response_model=UserBase)
async def create_user(user: UserCreate):
    return user


@app.get("/pets/{pet_id}", response_model=PetOut)
async def read_pet(pet_id: int):
    user = UserDB(
        email="[email protected]",
        hashed_password="secrethashed",
    )
    pet = PetDB(name="Nibbler", owner=user)
    return pet


@app.get("/pets/", response_model=List[PetOut])
async def read_pets():
    user = UserDB(
        email="[email protected]",
        hashed_password="secrethashed",
    )
    pet1 = PetDB(name="Nibbler", owner=user)
    pet2 = PetDB(name="Zoidberg", owner=user)
    return [pet1, pet2]


client = TestClient(app)


def test_filter_top_level_model():
    response = client.post(
        "/users", json={"email": "[email protected]", "password": "secret"}
    )
    assert response.json() == {"email": "[email protected]"}


def test_filter_second_level_model():
    response = client.get("/pets/1")
    assert response.json() == {
        "name": "Nibbler",
        "owner": {"email": "[email protected]"},
    }


def test_list_of_models():
    response = client.get("/pets/")
    assert response.json() == [
        {"name": "Nibbler", "owner": {"email": "[email protected]"}},
        {"name": "Zoidberg", "owner": {"email": "[email protected]"}},
    ]
@@ -0,0 +1,83 @@
from typing import List

from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class UserCreate(BaseModel):
    email: str
    password: str


class UserDB(BaseModel):
    email: str
    hashed_password: str


class User(BaseModel):
    email: str


class PetDB(BaseModel):
    name: str
    owner: UserDB


class PetOut(BaseModel):
    name: str
    owner: User


@app.post("/users/", response_model=User)
async def create_user(user: UserCreate):
    return user


@app.get("/pets/{pet_id}", response_model=PetOut)
async def read_pet(pet_id: int):
    user = UserDB(
        email="[email protected]",
        hashed_password="secrethashed",
    )
    pet = PetDB(name="Nibbler", owner=user)
    return pet


@app.get("/pets/", response_model=List[PetOut])
async def read_pets():
    user = UserDB(
        email="[email protected]",
        hashed_password="secrethashed",
    )
    pet1 = PetDB(name="Nibbler", owner=user)
    pet2 = PetDB(name="Zoidberg", owner=user)
    return [pet1, pet2]


client = TestClient(app)


def test_filter_top_level_model():
    response = client.post(
        "/users", json={"email": "[email protected]", "password": "secret"}
    )
    assert response.json() == {"email": "[email protected]"}


def test_filter_second_level_model():
    response = client.get("/pets/1")
    assert response.json() == {
        "name": "Nibbler",
        "owner": {"email": "[email protected]"},
    }


def test_list_of_models():
    response = client.get("/pets/")
    assert response.json() == [
        {"name": "Nibbler", "owner": {"email": "[email protected]"}},
        {"name": "Zoidberg", "owner": {"email": "[email protected]"}},
    ]
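The two files above assert FastAPI's `response_model` filtering: handlers return richer `UserDB`/`PetDB` objects, yet only the fields declared on the response model (whether related by inheritance or a completely separate model) reach the client. The filtering idea can be sketched with plain dataclasses (this `filter_to` helper is illustrative only, not FastAPI's implementation):

```python
# Sketch of response-model field filtering using stdlib dataclasses:
# serializing a richer object through a narrower schema keeps only the
# fields the narrower schema declares.
from dataclasses import dataclass, fields, asdict


@dataclass
class UserBase:
    email: str


@dataclass
class UserDB(UserBase):
    hashed_password: str


def filter_to(model_cls, obj):
    # keep only the fields declared on the narrower "response model"
    allowed = {f.name for f in fields(model_cls)}
    return {k: v for k, v in asdict(obj).items() if k in allowed}


user = UserDB(email="[email protected]", hashed_password="secrethashed")
assert filter_to(UserBase, user) == {"email": "[email protected]"}
```

This is the security property the tests guard: even if a handler accidentally returns an object carrying `hashed_password`, the declared response model keeps that field out of the JSON response.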
Some files were not shown because too many files changed in this diff