The Fix
Fixes a memory leak in Pydantic's model rebuild process by preventing the reuse of prebuilt serializers and validators during forced rebuilds.
Based on the closed pydantic/pydantic issue #12446; the fix PR is linked under Proof / Evidence.
@@ -16,7 +16,7 @@ fn build_schema_validator_with_globals(
) -> SchemaValidator {
let schema = py.eval(code, globals, None).unwrap().extract().unwrap();
- SchemaValidator::py_new(py, &schema, None).unwrap()
+ SchemaValidator::py_new(py, &schema, None, true).unwrap()
}
Why This Fix Works in Production
- Trigger: repeated calls to model_rebuild(force=True) on models with recursive dependencies
- Mechanism: prebuilt validators hold strong Arc references in circular patterns, so each forced rebuild leaves the previous validator unreclaimable; skipping the reuse of prebuilt serializers and validators on forced rebuilds stops the accumulation
Why This Breaks in Prod
- Strong Arc references in circular patterns within validators prevent memory from being reclaimed
- Production symptom (often without a traceback): steadily rising RSS after repeated model_rebuild(force=True) calls, eventually ending in an out-of-memory kill
Proof / Evidence
- GitHub issue: #12446
- Fix PR: https://github.com/pydantic/pydantic/pull/12689
- Reproduced locally: No (not executed)
- Last verified: 2026-02-09
- Confidence: 0.60
- Did this fix it?: Yes (upstream fix exists)
- Own content ratio: 0.56
Discussion
High-signal excerpts from the issue thread (symptoms, repros, edge-cases).
“Ah, it appears this is actually 2 independent regressions, the one in #1870 (stale referents on rebuild) and another one from 2.10.6 to 2.11.0 (the…”
“Hi, I follow some memory usage issues on pydantic for some times, I'm using Django Ninja in my work project and since the 2.11 release…”
“Thanks for looking into this. I can confirm the leak using memray. I'll look into this this week.”
“Thanks for the great analysis”
Failure Signature (Search String)
- Memory leak in model_rebuild(force=True)
- There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
Copy-friendly signature
Failure Signature
-----------------
Memory leak in model_rebuild(force=True)
There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
Error Message
Signature-only (no traceback captured)
Error Message
-------------
Memory leak in model_rebuild(force=True)
There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
Minimal Reproduction
from typing import Annotated, List, Literal, Union

from pydantic import BaseModel, Field

Block = Annotated[
    Union[
        "Collection",
        "BlockGroup",
        "PredictiveModel",
    ],
    Field(discriminator="type"),
]

class PredictiveModel(BaseModel):
    type: Literal["predictive_model"]
    result: str

class BlockGroup(BaseModel):
    type: Literal["block_group"]
    blocks: List[Block] = []

class Collection(BaseModel):
    type: Literal["collection"]
    before_run: List[Block] = []
    after_run: List[Block] = []

def test_rebuild():
    i = 0
    while True:  # runs until interrupted; watch the process RSS grow
        Collection.model_rebuild(force=True)
        BlockGroup.model_rebuild(force=True)
        PredictiveModel.model_rebuild(force=True)
        print(i)
        i += 1
Environment
- Pydantic: 2.x (the issue thread reports the regression appearing between 2.10.6 and 2.11.0)
What Broke
Gradual memory increase leading to potential out-of-memory errors during model rebuilds.
Why It Broke
The leak stems from strong Arc references forming circular patterns within validators: each forced rebuild creates a new validator while the previous one remains referenced and is never freed.
Fix Options (Details)
Option A — Apply the official fix
Fixes a memory leak issue in Pydantic's model rebuild process by preventing the reuse of prebuilt serializers and validators during forced rebuilds.
Option C — Workaround (temporary)
The issue thread mentions an approach that, used with `force=True`, prevents a prebuilt validator from being reused (the source of the RSS regression); the exact snippet was not captured here, so consult the thread on issue #12446.
Use only if you cannot change versions today. Treat this as a stopgap and remove it once upgraded.
Fix reference: https://github.com/pydantic/pydantic/pull/12689
Last verified: 2026-02-09. Validate in your environment.
When NOT to Use This Fix
- If your models have no recursive dependencies, this leak likely does not affect you, so applying the fix is not urgent in that case.
Verify Fix
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.
Prevention
- Add a CI check that diffs key outputs after upgrades (OpenAPI schema snapshots, JSON payload shapes, CLI output).
- Upgrade behind a canary and run integration tests against the canary before 100% rollout.
- Track RSS + object counts after deployments; alert on monotonic growth and GC pressure.
- Add a long-running test that repeats the failing call path and asserts stable memory.
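The last two bullets can be folded into one long-running regression check. The sketch below is generic and not pydantic-specific: `leaky_path` and `healthy_path` are placeholder workloads (you would substitute your repro's rebuild loop), and it uses live-object counts via `gc` as a cheap proxy for memory stability.

```python
import gc

def object_count_growth(call_path, iterations: int = 200) -> int:
    """Net change in GC-tracked live objects after repeating call_path."""
    gc.collect()
    baseline = len(gc.get_objects())
    for _ in range(iterations):
        call_path()
    gc.collect()
    return len(gc.get_objects()) - baseline

retained: list[list] = []

def leaky_path() -> None:
    retained.append([])  # one list survives every call: unbounded growth

def healthy_path() -> None:
    _ = []               # temporary object, reclaimed immediately

# A CI memory-stability gate: fail the build if the call path accumulates.
assert object_count_growth(leaky_path) > 150
assert object_count_growth(healthy_path) < 50
print("memory stability check passed")
```

Lists are used rather than dicts because CPython may untrack dicts with only atomic contents, which would skew the object count; for RSS-level tracking in production, pair this with process metrics from your monitoring stack.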
Related Issues
No related fixes found.
Sources
We don’t republish the full GitHub discussion text. Use the links above for context.