The Fix

Fixes a memory leak issue in Pydantic's model rebuild process by preventing the reuse of prebuilt serializers and validators during forced rebuilds.

Based on closed pydantic/pydantic issue #12446; the fixing PR/commit is linked below.
```diff
@@ -16,7 +16,7 @@ fn build_schema_validator_with_globals(
 ) -> SchemaValidator {
     let schema = py.eval(code, globals, None).unwrap().extract().unwrap();
-    SchemaValidator::py_new(py, &schema, None).unwrap()
+    SchemaValidator::py_new(py, &schema, None, true).unwrap()
 }
```

Why This Breaks in Prod

  • Trigger: repeated calls to `model_rebuild(force=True)` on models with recursive dependencies.
  • Mechanism: strong `Arc` references form circular patterns within the validators, so each forced rebuild leaks the previous validator graph.
  • Production symptom (often without a traceback): steadily growing RSS during rebuild-heavy workloads, eventually an out-of-memory kill.
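The accumulation mechanism can be sketched in plain Python (an illustrative analogy only, not pydantic-core's actual Rust internals: `Validator`, `_cache`, and `rebuild` are hypothetical stand-ins). If each forced rebuild reuses the previous "prebuilt" validator, the new graph holds a strong reference to the old one, so graphs chain up instead of being dropped:

```python
class Validator:
    """Stand-in for a compiled validator graph (illustrative only)."""

    def __init__(self, name, prebuilt=None):
        self.name = name
        # Reusing the previous prebuilt validator keeps the old graph alive.
        self.prebuilt = prebuilt
        # A strong self-referential link models the Arc cycle inside one
        # graph. (Python's gc can collect such cycles; Rust's Arc cannot.)
        self.cycle = self


_cache = {}


def rebuild(model_name, reuse_prebuilt=True):
    old = _cache.get(model_name) if reuse_prebuilt else None
    _cache[model_name] = Validator(model_name, prebuilt=old)


def chain_len(v):
    """Count how many validator graphs are kept alive through reuse."""
    n = 0
    while v is not None:
        n += 1
        v = v.prebuilt
    return n


# With reuse, each rebuild chains to the previous graph: unbounded growth.
for _ in range(1000):
    rebuild("Collection", reuse_prebuilt=True)
print(chain_len(_cache["Collection"]))  # grows with the number of rebuilds

# Without reuse (what the fix enforces), only the newest graph is referenced.
for _ in range(1000):
    rebuild("Collection", reuse_prebuilt=False)
print(chain_len(_cache["Collection"]))  # stays at 1
```

This mirrors why the fix targets reuse of prebuilt serializers/validators during forced rebuilds rather than the rebuild itself.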

Proof / Evidence

Discussion

High-signal excerpts from the issue thread (symptoms, repros, edge-cases).

“Ah, it appears this is actually 2 independent regressions, the one in #1870 (stale referents on rebuild) and another one from 2.10.6 to 2.11.0 (the…”
@lmmx · 2026-01-10 · confirmation
“Hi, I follow some memory usage issues on pydantic for some times, I'm using Django Ninja in my work project and since the 2.11 release…”
@M3te0r · 2026-01-10
“Thanks for looking into this. I can confirm the leak using memray. I'll look into this this week.”
@Viicos · 2025-11-10
“Thanks for the great analysis”
@davidhewitt · 2026-01-12

Failure Signature (Search String)

  • Memory leak in model_rebuild(force=True)
  • There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
Copy-friendly signature
signature.txt
```text
Failure Signature
-----------------
Memory leak in model_rebuild(force=True)

There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
```

Error Message

Signature-only (no traceback captured)
error.txt
```text
Error Message
-------------
Memory leak in model_rebuild(force=True)

There appears to be a memory leak when calling `model_rebuild(force=True)` on models that have recursive dependencies.
```

Minimal Reproduction

repro.py
```python
from typing import Annotated, List, Literal, Union

from pydantic import BaseModel, Field

Block = Annotated[
    Union[
        "Collection",
        "BlockGroup",
        "PredictiveModel",
    ],
    Field(discriminator="type"),
]


class PredictiveModel(BaseModel):
    type: Literal["predictive_model"]
    result: str


class BlockGroup(BaseModel):
    type: Literal["block_group"]
    blocks: List[Block] = []


class Collection(BaseModel):
    type: Literal["collection"]
    before_run: List[Block] = []
    after_run: List[Block] = []


def test_rebuild():
    i = 0
    while True:
        Collection.model_rebuild(force=True)
        BlockGroup.model_rebuild(force=True)
        PredictiveModel.model_rebuild(force=True)
        print(i)
        i += 1
```
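To quantify the leak instead of looping forever, a small helper can report peak-RSS growth around any callable. This is a sketch using the stdlib `resource` module (Unix only); the iteration counts are arbitrary choices:

```python
import resource


def rss_delta(fn, iterations=2000):
    """Run fn() repeatedly and return the growth in peak RSS (Unix only).

    ru_maxrss is reported in KiB on Linux and bytes on macOS; in either
    unit, a steadily positive delta across the loop points at a leak.
    """
    before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    for _ in range(iterations):
        fn()
    after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return after - before


# Example with a non-leaking callable:
print(rss_delta(lambda: [0] * 1000, iterations=100))
```

On a leaking Pydantic build, passing `lambda: Collection.model_rebuild(force=True)` from repro.py should show a clearly positive delta; on a fixed build it should stay near zero.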

Environment

  • Pydantic: 2.x (the issue thread reports the regression appearing between 2.10.6 and 2.11.0)
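Before comparing behavior across versions, it helps to confirm which release is actually installed. A minimal stdlib-only check:

```python
from importlib.metadata import PackageNotFoundError, version

# Report the installed Pydantic release, if any; the leak discussed here
# was reported against the 2.11 line.
try:
    print("pydantic", version("pydantic"))
except PackageNotFoundError:
    print("pydantic is not installed")
```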

What Broke

Gradual memory increase leading to potential out-of-memory errors during model rebuilds.

Why It Broke

The leak stems from strong `Arc` references forming circular patterns within the validators; because forced rebuilds reused the prebuilt graph, the cycles were never dropped.

Fix Options (Details)

Option A — Apply the official fix

Fixes a memory leak issue in Pydantic's model rebuild process by preventing the reuse of prebuilt serializers and validators during forced rebuilds.

When NOT to use: Do not use this fix if your models do not have recursive dependencies.

Option C — Workaround (temporary)

The issue thread mentions an option spotted by a commenter that can be combined with `force=True` so that a prebuilt validator is not reused (the source of the RSS regression). The exact snippet is not captured on this page; see the linked discussion for it.

When NOT to use: Do not use this fix if your models do not have recursive dependencies.

Use only if you cannot change versions today. Treat this as a stopgap and remove once upgraded.

Fix reference: https://github.com/pydantic/pydantic/pull/12689

Last verified: 2026-02-09. Validate in your environment.


When NOT to Use This Fix

  • Do not use this fix if your models do not have recursive dependencies.

Verify Fix

verify
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.


Prevention

  • Add a CI check that diffs key outputs after upgrades (OpenAPI schema snapshots, JSON payload shapes, CLI output).
  • Upgrade behind a canary and run integration tests against the canary before 100% rollout.
  • Track RSS + object counts after deployments; alert on monotonic growth and GC pressure.
  • Add a long-running test that repeats the failing call path and asserts stable memory.
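The last prevention point can be sketched as a pytest-style check. This is a sketch, not a drop-in test: `exercise_rebuild_path` is a hypothetical stand-in for your real failing call path (e.g. a forced `model_rebuild` loop), and the 10 MiB budget is an arbitrary threshold to tune:

```python
import resource


def peak_rss_bytes():
    # ru_maxrss is KiB on Linux and bytes on macOS; we assume Linux here
    # and normalize to bytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024


def exercise_rebuild_path():
    # Hypothetical stand-in: replace with the real call path under test,
    # e.g. YourModel.model_rebuild(force=True).
    data = [object() for _ in range(100)]
    return len(data)


def test_memory_stays_stable():
    # Warm up so one-time allocations (imports, caches) don't count
    # against the budget.
    for _ in range(100):
        exercise_rebuild_path()
    before = peak_rss_bytes()
    for _ in range(5000):
        exercise_rebuild_path()
    growth = peak_rss_bytes() - before
    assert growth < 10 * 1024 * 1024, f"RSS grew by {growth} bytes"
```

Running this in CI on every dependency upgrade turns a silent production leak into a failing test.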

Related Issues

No related fixes found.

Sources

We don’t republish the full GitHub discussion text. Use the links above for context.