
The Fix

pip install urllib3==2.6.2

Based on the closed urllib3/urllib3 issue #3734; the fix PR and commit are linked below.

Production note: Watch p95/p99 latency and retry volume; timeouts can turn into retry storms and duplicate side-effects.
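To keep retries from amplifying an outage, bound them with capped exponential backoff plus jitter. A minimal stdlib sketch (the function name and defaults are illustrative, not part of urllib3):

```python
import random


def backoff_delays(attempts: int, base: float = 0.5, cap: float = 30.0) -> list[float]:
    """Capped exponential backoff with full jitter.

    Each delay is drawn uniformly from [0, min(cap, base * 2**n)], which
    spreads retries out in time and avoids synchronized retry storms.
    """
    return [random.uniform(0.0, min(cap, base * (2 ** n))) for n in range(attempts)]
```

Sleep for each returned delay between attempts, and avoid blind retries of non-idempotent operations unless they are gated (see Option D below).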

Changelog excerpt from the fix PR:

@@ -0,0 +1,2 @@
+Fixed ``HTTPResponse.read_chunked()`` to properly handle leftover data in
+the decoder's buffer when reading compressed chunked responses.
diff --git a/src/urllib3/response.py b/src/urllib3/response.py

Why This Fix Works in Production

  • Trigger: urllib3.exceptions.DecodeError: ('Received response with content-encoding: br, but failed to decode it.', error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))
  • Mechanism: HTTPResponse.read_chunked mishandled leftover data in the decoder's buffer, feeding the brotli decoder more input while it could not accept any (can_accept_more_data() returns False).
  • Why the fix works: urllib3 2.6.2 handles the leftover buffered data before passing more input to the decoder, so large compressed chunked responses decode cleanly (first fixed release: 2.6.2).
Production impact:
  • If left unfixed, large brotli-compressed responses fail loudly with DecodeError, breaking downloads and any pipeline that depends on them; blind retries of these failures waste bandwidth and can snowball into retry storms.
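The failure mode (data left sitting in a streaming decoder's buffer) is easiest to see with the stdlib's analogous zlib API. This sketch uses zlib rather than brotli, purely for illustration: when per-call output is capped, unprocessed input is parked in the decoder's `unconsumed_tail`, and dropping that leftover data truncates the result.

```python
import zlib


def stream_decompress(payload: bytes, out_limit: int = 64) -> bytes:
    """Decode a compressed payload while capping output per call.

    When output is capped, input the decoder has not yet processed is
    parked in `unconsumed_tail`. That leftover data must be fed back in;
    dropping it truncates the result, which is the same class of
    leftover-buffer bug urllib3 2.6.2 fixes for brotli responses.
    """
    decoder = zlib.decompressobj()
    out = []
    chunk = payload
    while chunk:
        out.append(decoder.decompress(chunk, out_limit))
        chunk = decoder.unconsumed_tail  # leftover, still-compressed input
    out.append(decoder.flush())
    return b"".join(out)


data = b"x" * 10_000 + b"tail-marker"
assert stream_decompress(zlib.compress(data)) == data
```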

Why This Breaks in Prod

  • Triggered by an upgrade/regression window: urllib3 2.0.0 through 2.6.1 break when paired with brotli 1.2.0; 2.6.2 is the first fixed release.
  • Shows up under Python 3.12.12 in real deployments (not just unit tests).
  • Surfaces as: urllib3.exceptions.DecodeError: ('Received response with content-encoding: br, but failed to decode it.', error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))

Proof / Evidence

  • GitHub issue: #3734
  • Fix PR: https://github.com/urllib3/urllib3/pull/3736
  • First fixed release: 2.6.2
  • Affected versions: 2.0.0 through 2.6.1 (when paired with brotli 1.2.0)
  • Reproduced locally: No (not executed)
  • Last verified: 2026-02-09
  • Confidence: 0.95
  • Did this fix it?: Yes (upstream fix exists)

Discussion

High-signal excerpts from the issue thread (symptoms, repros, edge-cases).

“Please check if #3736 fixes your issue. Thanks for the reproducer, it doesn't fail with the fix.”
@illia-v · 2025-12-09 · source
“@Cycloctane that's the same logic as in my fix, thanks 👍🏻 I reused your approach to simplify #3736.”
@illia-v · 2025-12-09 · source
“So for what it's worth I'm experiencing this on Debian sid with: So it might be more related to the brotli update than an urllib…”
@corsac-s · 2025-12-09 · source
“Indeed this seems to be an error related to brotli 1.2.0 update. Sample code results from the original issue: With brotli 1.1.0: With brotli 1.2.0:”
@kesara · 2025-12-09 · source

Failure Signature (Search String)

  • urllib3.exceptions.DecodeError: ('Received response with content-encoding: br, but failed to decode it.', error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))

Error Message

error.txt
urllib3.exceptions.DecodeError: ('Received response with content-encoding: br, but failed to decode it.', error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))

Stack trace

error.txt
urllib3: 2.6.1, brotli: 1.2.0
Data: 14,565,000 -> 549,724 bytes (26x)
Exception in thread Thread-1 (serve):
FAILED: ('Received response with content-encoding: br, but failed to decode it.', error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))
Traceback (most recent call last):
  File "/opt/homebrew/Cellar/[email protected]/3.12.12/Frameworks/Python.framework/Versions/3.12/lib/python3.12/threading.py", line 1075, in _bootstrap_inner

Minimal Reproduction

repro.py
#!/usr/bin/env python3
"""Minimal reproduction: brotli decode bug with urllib3 2.6.x + brotli 1.2.0"""
import hashlib
import socket
import threading

import brotli
import requests


def main() -> int:
    from importlib.metadata import version

    print(f"urllib3: {version('urllib3')}, brotli: {version('brotli')}")

    # Generate ~15MB data with moderate compressibility (~27x ratio)
    data = b"".join(
        f"{hashlib.sha256(str(i).encode()).hexdigest()}{'a' * 900}{i:06d}\n".encode()
        for i in range(15000)
    )
    compressed = brotli.compress(data)
    print(f"Data: {len(data):,} -> {len(compressed):,} bytes ({len(data) // len(compressed)}x)")

    # Build chunked HTTP response
    resp = b"HTTP/1.1 200 OK\r\nContent-Encoding: br\r\nTransfer-Encoding: chunked\r\n\r\n"
    for i in range(0, len(compressed), 32768):
        chunk = compressed[i : i + 32768]
        resp += f"{len(chunk):x}\r\n".encode() + chunk + b"\r\n"
    resp += b"0\r\n\r\n"

    # Start mock server
    ready = threading.Event()

    def serve(port: int) -> None:
        s = socket.socket()
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("127.0.0.1", port))
        s.listen(1)
        ready.set()
        c, _ = s.accept()
        c.recv(4096)
        for i in range(0, len(resp), 128):  # Small chunks trigger bug
            c.send(resp[i : i + 128])
        c.close()
        s.close()

    threading.Thread(target=serve, args=(18765,), daemon=True).start()
    ready.wait()
    try:
        r = requests.get("http://127.0.0.1:18765/", timeout=60)
        print(f"SUCCESS: {len(r.content):,} bytes")
        return 0
    except requests.exceptions.ContentDecodingError as e:
        print(f"FAILED: {e}")
        return 1


if __name__ == "__main__":
    exit(main())

Environment

  • Python: 3.12.12
  • urllib3: 2.6.1
  • brotli: 1.2.0

What Broke

Responses with large brotli-compressed data fail to decode, causing application errors.

Fix Options (Details)

Option A — Upgrade to fixed release (safe default, recommended)

pip install urllib3==2.6.2

When NOT to use: rarely; the previous behavior was a hard DecodeError, not something an application can usefully depend on. If you cannot upgrade urllib3 yet, the issue thread suggests pinning brotli to 1.1.0 as a stopgap.

Use when you can deploy the upstream fix. It is usually lower-risk than long-lived workarounds.
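If you gate deploys, a small predicate for the affected window can go into a startup check or CI assertion. This sketch is deliberately naive (numeric dotted versions only, no pre-release handling); the window is taken from this page's evidence:

```python
def affected(urllib3_version: str, brotli_version: str) -> bool:
    """True if the pair falls in the affected window reported above:
    urllib3 >= 2.0.0 and < 2.6.2, combined with brotli >= 1.2.0."""
    def parse(v: str) -> tuple[int, ...]:
        # Naive: numeric dotted versions only, no pre-release tags.
        return tuple(int(p) for p in v.split(".")[:3])

    u, b = parse(urllib3_version), parse(brotli_version)
    return (2, 0, 0) <= u < (2, 6, 2) and b >= (1, 2, 0)


# Wire it up against the live environment, e.g.:
# from importlib.metadata import version
# assert not affected(version("urllib3"), version("brotli"))
```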

Option D — Guard side-effects with OnceOnly (guardrail for duplicate side-effects)

Mitigate duplicate external side-effects under retries/timeouts/agent loops by gating the operation before calling external systems.

  • Place OnceOnly between your code/agent and real side-effects (Stripe, emails, CRM, APIs).
  • Use a stable key per side-effect (e.g., customer_id + action + idempotency_key).
  • Fail-safe: configure fail-open vs fail-closed based on blast radius and spend risk.
  • This does NOT fix data corruption; it only prevents duplicate side-effects.
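A stable key is what makes the gate deterministic: the same logical side-effect must always derive the same string. A minimal sketch of one way to build it (the field names and "gate:" prefix are illustrative, not part of any SDK):

```python
import hashlib


def idempotency_key(customer_id: str, action: str, request_id: str) -> str:
    """Derive a stable, fixed-length key for a side-effect gate.

    Hashing keeps the key opaque and bounded even when inputs are long,
    and any change to customer, action, or request yields a new key.
    """
    raw = f"{customer_id}:{action}:{request_id}"
    return "gate:" + hashlib.sha256(raw.encode("utf-8")).hexdigest()
```

Retries of the same request then map to one lock, while a genuinely new request gets a fresh key.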
onceonly.py
import os

from onceonly import OnceOnly

once = OnceOnly(api_key=os.environ["ONCEONLY_API_KEY"], fail_open=True)


def handle_webhook(event_id: str) -> dict:
    # Stable idempotency key per real side-effect.
    # Use a request id / job id / webhook delivery id / Stripe event id
    # (e.g. "evt_...") as event_id.
    key = f"stripe:webhook:{event_id}"
    res = once.check_lock(key=key, ttl=3600)
    if res.duplicate:
        return {"status": "already_processed"}
    # Safe to execute the side-effect exactly once.
    return handle_event(event_id)

See OnceOnly SDK

When NOT to use: Do not use this to hide logic bugs or data corruption. Use it to block duplicate external side-effects and enforce tool permissions/spend caps.

Fix reference: https://github.com/urllib3/urllib3/pull/3736

First fixed release: 2.6.2

Last verified: 2026-02-09. Validate in your environment.



Verify Fix

verify
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.


Prevention

  • Add a CI check that diffs key outputs after upgrades (OpenAPI schema snapshots, JSON payload shapes, CLI output).
  • Upgrade behind a canary and run integration tests against the canary before 100% rollout.
  • Add an integration test that fetches a large brotli-compressed chunked response (like the minimal reproduction below) so compression regressions in urllib3 or brotli are caught before rollout.
  • Alert on DecodeError by error string and endpoint to catch decoding regressions quickly.
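The first bullet above, diffing key outputs after upgrades, can be as small as snapshotting the structural shape of a JSON payload. A sketch (the `shape` helper is illustrative, not from any library):

```python
def shape(value):
    """Reduce a JSON-like value to its structural shape (types only),
    so CI can diff payload structure across dependency upgrades
    without churning on concrete values."""
    if isinstance(value, dict):
        return {k: shape(v) for k, v in sorted(value.items())}
    if isinstance(value, list):
        return [shape(value[0])] if value else []
    return type(value).__name__


before = {"id": 1, "tags": ["a", "b"], "meta": {"ok": True}}
after = {"id": 99, "tags": ["z"], "meta": {"ok": False}}
assert shape(before) == shape(after)  # same structure, different values
```

Store `shape(payload)` as a snapshot file and fail CI when it changes unexpectedly.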

Version Compatibility Table

Version    Status
2.0.0      Broken
2.6.1      Broken (reproduced in the trace above)
2.6.2      Fixed

Related Issues

No related fixes found.

Sources

We don’t republish the full GitHub discussion text. Use the links above for context.