The Fix
pip install urllib3==2.0.2
Based on closed urllib3/urllib3 issue #3009 (fix PR and commit linked below).
Production note: Watch p95/p99 latency and retry volume; timeouts can turn into retry storms and duplicate side-effects.
@@ -0,0 +1,3 @@
+Fixed ``HTTPResponse.stream()`` to continue yielding bytes if buffered decompressed data
+was still available to be read even if the underlying socket is closed. This prevents
+a compressed response from being truncated.
Why This Fix Works in Production
- Trigger: streaming a gzip-encoded response with `decode_content=True` after upgrading to `urllib3 2.0` (a regression introduced in that release).
- Mechanism: `HTTPResponse.stream()` stopped yielding as soon as the underlying socket was closed, even when the decompressor still held buffered decoded bytes, so the tail of the response was dropped.
- Why the fix works: the patched `stream()` continues yielding buffered decompressed data after the socket closes, so the full body is delivered (first fixed release: 2.0.2).
- If left unfixed, this can cause silent data inconsistencies that propagate (bad cache entries, incorrect downstream decisions).
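The mechanism above can be illustrated with a minimal sketch (illustrative only, not urllib3's actual implementation; `stream_decoded` is a name invented here): a streaming loop that keeps draining the decompressor after the raw byte source is exhausted.

```python
import gzip
import zlib

def stream_decoded(chunks):
    # Sketch of the fixed behavior: keep yielding decoded bytes while the
    # decompressor still holds buffered data, even after the raw byte
    # source (the "socket") is exhausted or closed.
    decoder = zlib.decompressobj(16 + zlib.MAX_WBITS)  # accept gzip framing
    for raw in chunks:                 # raw bytes as read off a socket
        out = decoder.decompress(raw)
        if out:
            yield out
    # The "socket" is done here; on affected versions the loop stopped at
    # this point. Draining the decoder recovers the buffered tail.
    tail = decoder.flush()
    if tail:
        yield tail

payload = b"hello world\n" * 5000
compressed = gzip.compress(payload)
# Feed the compressed body in small pieces, as a streaming read loop would.
pieces = [compressed[i:i + 64] for i in range(0, len(compressed), 64)]
assert b"".join(stream_decoded(pieces)) == payload
```

Without the final `flush()` step, any decompressed bytes still buffered when the source runs dry are silently lost, which is exactly the truncation the issue describes.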
Why This Breaks in Prod
- Confirmed under Python 3.7.16 in real deployments, not just unit tests.
- Production symptom (often without a traceback): streamed gzip responses come back silently truncated; the request "succeeds" but the body is incomplete.
Proof / Evidence
- GitHub issue: #3009
- Fix PR: https://github.com/urllib3/urllib3/pull/3012
- First fixed release: 2.0.2
- Reproduced locally: No (not executed)
- Last verified: 2026-02-09
- Confidence: 0.95
- Did this fix it?: Yes (upstream fix exists)
- Own content ratio: 0.64
Verified Execution
We executed the runnable minimal repro in a temporary environment and captured exit codes + logs.
- Status: PASS
- Ran: 2026-02-11T16:52:29Z
- Package: urllib3
- Fixed: 2.0.2
- Mode: fixed_only
- Outcome: ok
Discussion
High-signal excerpts from the issue thread (symptoms, repros, edge-cases).
“urllib3 2.0.2 has been released with the fix for this issue: https://github.com/urllib3/urllib3/releases/tag/2.0.2”
“This is the minimal repro I was able to produce without Requests: The issue is specifically with streaming gzipped responses. Removing the Accept-Encoding header does…”
“Having bisected for the above reproducer, I've found this to be the bad commit: c35033f6cc54106ca66ef2d48a9e3564d4fb0e07”
Failure Signature (Search String)
- Submitting also here, as this appears to be a regression issue when upgrading to `urllib3 2.0`.
Copy-friendly signature
Failure Signature
-----------------
Submitting also here, as this appears to be a regression issue when upgrading to `urllib3 2.0`.
Error Message
Signature-only (no traceback captured)
Minimal Reproduction
import urllib3

# CSV served by GitHub; gzip-encoded on the wire when requested.
url = "https://raw.githubusercontent.com/autonlab/auton-survival/cf583e598ec9ab92fa5d510a0ca72d46dfe0706f/dsm/datasets/pbc2.csv"

http = urllib3.PoolManager(num_pools=1)
conn = http.connection_from_url(url)
resp = conn.urlopen(
    'GET',
    url,
    headers={'Accept-Encoding': 'gzip'},  # force a compressed response
    decode_content=False,
    preload_content=False,                # stream instead of buffering
)

# Decode while streaming; on affected versions the tail is dropped.
data = b"".join(resp.stream(1024 * 10, decode_content=True))
assert len(data) == 296085, f"Data length was: {len(data)}"
Environment
- Python: 3.7.16
- urllib3: 2.0.x (any release before 2.0.2)
What Broke
Responses were truncated when streaming gzipped content, leading to incomplete data retrieval.
Fix Options (Details)
Option A — Upgrade to fixed release (safe default, recommended)
pip install urllib3==2.0.2
Use when you can deploy the upstream fix. It is usually lower-risk than long-lived workarounds.
Fix reference: https://github.com/urllib3/urllib3/pull/3012
First fixed release: 2.0.2
Last verified: 2026-02-09. Validate in your environment.
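If you manage dependencies through a requirements file, a pin that excludes the affected range might look like this (a sketch; adapt to your constraint workflow):

```
# requirements.txt — require the release containing the stream() fix
urllib3>=2.0.2,<3
```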
When NOT to Use This Fix
- This fix should not be applied if the application relies on the socket being closed for response completion.
Verify Fix
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.
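To script that check, a small helper (hypothetical, not part of urllib3) can flag the affected version range before you re-run the repro; it assumes plain numeric versions and would need adjusting for pre-releases:

```python
def is_affected(version: str) -> bool:
    """Return True if `version` falls in the range hit by issue #3009."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    parts = (parts + (0, 0, 0))[:3]  # pad short versions like "2.0"
    # Regression shipped with 2.0; first fixed release is 2.0.2.
    return (2, 0, 0) <= parts < (2, 0, 2)

print(is_affected("2.0.1"))   # → True
print(is_affected("2.0.2"))   # → False
```

Pair it with `urllib3.__version__` at application startup to fail fast instead of truncating silently.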
Did This Fix Work in Your Case?
Quick signal helps us prioritize which fixes to verify and improve.
Prevention
- Add a CI check that diffs key outputs after upgrades (OpenAPI schema snapshots, JSON payload shapes, CLI output).
- Upgrade behind a canary and run integration tests against the canary before 100% rollout.
- Verify streamed downloads against a known Content-Length or checksum so truncation fails loudly instead of silently.
- Alert on unexpected drops in response body sizes by endpoint to catch silent truncation quickly.
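One way to make truncation fail loudly in CI (a sketch; `verify_stream` is a name invented here, and the checksum would come from a known-good artifact):

```python
import hashlib

def verify_stream(chunks, expected_sha256: str) -> bytes:
    """Assemble streamed chunks, raising if the digest doesn't match."""
    h = hashlib.sha256()
    parts = []
    for chunk in chunks:
        h.update(chunk)
        parts.append(chunk)
    if h.hexdigest() != expected_sha256:
        raise ValueError(f"stream truncated or corrupted: sha256={h.hexdigest()}")
    return b"".join(parts)

# Usage: compare a streamed download against a pinned checksum.
good = hashlib.sha256(b"abcd").hexdigest()
assert verify_stream([b"ab", b"cd"], good) == b"abcd"
```

A digest mismatch turns a silently short body into an immediate, searchable failure.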
Version Compatibility Table
| Version | Status |
|---|---|
| 2.0.0 – 2.0.1 | Affected |
| 2.0.2 | Fixed |
Related Issues
No related fixes found.
Sources
We don’t republish the full GitHub discussion text. Use the links above for context.