The Fix
Upgrade to version 0.13.0 or later.
Based on closed encode/httpx issue #413, fixed by PR #575 (linked under Proof / Evidence).
Production note: Watch p95/p99 latency and retry volume; timeouts can turn into retry storms and duplicate side-effects.
The fix (PR #575) adds an internal LineDecoder ("Handles incrementally reading lines from text") and exposes line-by-line iteration on streamed responses, removing the need for a hand-rolled wrapper.
Why This Fix Works in Production
- Trigger: consuming a long-lived streaming response (here, a Kubernetes watch endpoint) line by line, with read_timeout=None so the stream stays open indefinitely.
- Mechanism: httpx exposed only chunk-oriented streaming, so every caller had to hand-roll a line-buffering wrapper; the one in issue #413 is accidentally quadratic for lines that span many chunks.
- Why the fix works: httpx 0.13.0 added built-in line iteration over streamed responses (exposed as Response.iter_lines() in current releases, backed by an incremental LineDecoder), addressing the feature request in issue #413.
- If left unfixed, the quadratic wrapper can inflate tail latency under load and surface as timeouts/retries (amplifying incident impact).
Why This Breaks in Prod
- The library exposed only chunk-based streaming, so line-oriented payloads (newline-delimited JSON, such as Kubernetes watch output) forced every caller to hand-roll a buffering wrapper.
- Production symptom (often without a traceback): a custom wrapper around resp.stream_text() on a long-lived stream, typically configured with read_timeout=None. The failure signature below is a configuration pattern to grep for, not an error message.
Proof / Evidence
- GitHub issue: #413
- Fix PR: https://github.com/encode/httpx/pull/575
- First fixed release: 0.13.0
- Reproduced locally: No (not executed)
- Last verified: 2026-02-09
- Confidence: 0.85
- Did this fix it?: Yes (upstream fix exists)
- Own content ratio: 0.55
Discussion
High-signal excerpts from the issue thread (symptoms, repros, edge-cases).
“Related to #24, #183”
“Thanks for sharing this snippet @Hanaasagi. I think you’ll understand though that this particular piece of functionality is out of scope for HTTPX, as supported…”
“(For your immediate need, is it an option to use the official Kubernetes Python client?) Note that your wrapping algorithm is accidentally quadratic”
Failure Signature (Search String)
- timeout = httpx.TimeoutConfig(
- connect_timeout=5, read_timeout=None, write_timeout=5
Copy-friendly signature
Failure Signature
-----------------
timeout = httpx.TimeoutConfig(
    connect_timeout=5, read_timeout=None, write_timeout=5
)
Error Message
-------------
Signature-only (no traceback captured)
Minimal Reproduction
import json

import httpx

# Targets the pre-0.13 API (httpx.TimeoutConfig, Response.stream_text()).
class StreamWrapper(object):
    def __init__(self, stream):
        self._stream = stream

    def __iter__(self):
        pending = ""
        for chunk in self._stream:
            chunk = pending + chunk
            lines = chunk.splitlines()
            # If the chunk ends mid-line, hold the partial line back
            # until the next chunk arrives.
            if chunk and lines and lines[-1] and lines[-1][-1] == chunk[-1]:
                pending = lines.pop()
            else:
                pending = ""
            for line in lines:
                yield line
        if pending:
            yield pending

timeout = httpx.TimeoutConfig(
    connect_timeout=5, read_timeout=None, write_timeout=5
)
resp = httpx.get(
    "http://127.0.0.1:18081/api/v1/watch/namespaces/default/pods",
    stream=True,
    timeout=timeout,
)
for chunk in StreamWrapper(resp.stream_text()):
    print(json.loads(chunk))
What Broke
Users needed to create custom wrappers for line-by-line stream processing, leading to increased complexity.
Why It Broke
The existing API did not support iterating streams line by line, requiring a wrapper implementation
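The core of the upstream fix is an incremental line decoder. A simplified, stdlib-only sketch of that technique (not the actual httpx implementation, which handles more newline conventions):

```python
class LineDecoder:
    """Incrementally split streamed text into complete lines.

    A simplified sketch of the technique used by the upstream fix; the
    real httpx implementation handles additional newline variants.
    """

    def __init__(self):
        self.buffer = ""

    def decode(self, text):
        # Append new text, emit complete lines, keep the partial tail.
        self.buffer += text
        lines = self.buffer.splitlines(keepends=True)
        if lines and not lines[-1].endswith(("\n", "\r")):
            self.buffer = lines.pop()
        else:
            self.buffer = ""
        return [line.rstrip("\r\n") for line in lines]

    def flush(self):
        # Emit any trailing partial line once the stream ends.
        tail, self.buffer = self.buffer, ""
        return [tail] if tail else []


decoder = LineDecoder()
out = decoder.decode("a\nb")   # "b" is held back as a partial line
out += decoder.decode("c\n")   # completes "bc"
out += decoder.flush()
print(out)
```

Unlike the quadratic wrapper in the reproduction, this buffers only the incomplete tail between chunks, so work per chunk stays proportional to the chunk size.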
Fix Options (Details)
Option A — Upgrade to fixed release (safe default, recommended)
Upgrade to version 0.13.0 or later.
Use when you can deploy the upstream fix. It is usually lower-risk than long-lived workarounds.
Fix reference: https://github.com/encode/httpx/pull/575
First fixed release: 0.13.0
Last verified: 2026-02-09. Validate in your environment.
When NOT to Use This Fix
- Upgrading is not an option when another dependency pins httpx below 0.13.0; in that case a hand-written wrapper is still required, and it should avoid the accidentally quadratic buffering called out in the issue thread.
Verify Fix
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.
Did This Fix Work in Your Case?
Quick signal helps us prioritize which fixes to verify and improve.
Prevention
- Add a CI check that diffs key outputs after upgrades (OpenAPI schema snapshots, JSON payload shapes, CLI output).
- Upgrade behind a canary and run integration tests against the canary before 100% rollout.
- Make timeouts explicit and test them (unit + integration) to avoid silent behavior changes.
- Instrument retries (attempt count + reason) and alert on spikes to catch dependency slowdowns.
Version Compatibility Table
| Version | Status |
|---|---|
| 0.13.0 | Fixed |
Related Issues
No related fixes found.
Sources
We don’t republish the full GitHub discussion text. Use the links above for context.