
The Fix

pip install redis==4.5.4

Based on closed redis/redis-py issue #2665; fix PR #2674 linked below.

Production note: This usually shows up under retries/timeouts. Treat it as a side-effect risk until you can verify behavior with a canary + real traffic.

@@ -8,7 +8,7 @@
     keywords=["Redis", "key-value store", "database"],
     license="MIT",
-    version="4.5.3",
+    version="4.5.4",
     packages=find_packages(

Why This Fix Works in Production

  • Trigger: an in-flight async command is cancelled (e.g. by a timeout or retry wrapper); the logged symptom is "Task exception was never retrieved".
  • Mechanism: cancelling an async Redis command does not properly close the connection, so it is reused with an unread response still pending.
  • Why the fix works: the linked change ships redis-py 4.5.4 (version bump 4.5.3 → 4.5.4), in which a command cancelled mid-flight disconnects its connection instead of returning it to the pool, so the pending reply cannot be served to a later command. (First fixed release: 4.5.4.)
Production impact:
  • If left unfixed, a reused connection can hand a stale reply to the next command, causing silent data inconsistencies that propagate (bad cache entries, incorrect downstream decisions).
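The fix's core idea can be sketched with a toy connection. This is illustrative only; `Connection`, `execute`, and `disconnect` below are stand-ins for the pattern, not redis-py's real classes:

```python
import asyncio


class Connection:
    """Toy stand-in for an async Redis connection (not the redis-py API)."""

    def __init__(self):
        self.healthy = True

    async def read_response(self):
        await asyncio.sleep(1)  # simulate a slow server reply
        return b"+OK"

    def disconnect(self):
        self.healthy = False  # a closed connection can't serve a stale reply later


async def execute(conn):
    try:
        return await conn.read_response()
    except asyncio.CancelledError:
        # The fix's pattern: on cancellation mid-command the reply is still in
        # flight, so tear the connection down instead of returning it to the pool.
        conn.disconnect()
        raise


async def main():
    conn = Connection()
    t = asyncio.create_task(execute(conn))
    await asyncio.sleep(0.01)
    t.cancel()
    try:
        await t
    except asyncio.CancelledError:
        pass
    return conn.healthy


print(asyncio.run(main()))  # False: the cancelled command closed its connection
```

Before 4.5.4, the equivalent of `disconnect()` was missing on this path, so the connection stayed "healthy" in the pool while still carrying an unread response.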

Why This Breaks in Prod

  • Shows up in real deployments, not just unit tests (originally reported on Python 3.8; the stack trace below is from Python 3.11).
  • Canceling an async Redis command does not properly close the connection, leaving it in an unsafe state
  • Surfaces as: Task exception was never retrieved

Proof / Evidence

  • GitHub issue: #2665
  • Fix PR: https://github.com/redis/redis-py/pull/2674
  • First fixed release: 4.5.4
  • Reproduced locally: No (not executed)
  • Last verified: 2026-02-09
  • Confidence: 0.95
  • Did this fix it?: Yes (upstream fix exists)

Discussion

High-signal excerpts from the issue thread (symptoms, repros, edge-cases).

“It'll be backport to 4.3 and 4.4. @dvora-h can you include this issues # in each of the branch tickets that are about to be…”
@chayim · 2023-03-29 · confirmation
“Please ping me when released as I have to test them as well”
@auvipy · 2023-03-29 · confirmation
“I agree - it's why we killed this off in 4.4 - and announced early on that 4.3.x would be the last version that supported…”
@chayim · 2023-03-29
“Is this the same issue I'm encountering here, or is mine different? Result: That a task's exception was never retrieved speaks of a broader problem.”
@agronholm · 2023-04-19

Failure Signature (Search String)

  • Task exception was never retrieved

Error Message

Stack trace
error.txt
Error Message
-------------
Task exception was never retrieved
future: <Task finished name='Task-3' coro=<Redis._try_send_command_parse_response() done, defined at /usr/local/lib/python3.11/site-packages/redis/asyncio/client.py:503> exception=ConnectionError('Connection closed by server.')>
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/client.py", line 505, in _try_send_command_parse_response
    return await conn.retry.call_with_retry(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/retry.py", line 62, in call_with_retry
    await fail(error)
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/client.py", line 501, in _disconnect_raise
    raise error
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    return await do()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/client.py", line 488, in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/redis/asyncio/client.py", line 544, in parse_response
    response = await connection.read_response()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/redi ... (truncated) ...

Minimal Reproduction

repro.py
import asyncio

from redis.asyncio import Redis


async def pipe(reader: asyncio.StreamReader, writer: asyncio.StreamWriter, delay: float, name=''):
    while data := await reader.read(1000):
        # print(name, 'received:', data)
        await asyncio.sleep(delay)
        writer.write(data)
        await writer.drain()


class DelayProxy:
    def __init__(self, addr, redis_addr, delay: float):
        self.addr = addr
        self.redis_addr = redis_addr
        self.delay = delay

    async def start(self):
        server = await asyncio.start_server(self.handle, *self.addr)
        asyncio.create_task(server.serve_forever())

    async def handle(self, reader, writer):
        # establish connection to redis
        redis_reader, redis_writer = await asyncio.open_connection(*self.redis_addr)
        pipe1 = asyncio.create_task(pipe(reader, redis_writer, self.delay, 'to redis:'))
        pipe2 = asyncio.create_task(pipe(redis_reader, writer, self.delay, 'from redis:'))
        await asyncio.gather(pipe1, pipe2)


async def main():
    # create a tcp socket proxy that relays data to Redis and back,
    # inserting 0.1 seconds of delay
    dp = DelayProxy(addr=('localhost', 6380), redis_addr=('localhost', 6379), delay=0.1)
    await dp.start()

    # note that we connect to the proxy, rather than to Redis directly
    async with Redis(host='localhost', port=6380) as r:
        await r.set('foo', 'foo')
        await r.set('bar', 'bar')

        t = asyncio.create_task(r.get('foo'))
        await asyncio.sleep(0.050)
        t.cancel()
        try:
            await t
            print('try again, we did not cancel the task in time')
        except asyncio.CancelledError:
            print('managed to cancel the task, connection is left open with unread response')

        print('bar:', await r.get('bar'))
        print('ping:', await r.ping())
        print('foo:', await r.get('foo'))


if __name__ == '__main__':
    asyncio.run(main())

Environment

  • Python: 3.8 (originally reported; the stack trace above is from 3.11)

What Broke

Cancelled commands leave their connections open with an unread response pending; reusing such a connection can return stale replies and leak resources.

Why It Broke

Canceling an async Redis command does not properly close the connection, leaving it in an unsafe state
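Concretely, Redis protocol replies are matched to commands only by order on the connection, so an unread reply shifts every later command off by one. A toy FIFO connection (hypothetical names, not the redis-py API) makes this visible:

```python
from collections import deque


class ToyConn:
    """Toy model of a Redis connection: replies come back strictly in send order."""

    def __init__(self):
        self.pending = deque()

    def send(self, cmd):
        self.pending.append(cmd)

    def recv_reply(self):
        # The server answers the OLDEST unanswered command.
        return f"reply-to:{self.pending.popleft()}"


conn = ToyConn()
conn.send("GET foo")
# The client cancels GET foo without reading its reply, then reuses the connection:
conn.send("GET bar")
print(conn.recv_reply())  # reply-to:GET foo  <- stale reply delivered to GET bar
```

This is exactly the failure the minimal reproduction below triggers: after the cancelled `GET foo`, the next command can receive `foo`'s reply.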

Fix Options (Details)

Option A — Upgrade to fixed release (safe default, recommended)

pip install redis==4.5.4

When NOT to use: Do not use if it changes public behavior or if the failure cannot be reproduced.

Use when you can deploy the upstream fix. It is usually lower-risk than long-lived workarounds.
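One way to keep the upgrade from silently regressing is a startup guard that refuses known-broken versions. `assert_fixed_redis` is a hypothetical helper, and the parse assumes plain `X.Y.Z` version strings:

```python
# Startup guard: refuse to run on a redis-py known to leak connections
# when async commands are cancelled.
FIRST_FIXED = (4, 5, 4)


def assert_fixed_redis(installed: str) -> bool:
    """Return True if the installed redis-py version includes the fix."""
    parts = tuple(int(x) for x in installed.split(".")[:3])
    return parts >= FIRST_FIXED


print(assert_fixed_redis("4.5.3"))  # False
print(assert_fixed_redis("4.5.4"))  # True
```

In a real service you would feed it `importlib.metadata.version("redis")` at boot and fail fast on `False`.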

Option D — Guard side-effects with OnceOnly (guardrail for side-effects)

Mitigate duplicate external side-effects under retries/timeouts/agent loops by gating the operation before calling external systems.

  • Place OnceOnly between your code/agent and real side-effects (Stripe, emails, CRM, APIs).
  • Use a stable key per side-effect (e.g., customer_id + action + idempotency_key).
  • Fail-safe: configure fail-open vs fail-closed based on blast radius and spend risk.
  • This does NOT fix data corruption; it only prevents duplicate side-effects.
onceonly.py
import os

from onceonly import OnceOnly

once = OnceOnly(api_key=os.environ["ONCEONLY_API_KEY"], fail_open=True)


def process_event(event_id):
    # Stable idempotency key per real side-effect.
    # Use a request id / job id / webhook delivery id / Stripe event id, etc.
    key = f"stripe:webhook:{event_id}"
    res = once.check_lock(key=key, ttl=3600)
    if res.duplicate:
        return {"status": "already_processed"}
    # Safe to execute the side-effect exactly once.
    handle_event(event_id)  # your application's handler

See OnceOnly SDK

When NOT to use: Do not use this to hide logic bugs or data corruption. Use it to block duplicate external side-effects and enforce tool permissions/spend caps.

Fix reference: https://github.com/redis/redis-py/pull/2674

First fixed release: 4.5.4

Last verified: 2026-02-09. Validate in your environment.


When NOT to Use This Fix

  • Do not use if it changes public behavior or if the failure cannot be reproduced.
  • Do not use this to hide logic bugs or data corruption. Use it to block duplicate external side-effects and enforce tool permissions/spend caps.

Verify Fix

verify
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.

Did This Fix Work in Your Case?

Quick signal helps us prioritize which fixes to verify and improve.

Prevention

  • Capture the exact failing error string in logs and tests so you can reproduce via a minimal script.
  • Pin production dependencies and upgrade only with a reproducible test that hits the failing path.
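The first prevention bullet can be automated with a scan for the failure signature in captured logs. A sketch; `logs` here is inlined sample data rather than a real log source:

```python
SIGNATURE = "Task exception was never retrieved"


def find_signature(lines):
    # Return the log lines containing the known failure signature.
    return [ln for ln in lines if SIGNATURE in ln]


logs = [
    "INFO ready",
    "ERROR Task exception was never retrieved future: <Task ...>",
]
print(len(find_signature(logs)))  # 1
```

Wiring this into a post-deploy check (or a pytest caplog assertion) turns the signature into a regression test instead of a production surprise.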

Version Compatibility Table

Version    Status
4.5.4      Fixed

Related Issues

No related fixes found.

Sources

We don’t republish the full GitHub discussion text. Use the links above for context.