The Fix
pip install "celery>=5.3.0"  # fix first landed in pre-release 5.3.0a1; prefer the stable release
Based on closed celery/celery issue #9174; the upstream fix PR and first fixed release are linked below.
Production note: This usually shows up under retries/timeouts. Treat it as a side-effect risk until you can verify behavior with a canary + real traffic.
Why This Fix Works in Production
- Trigger: [2024-08-05 14:49:52,482: WARNING/ForkPoolWorker-7] Task test_current_user[606303f0-16fc-4eff-9dea-8355f503ed8f] reject requeue=False: 'headers'
- Mechanism: on retry, Celery rebuilt the task signature without the custom headers, so a base-class `apply_async` that indexed `options["headers"]` raised `KeyError: 'headers'` and the task was rejected
- Why the fix works: the upstream patch propagates custom headers into the retried signature, so the key is present on every attempt (first fixed release: 5.3.0a1)
- If left unfixed, this can cause silent data inconsistencies that propagate (bad cache entries, incorrect downstream decisions).
Why This Breaks in Prod
- Reported against Python 3.9 in a real deployment (not just unit tests).
- Custom headers set on the original dispatch are not propagated when the task retries, so retry-time code that depends on them fails.
- Surfaces as: [2024-08-05 14:49:52,482: WARNING/ForkPoolWorker-7] Task test_current_user[606303f0-16fc-4eff-9dea-8355f503ed8f] reject requeue=False: 'headers'
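The failure mechanism can be sketched without Celery. The names below are illustrative, not Celery's API; the point is that a dispatch helper which indexes `options["headers"]` works on the first send but raises `KeyError` on a retry whose options were rebuilt without the custom header, matching the traceback further down:

```python
def apply_async_unsafe(options):
    # Mirrors the pattern in the traceback: assumes 'headers' is always present.
    headers = options["headers"]  # raises KeyError: 'headers' on retry
    return {"sent_headers": headers}

def apply_async_safe(options):
    # Defensive variant for affected versions: tolerate a missing key.
    headers = options.get("headers") or {}
    return {"sent_headers": headers}

first_dispatch = {"headers": {"user_id": 42}}
retry_dispatch = {}  # pre-5.3.0a1: custom headers dropped from the retry signature

print(apply_async_safe(first_dispatch))  # {'sent_headers': {'user_id': 42}}
print(apply_async_safe(retry_dispatch))  # {'sent_headers': {}}
```

The defensive `.get` only silences the `KeyError`; the headers are still absent on the retry, which is why the upgrade is the real fix.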
Proof / Evidence
- GitHub issue: #9174
- Fix PR: https://github.com/celery/celery/pull/7555
- First fixed release: 5.3.0a1
- Reproduced locally: No (not executed)
- Last verified: 2026-02-09
- Confidence: 0.75
- Did this fix it?: Yes (upstream fix exists)
- Own content ratio: 0.38
Discussion
High-signal excerpts from the issue thread (symptoms, repros, edge-cases).
“well I made silly mistake forgot include the headers argument in the base class apply_async method, adding it sovled my issue.”
“I have observed some inconsistencies in the documentation for the apply_async method of the Celery Task class”
Failure Signature (Search String)
- [2024-08-05 14:49:52,482: WARNING/ForkPoolWorker-7] Task test_current_user[606303f0-16fc-4eff-9dea-8355f503ed8f] reject requeue=False: 'headers'
Error Message
-------------
[2024-08-05 14:49:52,482: WARNING/ForkPoolWorker-7] Task test_current_user[606303f0-16fc-4eff-9dea-8355f503ed8f] reject requeue=False: 'headers'

Stack trace
-----------
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/celery/app/autoretry.py", line 38, in run
return task._orig_run(*args, **kwargs)
File "/code/projects/tasks.py", line 71, in test_current_user
raise Exception()
Exception
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/celery/app/task.py", line 753, in retry
S.apply_async()
File "/usr/local/lib/python3.9/site-packages/celery/canvas.py", line 400, in apply_async
return _apply(args, kwargs, **options)
File "/code/core/celery.py", line 324, in apply_async
super().apply_async(args, kwargs, task_id, producer, link, link_error, shadow, headers=headers, **options)
KeyError: 'headers'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/celery/app/trace.py", line 453, in trace_task
R = retval = fun(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/sentry_sdk/integrations/celery.py", line 306, in _inner
reraise(*exc_info)
File "/usr/local/lib/python3.9/site-packages/sentry_sdk/_compat.py", line 127, in reraise
... (truncated) ...
Minimal Reproduction
@CELERY_APP.task(base=BaseCurrentUserTask, bind=True, name="test_current_user", retry_kwargs={"max_retries": 2})
def test_current_user(self):
from time import sleep
try:
if self.request.retries < self.max_retries:
sleep(2)
raise Exception()
else:
return "Last retry, no exception raised"
except Exception as e:
raise self.retry(exc=e)
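Stripped of Celery, the reproduction's control flow is just "raise until the final attempt". This plain-Python sketch (hypothetical names, no broker) shows that with max_retries=2 the task body runs three times and only the last run returns; on affected versions each re-enqueue is where the custom headers were lost:

```python
def run_with_retries(max_retries=2):
    retries = 0  # Celery exposes this as self.request.retries
    while True:
        try:
            if retries < max_retries:
                raise Exception("forced failure")
            return "Last retry, no exception raised", retries
        except Exception:
            # Celery's self.retry() re-enqueues the task here; on versions
            # before 5.3.0a1 the re-enqueued signature dropped custom headers.
            retries += 1

print(run_with_retries())  # ('Last retry, no exception raised', 2)
```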
Environment
- Python: 3.9
What Broke
Tasks failed with KeyError when custom headers were set and retried.
Why It Broke
Custom headers set at dispatch were not propagated when the task retried, so the retry's `apply_async` call could not find the `headers` key.
Fix Options (Details)
Option A — Upgrade to fixed release (safe default, recommended)
pip install "celery>=5.3.0"  # fix first landed in pre-release 5.3.0a1
Use this when you can deploy the upstream fix; it is usually lower-risk than long-lived workarounds.
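To confirm which side of the fix a deployment is on, compare the installed version against 5.3.0. The parser below is a deliberately minimal sketch: it keeps only the numeric major.minor.patch prefix, which is sufficient here because the fix landed in the first 5.3.0 pre-release. In real code prefer `packaging.version.Version`, which handles pre-release ordering properly.

```python
import re

FIXED = (5, 3, 0)  # fix first shipped in 5.3.0a1

def is_fixed(version):
    # Grab the numeric major.minor.patch prefix; "5.3.0a1" parses as (5, 3, 0).
    m = re.match(r"(\d+)\.(\d+)\.(\d+)", version)
    return bool(m) and tuple(map(int, m.groups())) >= FIXED

print(is_fixed("5.2.7"))    # False: affected
print(is_fixed("5.3.0a1"))  # True: first fixed release
print(is_fixed("5.4.0"))    # True
```

In a deployment check you would feed it `importlib.metadata.version("celery")`.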
Option D — Guard side-effects with OnceOnly (guardrail for duplicate side-effects)
Mitigate duplicate external side-effects under retries/timeouts/agent loops by gating the operation before calling external systems.
- Place OnceOnly between your code/agent and real side-effects (Stripe, emails, CRM, APIs).
- Use a stable key per side-effect (e.g., customer_id + action + idempotency_key).
- Fail-safe: configure fail-open vs fail-closed based on blast radius and spend risk.
- This does NOT fix data corruption; it only prevents duplicate side-effects.
from onceonly import OnceOnly
import os

once = OnceOnly(api_key=os.environ["ONCEONLY_API_KEY"], fail_open=True)

def process_stripe_webhook(event_id):
    # Stable idempotency key per real side-effect.
    # Use a request id / job id / webhook delivery id / Stripe event id, etc.
    key = f"stripe:webhook:{event_id}"
    res = once.check_lock(key=key, ttl=3600)
    if res.duplicate:
        return {"status": "already_processed"}
    # Safe to execute the side-effect exactly once.
    handle_event(event_id)
    return {"status": "processed"}
Fix reference: https://github.com/celery/celery/pull/7555
First fixed release: 5.3.0a1
Last verified: 2026-02-09. Validate in your environment.
When NOT to Use This Fix
- This fix is not applicable if the task does not use custom headers.
- (Option D) Do not use the guardrail to hide logic bugs or data corruption; use it only to block duplicate external side-effects and enforce tool permissions/spend caps.
Verify Fix
Re-run the minimal reproduction on your broken version, then apply the fix and re-run.
Prevention
- Capture the exact failing error string in logs and tests so you can reproduce via a minimal script.
- Pin production dependencies and upgrade only with a reproducible test that hits the failing path.
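The first prevention bullet can be automated: scan worker logs for the failure signature so a regression trips a test instead of a pager. The regex below is an assumption keyed to the log line quoted in this page (task name and id will differ per deployment):

```python
import re

# Matches: Task <name>[<uuid>] reject requeue=False: 'headers'
SIGNATURE = re.compile(r"Task \S+\[[0-9a-f-]+\] reject requeue=False: 'headers'")

def has_headers_failure(log_text):
    return SIGNATURE.search(log_text) is not None

log_line = ("[2024-08-05 14:49:52,482: WARNING/ForkPoolWorker-7] "
            "Task test_current_user[606303f0-16fc-4eff-9dea-8355f503ed8f] "
            "reject requeue=False: 'headers'")
print(has_headers_failure(log_line))         # True
print(has_headers_failure("all good here"))  # False
```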
Version Compatibility Table
| Version | Status |
|---|---|
| 5.3.0a1 | Fixed |
Related Issues
No related fixes found.
Sources
We don’t republish the full GitHub discussion text. Use the links above for context.