The Fix
pip install celery==5.3.6
Based on closed celery/celery issue #8625; the fix PR and commit are linked below.
Production note: This usually shows up under retries/timeouts. Treat it as a side-effect risk until you can verify behavior with a canary + real traffic.
@@ -104,7 +104,7 @@ def __init__(self, *args, **kwargs):
         headers = {}
         headers.update(*args, **kwargs)
-        celery_keys = {*Context.__dict__.keys(), 'lang', 'task', 'argsrepr', 'kwargsrepr'}
+        celery_keys = {*Context.__dict__.keys(), 'lang', 'task', 'argsrepr', 'kwargsrepr', 'compression'}
         for key in celery_keys:
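The one-line patch is easier to follow with the surrounding logic in view. As a hedged sketch (hypothetical names, not Celery's actual code): keys in `celery_keys` are treated as protocol fields and stripped out before building custom headers, and anything left over is echoed back verbatim when the task is requeued. With `'compression'` missing from the set, the header survives the retry while the body is re-published uncompressed:

```python
# Sketch of the header-splitting behavior the patch changes (hypothetical
# names; the real logic lives in Celery's task/amqp handling).
CELERY_KEYS_BEFORE = {'lang', 'task', 'argsrepr', 'kwargsrepr'}
CELERY_KEYS_AFTER = CELERY_KEYS_BEFORE | {'compression'}

def custom_headers(message_headers, celery_keys):
    # Known protocol keys are dropped; whatever survives is forwarded
    # unchanged on retry, without re-compressing the body.
    return {k: v for k, v in message_headers.items() if k not in celery_keys}

incoming = {'task': 'main.task', 'compression': 'zlib'}
print(custom_headers(incoming, CELERY_KEYS_BEFORE))  # {'compression': 'zlib'}
print(custom_headers(incoming, CELERY_KEYS_AFTER))   # {}
```

Pre-fix, the stray `compression` header tells the worker to decompress a body that was never compressed; post-fix, the key is recognized and filtered out.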
Why This Fix Works in Production
- Trigger: [2023-11-08 12:23:26,377: CRITICAL/MainProcess] Can't decode message body: TypeError("a bytes-like object is required, not 'str'") [type:'application/json'…
- Mechanism: On retry, the task is requeued with its compression header intact, but the body is re-published uncompressed because 'compression' was missing from the set of recognized Celery header keys.
- Why the fix works: The patch adds the missing 'compression' key to the set of recognized Celery keys in task.py, so the header is no longer carried into custom headers on retry and the worker no longer tries to decompress an uncompressed body (first fixed release: 5.3.6).
- If left unfixed, every retried task published with compression fails to decode on the worker with a CRITICAL error, so retries are never executed and that work is lost.
Why This Breaks in Prod
- Reported in real deployments under Python 3.11, not just in unit tests.
- On retry, the task is requeued with its compression header intact while the body is re-sent uncompressed, so the worker cannot decode the message.
- Surfaces as: [2023-11-08 12:23:26,377: CRITICAL/MainProcess] Can't decode message body: TypeError("a bytes-like object is required, not 'str'") [type:'application/json' encoding:'utf-8'…
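The decode failure above can be reproduced without Celery at all: a worker that trusts the stray compression header hands the plain JSON body to zlib, and zlib refuses a str outright with exactly the TypeError from the log:

```python
import zlib

# The uncompressed JSON body from the failing message in the log (77 bytes):
body = '[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]'

# Decompressing a str fails immediately, matching the logged error.
# (Even body.encode() would fail, with zlib.error, since it was never compressed.)
try:
    zlib.decompress(body)
except TypeError as exc:
    print(exc)  # a bytes-like object is required, not 'str'
```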
Proof / Evidence
- GitHub issue: #8625
- Fix PR: https://github.com/celery/celery/pull/8633
- First fixed release: 5.3.6
- Reproduced locally: No (not executed)
- Last verified: 2026-02-09
- Confidence: 0.95
- Did this fix it?: Yes (upstream fix exists)
- Own content ratio: 0.44
Discussion
High-signal excerpts from the issue thread (symptoms, repros, edge-cases).
“Did you check the changelog/release notes to see if it is a expected change in v5.3.0?”
“It's probably related to #7555. The compression header is now sent when retrying whereas it wasn't before. But the payload is only compressed if task_compression…”
“Reviewing the PR, now it seems that it might need more work. Can you at least try to give it a shot?”
“I can confirm that reverting PR #7555 fixes the bug. The following patch also fixes the bug:”
Failure Signature (Search String)
- [2023-11-08 12:23:26,377: CRITICAL/MainProcess] Can't decode message body: TypeError("a bytes-like object is required, not 'str'") [type:'application/json' encoding:'utf-8'
Error Message
-------------
[2023-11-08 12:23:26,377: CRITICAL/MainProcess] Can't decode message body: TypeError("a bytes-like object is required, not 'str'") [type:'application/json' encoding:'utf-8' headers:{'lang': 'py', 'task': 'main.task', 'id': '6923bf1e-5008-4bee-9619-d8ac9ce9397a', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 1, 'timelimit': [None, None], 'root_id': '6923bf1e-5008-4bee-9619-d8ac9ce9397a', 'parent_id': '6923bf1e-5008-4bee-9619-d8ac9ce9397a', 'argsrepr': '()', 'kwargsrepr': '{}', 'origin': 'gen2321095@xenomorph', 'ignore_result': False, 'stamped_headers': None, 'stamps': {}, 'compression': 'application/x-gzip'}]
body: '[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)
Traceback (most recent call last):
File "/home/mathieu/.cache/pypoetry/virtualenvs/celery-compresion-issue-jmrSpK9z-py3.11/lib/python3.11/site-packages/kombu/message.py", line 95, in _reraise_error
reraise(*self.errors[0])
File "/home/mathieu/.cache/pypoetry/virtualenvs/celery-compresion-issue-jmrSpK9z-py3.11/lib/python3.11/site-packages/kombu/exceptions.py", line 35, in reraise
raise value
File "/home/mathieu/.cache/pypoetry/virtualenvs/celery-compresion-issue-jmrSpK9z-py3.11/lib/python3.11/site-packages/kombu/message.py", line 82, in __init__
body = decompress(body, compression)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
... (truncated) ...
Minimal Reproduction
from celery import Celery

app = Celery()

@app.task(autoretry_for=(Exception,), max_retries=1, retry_backoff=1)
def task():
    print("Executing task")
    raise Exception("whoopsie")

if __name__ == "__main__":
    app.send_task(name="main.task", compression="zlib")
Environment
- Python: 3.11
What Broke
Retried tasks are requeued with a compression header but an uncompressed body, so the worker fails to decode the message with a CRITICAL error and the retry never executes.
Why It Broke
On retry, the task is requeued with its compression header intact, but the body stays uncompressed because the 'compression' key was missing from the set of Celery keys stripped out of custom headers.
Fix Options (Details)
Option A — Upgrade to fixed release (safe default, recommended)
pip install celery==5.3.6
Use when you can deploy the upstream fix. It is usually lower-risk than long-lived workarounds.
Option D — Guard side-effects with OnceOnly (guardrail for side-effects)
Mitigate duplicate external side-effects under retries/timeouts/agent loops by gating the operation before calling external systems.
- Place OnceOnly between your code/agent and real side-effects (Stripe, emails, CRM, APIs).
- Use a stable key per side-effect (e.g., customer_id + action + idempotency_key).
- Fail-safe: configure fail-open vs fail-closed based on blast radius and spend risk.
- This does NOT fix data corruption; it only prevents duplicate side-effects.
Example snippet:
from onceonly import OnceOnly
import os

once = OnceOnly(api_key=os.environ["ONCEONLY_API_KEY"], fail_open=True)

def process_webhook(event_id):
    # Stable idempotency key per real side-effect.
    # Use a request id / job id / webhook delivery id / Stripe event id, etc.
    key = f"stripe:webhook:{event_id}"
    res = once.check_lock(key=key, ttl=3600)
    if res.duplicate:
        return {"status": "already_processed"}
    # Safe to execute the side-effect exactly once.
    return handle_event(event_id)
Fix reference: https://github.com/celery/celery/pull/8633
First fixed release: 5.3.6
Last verified: 2026-02-09. Validate in your environment.
When NOT to Use This Fix
- Do not apply this fix if the compression behavior is intentionally disabled for specific tasks.
- Do not use the OnceOnly guardrail to hide logic bugs or data corruption; use it only to block duplicate external side-effects and enforce tool permissions/spend caps.
Verify Fix
Re-run the minimal reproduction on your broken version to confirm the CRITICAL decode error appears in the worker log, then upgrade to celery 5.3.6 and re-run: the retried task should now decode and execute.
Prevention
- Capture the exact failing error string in logs and tests so you can reproduce via a minimal script.
- Pin production dependencies and upgrade only with a reproducible test that hits the failing path.
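To make the pinning advice actionable, here is a small sketch of a CI/startup guard that refuses releases older than the fix (assumes plain dotted versions; in real projects prefer `packaging.version` for full PEP 440 handling):

```python
def at_least(installed: str, required: str = "5.3.6") -> bool:
    """Numeric compare of dotted versions (sketch; use packaging.version in production)."""
    def as_tuple(v: str):
        # Take the first three dotted components as integers, e.g. "5.3.6" -> (5, 3, 6).
        return tuple(int(p) for p in v.split(".")[:3])
    return as_tuple(installed) >= as_tuple(required)

print(at_least("5.3.5"))  # False: predates the compression-retry fix
print(at_least("5.3.6"))  # True
```

In practice you would feed it the installed version, e.g. `at_least(importlib.metadata.version("celery"))`, and fail the build when it returns False.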
Version Compatibility Table
| Version | Status |
|---|---|
| 5.3.6 | Fixed |
Related Issues
No related fixes found.
Sources
We don’t republish the full GitHub discussion text. Use the links above for context.