Queues
Asynchronous background jobs with named queue bindings
Overview
Add asynchronous background job processing to your project backend with named queue bindings.
Queues are perfect for:
- Background processing after API responses
- Smoothing traffic spikes with batch processing
- Retryable workflows with dead-letter queues
- Long-running jobs (AI enrichment, media processing, etc.)
Getting Started
```shell
$ playcademy init        # Select "Yes" for Queues
$ playcademy queue init
```
This prompts for a queue key (e.g. `enrichment`) and scaffolds a handler file and a config entry for it. Run `playcademy queue init` again to add more queues. Each queue gets its own handler file and config entry.
Custom Routes Required
Queues require the Custom Routes integration to be enabled, since you send messages from API routes via `c.env`.
Configuration
Queue Bindings
Each key in `integrations.queues` creates a typed binding on `c.env`:
| Queue Key | Binding Name | Access In Routes |
|---|---|---|
| `enrichment` | `ENRICHMENT_QUEUE` | `c.env.ENRICHMENT_QUEUE` |
| `notifications` | `NOTIFICATIONS_QUEUE` | `c.env.NOTIFICATIONS_QUEUE` |
| `my-jobs` | `MY_JOBS_QUEUE` | `c.env.MY_JOBS_QUEUE` |
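The key-to-binding mapping above follows a simple convention: hyphens become underscores, the key is upper-cased, and a `_QUEUE` suffix is appended. A minimal sketch of that transformation (`toBindingName` is our illustrative name, not part of the CLI):

```typescript
// Derive the env binding name from a queue key (illustrative helper,
// not a Playcademy API): 'my-jobs' -> 'MY_JOBS_QUEUE'
function toBindingName(queueKey: string): string {
    return queueKey.replace(/-/g, '_').toUpperCase() + '_QUEUE'
}
```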
Queue Settings
Pass true for defaults, or an object to customize behavior:
```typescript
export default {
    name: 'My Game',
    integrations: {
        queues: {
            fast: true, // all defaults
            enrichment: {
                maxBatchSize: 10,
                maxRetries: 3,
                maxBatchTimeout: 5,
                maxConcurrency: 2,
                retryDelay: 60,
            },
        },
    },
}
```

| Setting | Type | Description | Default |
|---|---|---|---|
| `maxBatchSize` | number | Maximum messages per batch delivery | Platform default |
| `maxRetries` | number | Max retry attempts before dead-lettering | Platform default |
| `maxBatchTimeout` | number | Seconds to wait before delivering a partial batch | Platform default |
| `maxConcurrency` | number | Max concurrent consumer invocations | Platform default |
| `retryDelay` | number | Seconds to wait before retrying a failed message | Platform default |
| `deadLetterQueue` | string | Queue key to receive permanently failed messages | None |
Dead-Letter Queues
Route permanently failed messages to another queue for inspection or reprocessing:
```typescript
export default {
    name: 'My Game',
    integrations: {
        queues: {
            enrichment: {
                maxRetries: 3,
                deadLetterQueue: 'failed-jobs',
            },
            'failed-jobs': true,
        },
    },
}
```

Dead-Letter Queue Rules
- The `deadLetterQueue` value must reference another key in `integrations.queues`
- Circular references are detected and rejected at config validation time
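A dead-letter consumer is just another handler file. A minimal sketch of `server/queue/failed-jobs.ts` that logs failed messages for inspection; the interfaces below are simplified stand-ins for the real platform types so the sketch is self-contained:

```typescript
// Simplified stand-ins for the platform message types; the real
// types come from the Playcademy runtime.
interface DeadLetterMessage {
    id: string
    body: unknown
    ack(): void
}
interface DeadLetterBatch {
    messages: DeadLetterMessage[]
}

// server/queue/failed-jobs.ts: inspect permanently failed messages.
const failedJobsHandler = {
    async queue(batch: DeadLetterBatch): Promise<void> {
        for (const message of batch.messages) {
            // Log for inspection; always ack so dead letters don't themselves retry.
            console.log('dead-lettered message', message.id, message.body)
            message.ack()
        }
    },
}

export default failedJobsHandler
```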
Queue Key Rules
Queue keys must use lowercase letters, numbers, and hyphens only:
| Key | Valid | Why |
|---|---|---|
| `enrichment` | Yes | Lowercase letters only |
| `my-jobs` | Yes | Hyphens allowed |
| `process3` | Yes | Numbers allowed |
| `My_Jobs` | No | Uppercase and underscores not allowed |
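That rule can be captured in a single regular expression; a hypothetical validator sketch (the CLI's actual check may differ in detail):

```typescript
// Queue keys: lowercase letters, numbers, and hyphens only.
const QUEUE_KEY_PATTERN = /^[a-z0-9-]+$/

function isValidQueueKey(key: string): boolean {
    return QUEUE_KEY_PATTERN.test(key)
}
```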
Sending Messages
Use queue bindings in any custom route via `c.env`:
```typescript
// Send a raw payload
export async function POST(c: Context): Promise<Response> {
    const payload = await c.req.json()
    await c.env.ENRICHMENT_QUEUE.send(payload)
    return c.json({ ok: true })
}
```

```typescript
// Wrap the payload with message metadata
export async function POST(c: Context): Promise<Response> {
    const payload = await c.req.json()
    await c.env.ENRICHMENT_QUEUE.send({
        type: 'enrich-word',
        queuedAt: new Date().toISOString(),
        payload,
    })
    return c.json({ ok: true })
}
```

```typescript
// Include the authenticated user in the message
export async function POST(c: Context): Promise<Response> {
    const playcademyUser = c.get('playcademyUser')
    if (!playcademyUser) {
        return c.json({ error: 'Not authenticated' }, 401)
    }
    await c.env.ENRICHMENT_QUEUE.send({
        userId: playcademyUser.sub,
        action: 'process-submission',
        submittedAt: new Date().toISOString(),
    })
    return c.json({ ok: true, status: 'queued' })
}
```

Consuming Messages
Handler Convention
Create handler files in `server/queue/{queueKey}.ts`. Each file must default-export an object with a `queue` method:
```typescript
export default {
    async queue(batch: MessageBatch<unknown>, env: PlaycademyEnv): Promise<void> {
        for (const message of batch.messages) {
            try {
                const data = message.body as { type: string; payload: unknown }
                // ... process message
                message.ack()
            } catch {
                message.retry()
            }
        }
    },
} satisfies QueueHandler
```

Handler Validation
The CLI validates that every queue declared in your config has a matching handler file, and every handler file has a matching config entry. Mismatches cause build errors:
- Missing handler: `Missing queue handler files for: enrichment. Add files under server/queue/{name}.ts`
- Missing config: `Queue handlers exist without config entries: enrichment. Declare them in integrations.queues`
Message Lifecycle
Each message in a batch exposes two methods:
| Method | Description |
|---|---|
| `message.ack()` | Acknowledge successful processing (removes from queue) |
| `message.retry()` | Request redelivery (up to `maxRetries` attempts) |
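Since `message.attempts` reports the delivery count, a handler can also give up on a poison message explicitly instead of retrying until `maxRetries` is exhausted. A sketch with simplified types; the `MAX_ATTEMPTS` threshold and `handleFailure` helper are illustrative, not platform APIs:

```typescript
interface RetryableMessage {
    attempts: number // delivery attempts so far
    ack(): void
    retry(): void
}

const MAX_ATTEMPTS = 3 // illustrative threshold

// On failure, retry transient errors but stop redelivering poison messages.
function handleFailure(message: RetryableMessage): void {
    if (message.attempts >= MAX_ATTEMPTS) {
        console.warn(`giving up after ${message.attempts} attempts`)
        message.ack() // drop it rather than retrying forever
    } else {
        message.retry()
    }
}
```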
Batch Processing
Messages are delivered in batches. Process each message individually and call `ack()` or `retry()` per message to avoid reprocessing the entire batch on partial failures.
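That per-message pattern can be factored into a small helper; `processEach` below is our own name and the types are simplified stand-ins, not a platform API:

```typescript
interface BatchMessage<T> {
    body: T
    ack(): void
    retry(): void
}
interface Batch<T> {
    messages: BatchMessage<T>[]
}

// Process each message independently: ack successes, retry failures,
// so one bad message never forces redelivery of the whole batch.
async function processEach<T>(
    batch: Batch<T>,
    process: (body: T) => Promise<void>,
): Promise<void> {
    for (const message of batch.messages) {
        try {
            await process(message.body)
            message.ack()
        } catch {
            message.retry()
        }
    }
}
```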
Accessing Message Data
```typescript
interface NotificationPayload {
    userId: string
    type: 'achievement' | 'reminder'
    message: string
}

export default {
    async queue(batch: MessageBatch<unknown>): Promise<void> {
        for (const message of batch.messages) {
            const { userId, type, message: text } = message.body as NotificationPayload
            // message.id — unique message identifier
            // message.timestamp — when the message was enqueued
            // message.attempts — number of delivery attempts
            message.ack()
        }
    },
} satisfies QueueHandler
```

Batching
In production, messages are not delivered instantly. Cloudflare accumulates messages into batches and delivers them to your handler when either threshold is reached — whichever comes first:
| Threshold | Default | What happens |
|---|---|---|
| `maxBatchSize` | 10 | Batch delivers as soon as this many messages queue up |
| `maxBatchTimeout` | 5s | Batch delivers after this long, even if partially full |
For example, with defaults: if you send 10 messages quickly they deliver immediately as a full batch. If you send 1 message and stop, it delivers after ~5 seconds in a batch of one.
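The "whichever comes first" rule can be sketched as a tiny model: given the arrival times of queued messages, when does the first batch deliver? This is our own illustration of the two thresholds, not Cloudflare's implementation:

```typescript
// When does the first batch deliver? Either when maxBatchSize messages
// have arrived, or maxBatchTimeout seconds after the first message,
// whichever comes first.
function firstBatchDeliveryTime(
    arrivalTimes: number[], // seconds, ascending
    maxBatchSize: number,
    maxBatchTimeout: number,
): number {
    const timeoutAt = arrivalTimes[0] + maxBatchTimeout
    if (arrivalTimes.length >= maxBatchSize) {
        // Full batch: delivers when the Nth message arrives, if that is sooner.
        return Math.min(arrivalTimes[maxBatchSize - 1], timeoutAt)
    }
    return timeoutAt
}
```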
Local Queue Behavior
In local development, queue messages are delivered immediately to your handlers — no batching delay and no size threshold.
Tune both thresholds to match your workload:
```typescript
export default {
    name: 'My Game',
    integrations: {
        queues: {
            enrichment: {
                maxBatchSize: 5, // smaller batches, more frequent delivery
                maxBatchTimeout: 1, // deliver partial batches after 1 second
            },
        },
    },
}
```

Latency vs Throughput
Lower thresholds mean faster delivery but more frequent, smaller batches. For high-throughput workloads, larger values are more efficient because they amortize per-batch overhead across more messages.
What's Next?
Custom Routes
Build API routes that enqueue work to your queues.
Deployment Guide
Learn how queue infrastructure is provisioned during deployment.
Cloudflare Queues Documentation
Deep dive into Cloudflare Queues' full capabilities and limits.
CLI Commands
Complete reference for all Playcademy CLI commands.
