I recently faced a challenge in one of my projects: processing thousands of image conversions without blocking user requests. The solution? A distributed task queue. After testing various tools, I discovered BullMQ with Redis offers exceptional performance for background job processing. Today I’ll share how to build this system using TypeScript for robust type safety. Follow along to transform how you handle asynchronous tasks.
First, why choose BullMQ? It outperforms alternatives with its Redis foundation, offering superior speed and horizontal scaling. Unlike MongoDB-based solutions, BullMQ handles job priorities and retries more effectively. Its TypeScript-native design ensures better developer experience too. Have you considered how job queues could simplify your architecture?
Let’s set up our environment. Start a new TypeScript project and install essentials:
npm init -y
npm install bullmq ioredis
npm install @types/node typescript tsx --save-dev
Configure TypeScript with strict type checking:
// tsconfig.json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "outDir": "./dist"
  }
}
For Redis, use Docker Compose and bring it up with `docker compose up -d`:
# docker-compose.yml
services:
  redis:
    image: redis:7-alpine
    ports: ["6379:6379"]
Now the core implementation. Define queue configurations first:
// src/queue.config.ts
import type { JobsOptions } from 'bullmq';

export const redisConfig = {
  host: process.env.REDIS_HOST || 'localhost',
  port: parseInt(process.env.REDIS_PORT || '6379', 10)
};

export const jobOptions: JobsOptions = {
  attempts: 3, // one initial attempt plus up to two retries
  backoff: { type: 'exponential', delay: 2000 }
};
Create a queue manager class:
// src/QueueManager.ts
import { Queue } from 'bullmq';
import { redisConfig, jobOptions } from './queue.config';

export class QueueManager {
  private queues = new Map<string, Queue>();

  createQueue(name: string): Queue {
    const queue = new Queue(name, {
      connection: redisConfig,
      defaultJobOptions: jobOptions
    });
    this.queues.set(name, queue);
    return queue;
  }

  async addJob<T>(queueName: string, data: T): Promise<void> {
    const queue = this.queues.get(queueName);
    if (!queue) throw new Error(`Queue ${queueName} has not been created`);
    await queue.add('process', data);
  }
}
Now implement a worker for processing jobs:
// src/email.worker.ts
import { Worker } from 'bullmq';
import { redisConfig } from './queue.config';

const worker = new Worker('email-queue', async job => {
  const { recipient, content } = job.data;
  // Simulate email sending
  console.log(`Sending email to ${recipient}`);
  await new Promise(resolve => setTimeout(resolve, 1000));
  return { success: true };
}, { connection: redisConfig, concurrency: 5 });

worker.on('completed', job => {
  console.log(`Job ${job.id} completed`);
});
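Workers should also shut down cleanly on deploys: stop accepting new jobs and let in-flight jobs finish. A minimal sketch of that pattern, where `Closable` is my stand-in interface for BullMQ's Worker and Queue (both expose a `close()` that resolves once active work is done):

```typescript
// Hedged sketch: `Closable` stands in for BullMQ's Worker/Queue types.
interface Closable {
  close(): Promise<void>;
}

// Close every resource in parallel; worker.close() waits for active
// jobs to complete before resolving.
async function shutdownGracefully(...resources: Closable[]): Promise<void> {
  await Promise.all(resources.map(r => r.close()));
}

// In the real worker entry point, wire it to process signals, e.g.:
// process.on('SIGTERM', () => shutdownGracefully(worker).then(() => process.exit(0)));
```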
For advanced scenarios, implement priority handling:
// High-priority job example
await queue.add('urgent-email', payload, {
  priority: 1, // lower number = higher priority; 1 is the highest
  delay: 5000  // process no sooner than 5 seconds from now
});
What happens when jobs fail? BullMQ automatically retries based on your configuration. For monitoring, I recommend the Bull Board UI (install `@bull-board/api` and `@bull-board/express` alongside Express):
// src/monitor.ts
import express from 'express';
import { Queue } from 'bullmq';
import { createBullBoard } from '@bull-board/api';
import { BullMQAdapter } from '@bull-board/api/bullMQAdapter';
import { ExpressAdapter } from '@bull-board/express';
import { redisConfig } from './queue.config';

const app = express();
const emailQueue = new Queue('email-queue', { connection: redisConfig });

const serverAdapter = new ExpressAdapter();
serverAdapter.setBasePath('/queues');

createBullBoard({
  queues: [new BullMQAdapter(emailQueue)],
  serverAdapter
});

app.use('/queues', serverAdapter.getRouter());
app.listen(3000);
In production, deploy multiple workers across instances. Use process managers like PM2:
pm2 start dist/email.worker.js -i 4 --name "email_worker"
Common pitfalls? Always validate job data before processing and implement proper connection error handling. Remember to drain queues gracefully during shutdowns. How might you handle sudden Redis disconnections?
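On the validation point: since job data crosses the Redis boundary as JSON, treat it as untrusted input in the worker. A minimal type-guard sketch, assuming the `{ recipient, content }` shape used by the email worker above (the guard name is illustrative):

```typescript
// Shape we expect email jobs to carry (mirrors the worker's job.data).
interface EmailJobData {
  recipient: string;
  content: string;
}

// Runtime type guard: narrows unknown job data before processing.
function isEmailJobData(data: unknown): data is EmailJobData {
  if (typeof data !== 'object' || data === null) return false;
  const d = data as Record<string, unknown>;
  return typeof d.recipient === 'string' &&
         d.recipient.includes('@') &&
         typeof d.content === 'string';
}

console.log(isEmailJobData({ recipient: 'a@b.com', content: 'hi' })); // true
console.log(isEmailJobData({ recipient: 42 })); // false
```

Inside the worker, a failed guard should throw so the job is marked failed and retried (or routed to a dead-letter queue) rather than silently half-processed.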
I’ve used this pattern to process over 50,000 daily jobs with consistent performance. The combination of BullMQ’s reliability and TypeScript’s type safety significantly reduced our error rates. What background tasks could you offload to queues?
If you found this guide helpful, share it with your team or colleagues working on performance optimization. Have questions or additional tips? Leave a comment below - I’d love to hear about your queue implementations!