Everyone knows Redis as a cache. Stuff data in, get it out fast. But Redis can do way more.
I've used Redis for job queues, rate limiting, leaderboards, session storage, real-time messaging, and distributed locks. Let's explore the patterns that actually matter.
Redis Data Structures Overview
1. Pub/Sub for Real-Time
When you need to broadcast messages across servers:
import { createClient } from 'redis';
const publisher = createClient();
const subscriber = createClient();
await publisher.connect();
await subscriber.connect();
// Subscribe to channel
await subscriber.subscribe('notifications', (message, channel) => {
console.log(`Received on ${channel}: ${message}`);
});
// Publish from anywhere
await publisher.publish('notifications', JSON.stringify({
type: 'order_created',
orderId: '12345',
userId: 'user_abc'
}));
Use cases:
- WebSocket scaling (see my WebSocket post)
- Cache invalidation across servers (example below)
- Real-time notifications
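To make the cache-invalidation case concrete, here's a rough sketch - the channel name and the in-memory Map are just illustrative:
const localCache = new Map<string, unknown>();
// Every server evicts its local copy when any server publishes a change
await subscriber.subscribe('cache-invalidation', (key) => {
  localCache.delete(key);
});
// After updating a user anywhere in the cluster:
await publisher.publish('cache-invalidation', 'user:123');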
Limitation: Messages are fire-and-forget. If no one is listening, the message is lost.
2. Sorted Sets for Leaderboards
Sorted sets keep elements ranked by score. Perfect for leaderboards:
const redis = createClient();
await redis.connect();
// Add scores
await redis.zAdd('leaderboard', [
{ score: 1500, value: 'player1' },
{ score: 2300, value: 'player2' },
{ score: 1800, value: 'player3' },
]);
// Get top 10
const top10 = await redis.zRange('leaderboard', 0, 9, { REV: true });
// ['player2', 'player3', 'player1']
// Get player rank (0-indexed)
const rank = await redis.zRevRank('leaderboard', 'player1');
// 2 (third place)
// Get player score
const score = await redis.zScore('leaderboard', 'player1');
// 1500
// Increment score
await redis.zIncrBy('leaderboard', 100, 'player1');
// player1 now has 1600 points
Use cases:
- Gaming leaderboards
- Top posts/content
- Rate limiting windows
- Priority queues (sketched below)
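The priority queue case is worth a quick sketch since it doesn't get its own section - lower score means higher priority, so zPopMin pops the most urgent job first (the key name and payloads here are made up):
// Enqueue jobs with a priority as the score
await redis.zAdd('jobs:priority', [
  { score: 1, value: JSON.stringify({ task: 'send-invoice' }) }, // urgent
  { score: 5, value: JSON.stringify({ task: 'recalc-stats' }) }, // can wait
]);
// Dequeue the highest-priority job
const next = await redis.zPopMin('jobs:priority');
if (next) {
  const job = JSON.parse(next.value); // { task: 'send-invoice' }
  // ...process the job
}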
3. Rate Limiting with Sliding Window
Fixed windows have an edge case: a client can burst up to twice the limit right around a window boundary. A sliding window avoids that:
async function isRateLimited(
userId: string,
limit: number,
windowSeconds: number
): Promise<boolean> {
const key = `ratelimit:${userId}`;
const now = Date.now();
const windowStart = now - (windowSeconds * 1000);
// Remove old entries
await redis.zRemRangeByScore(key, 0, windowStart);
// Count requests in window
const count = await redis.zCard(key);
if (count >= limit) {
return true; // Rate limited
}
// Add new request (random suffix avoids collisions when two requests land in the same millisecond)
await redis.zAdd(key, { score: now, value: `${now}:${crypto.randomUUID()}` });
await redis.expire(key, windowSeconds);
return false;
}
// Usage: 100 requests per minute
if (await isRateLimited(userId, 100, 60)) {
throw new Error('Too many requests');
}
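One caveat: those commands run as separate round trips, so two concurrent requests can both pass the count check. If that matters for your traffic, batch everything in a MULTI so the block executes atomically. The sketch below (the function name is mine, not a library API) records the request first and then checks count > limit, which means rejected requests still land in the window:
async function isRateLimitedAtomic(
  userId: string,
  limit: number,
  windowSeconds: number
): Promise<boolean> {
  const key = `ratelimit:${userId}`;
  const now = Date.now();
  const windowStart = now - windowSeconds * 1000;
  // All queued commands execute as one atomic block
  const replies = await redis
    .multi()
    .zRemRangeByScore(key, 0, windowStart)
    .zAdd(key, { score: now, value: `${now}:${crypto.randomUUID()}` })
    .zCard(key)
    .expire(key, windowSeconds)
    .exec();
  const count = Number(replies[2]); // reply from zCard
  return count > limit;
}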
4. Distributed Locks
When only one process should run something:
async function acquireLock(
lockName: string,
ttlSeconds: number
): Promise<string | null> {
const lockId = crypto.randomUUID();
const key = `lock:${lockName}`;
// SET NX = only set if not exists
const acquired = await redis.set(key, lockId, {
NX: true,
EX: ttlSeconds
});
return acquired ? lockId : null;
}
async function releaseLock(lockName: string, lockId: string): Promise<boolean> {
const key = `lock:${lockName}`;
// Only delete if we own the lock (Lua script for atomicity)
const script = `
if redis.call("get", KEYS[1]) == ARGV[1] then
return redis.call("del", KEYS[1])
else
return 0
end
`;
const result = await redis.eval(script, {
keys: [key],
arguments: [lockId]
});
return result === 1;
}
// Usage
const lockId = await acquireLock('process-orders', 30);
if (lockId) {
try {
await processOrders();
} finally {
await releaseLock('process-orders', lockId);
}
} else {
console.log('Another process is already running');
}
Important: Always use TTL to prevent deadlocks if a process crashes.
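If the caller should wait for the lock instead of bailing out, a small retry wrapper on top of the helpers above does the job - withLock, the retry count, and the delay are all illustrative, not from any library:
async function withLock<T>(
  lockName: string,
  ttlSeconds: number,
  fn: () => Promise<T>,
  retries = 10,
  delayMs = 500
): Promise<T> {
  for (let attempt = 0; attempt < retries; attempt++) {
    const lockId = await acquireLock(lockName, ttlSeconds);
    if (lockId) {
      try {
        return await fn();
      } finally {
        await releaseLock(lockName, lockId);
      }
    }
    // Someone else holds the lock; wait and try again
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`Could not acquire lock: ${lockName}`);
}
// Usage
await withLock('process-orders', 30, processOrders);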
5. Session Storage
Redis is great for sessions - fast, shared across servers, auto-expiring:
interface Session {
userId: string;
email: string;
roles: string[];
createdAt: number;
}
async function createSession(sessionId: string, data: Session): Promise<void> {
await redis.hSet(`session:${sessionId}`, {
userId: data.userId,
email: data.email,
roles: JSON.stringify(data.roles),
createdAt: data.createdAt.toString()
});
await redis.expire(`session:${sessionId}`, 86400); // 24 hours
}
async function getSession(sessionId: string): Promise<Session | null> {
const data = await redis.hGetAll(`session:${sessionId}`);
if (!data.userId) return null;
return {
userId: data.userId,
email: data.email,
roles: JSON.parse(data.roles),
createdAt: parseInt(data.createdAt)
};
}
async function refreshSession(sessionId: string): Promise<void> {
await redis.expire(`session:${sessionId}`, 86400);
}
6. Streams for Event Sourcing
Redis Streams are like Kafka-lite. Great for event logs:
// Produce events
await redis.xAdd('orders', '*', {
event: 'order_created',
orderId: '12345',
userId: 'user_abc',
total: '99.99'
});
// Consume events (blocking read)
const events = await redis.xRead(
{ key: 'orders', id: '$' }, // $ = only new messages
{ BLOCK: 5000, COUNT: 10 }
);
// Consumer groups for distributed processing
await redis.xGroupCreate('orders', 'notification-workers', '0', { MKSTREAM: true });
// Read as part of consumer group
const messages = await redis.xReadGroup(
'notification-workers',
'worker-1',
{ key: 'orders', id: '>' },
{ COUNT: 10 }
);
// Acknowledge each processed message (xReadGroup returns null when there is nothing to read)
if (messages) {
  for (const { id, message } of messages[0].messages) {
    // ...handle `message`, then ack so it leaves the pending list
    await redis.xAck('orders', 'notification-workers', id);
  }
}
When to use Streams vs Pub/Sub:
- Pub/Sub: Fire and forget, no persistence
- Streams: Need message history, consumer groups, acknowledgment
7. Caching Patterns
Since we're talking Redis, let's cover caching patterns too:
// Cache-aside pattern
async function getUser(userId: string): Promise<User> {
const cached = await redis.get(`user:${userId}`);
if (cached) {
return JSON.parse(cached);
}
const user = await db.users.findById(userId);
await redis.set(`user:${userId}`, JSON.stringify(user), { EX: 3600 });
return user;
}
// Write-through pattern
async function updateUser(userId: string, data: Partial<User>): Promise<User> {
const user = await db.users.update(userId, data);
await redis.set(`user:${userId}`, JSON.stringify(user), { EX: 3600 });
return user;
}
// Cache invalidation
async function invalidateUser(userId: string): Promise<void> {
await redis.del(`user:${userId}`);
}
Quick Reference
| Pattern | Data Structure | Use Case |
|---------|----------------|----------|
| Caching | Strings/Hashes | Speeding up reads |
| Rate Limiting | Sorted Sets | API throttling |
| Leaderboards | Sorted Sets | Rankings, top-N |
| Real-time | Pub/Sub | WebSocket scaling |
| Event Logs | Streams | Event sourcing |
| Sessions | Hashes | User sessions |
| Locks | Strings + NX | Distributed locking |
| Queues | Lists | Simple job queues |
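Queues are the one row in that table without their own section. A minimal sketch with LPUSH/BRPOP - the queue name and payload are made up, and blocking reads are best done on a dedicated connection so they don't stall other commands:
// Producer: push jobs onto the left of the list
await redis.lPush('jobs:email', JSON.stringify({ to: 'user@example.com', template: 'welcome' }));
// Consumer: blocking pop from the right gives FIFO order
const worker = redis.duplicate();
await worker.connect();
while (true) {
  const item = await worker.brPop('jobs:email', 0); // 0 = block until a job arrives
  if (item) {
    const job = JSON.parse(item.element);
    // ...process the job
  }
}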
Further Reading
- Redis University - Free courses
- Redis Best Practices
- ioredis - Feature-rich Redis client for Node.js
Redis is a Swiss Army knife. Most developers only use the knife blade (caching). Learn the other tools and you'll solve problems faster with less infrastructure.
