Redis with Node.js — Caching and Pub/Sub
Redis is an in-memory data structure store known for very low latency, making it a common choice for caching, sessions, and real-time messaging.
Setup
npm install redis
# node-redis v4+ bundles its own TypeScript types (@types/redis is deprecated)
# Or use ioredis
npm install ioredis
Basic Connection
import { createClient } from "redis";
const client = createClient({
// node-redis v4 takes host/port under `socket`, not at the top level
socket: {
host: "localhost",
port: 6379,
},
password: process.env.REDIS_PASSWORD,
});
await client.connect();
// Simple commands
await client.set("key", "value");
const value = await client.get("key");
await client.del("key");
// With expiration
await client.setEx("session", 3600, JSON.stringify({ userId: 1 }));
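Redis stores values as strings, so JSON payloads must be serialized on write and parsed back on read. A small helper pair keeps that logic in one place. This is a sketch: the `encode`/`decode`/`setJSON`/`getJSON` names and the minimal `RedisLike` interface are illustrative, not part of node-redis.

```typescript
// Illustrative JSON helpers (names are not part of node-redis).
// A minimal interface so the helpers work with any compatible client.
interface RedisLike {
  setEx(key: string, seconds: number, value: string): Promise<unknown>;
  get(key: string): Promise<string | null>;
}

function encode(value: unknown): string {
  return JSON.stringify(value);
}

function decode<T>(raw: string | null): T | null {
  // GET returns null for a missing key; pass that through unchanged
  return raw === null ? null : (JSON.parse(raw) as T);
}

async function setJSON(client: RedisLike, key: string, value: unknown, ttlSeconds: number) {
  await client.setEx(key, ttlSeconds, encode(value));
}

async function getJSON<T>(client: RedisLike, key: string): Promise<T | null> {
  return decode<T>(await client.get(key));
}
```

With the client above, `await setJSON(client, "session", { userId: 1 }, 3600)` replaces the manual `JSON.stringify` call.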
Caching Pattern
async function getCachedUser(userId: number) {
// Check cache first
const cached = await client.get(`user:${userId}`);
if (cached) {
return JSON.parse(cached);
}
// Fetch from database
const user = await db.users.findById(userId);
// Store in cache for 1 hour
await client.setEx(
`user:${userId}`,
3600,
JSON.stringify(user)
);
return user;
}
// Invalidate cache
async function updateUser(userId: number, data: any) {
const updated = await db.users.update(userId, data);
await client.del(`user:${userId}`);
return updated;
}
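The check-then-fetch-then-store sequence repeats for every cached entity, so it can be factored into a reusable wrapper. This is a sketch under stated assumptions: `withCache` and `CacheStore` are illustrative names, and the interface covers only the two commands the pattern needs, which also makes it easy to test against an in-memory fake.

```typescript
// Illustrative cache-aside wrapper (not a node-redis API).
interface CacheStore {
  get(key: string): Promise<string | null>;
  setEx(key: string, seconds: number, value: string): Promise<unknown>;
}

function withCache<T>(
  store: CacheStore,
  keyFor: (id: number) => string,
  loader: (id: number) => Promise<T>,
  ttlSeconds = 3600
) {
  return async (id: number): Promise<T> => {
    const key = keyFor(id);
    // Check cache first
    const cached = await store.get(key);
    if (cached !== null) return JSON.parse(cached) as T;
    // Miss: load, then populate the cache with a TTL
    const fresh = await loader(id);
    await store.setEx(key, ttlSeconds, JSON.stringify(fresh));
    return fresh;
  };
}
```

Hooked up to the client above, `getCachedUser` becomes `const getUser = withCache(client, (id) => `user:${id}`, (id) => db.users.findById(id));`.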
Data Structures
// Strings
await client.set("count", "0");
await client.incr("count"); // 1
await client.incrBy("count", 5); // 6
// Lists
await client.lPush("queue", "job1");
await client.lPush("queue", "job2");
const job = await client.lPop("queue"); // "job2"
// Sets
await client.sAdd("tags", "typescript");
await client.sAdd("tags", "nodejs");
const tags = await client.sMembers("tags");
// Hashes
await client.hSet("user:1", {
name: "Alice",
email: "alice@example.com",
});
const user = await client.hGetAll("user:1");
// Sorted Sets
await client.zAdd("leaderboard", [
{ score: 100, value: "alice" }, // node-redis members use { score, value }
{ score: 90, value: "bob" },
]);
// Top 10, highest score first
const top = await client.zRangeWithScores("leaderboard", 0, 9, {
REV: true,
});
Pub/Sub
// Publisher
async function publishEvent(channel: string, data: any) {
await client.publish(channel, JSON.stringify(data));
}
// Subscriber
const subscriber = client.duplicate();
await subscriber.connect();
await subscriber.subscribe("user:events", (message) => {
console.log("Event:", JSON.parse(message));
});
// Usage
await publishEvent("user:events", {
type: "user:created",
userId: 123,
});
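Messages arrive as plain strings, and a bare `JSON.parse` in the subscriber callback will throw on a malformed payload. A safe-parse helper that validates the shape keeps the subscriber alive; the `UserEvent` type and `parseUserEvent` name here are illustrative assumptions, not part of node-redis.

```typescript
// Illustrative safe-parse helper for pub/sub payloads.
type UserEvent = { type: string; userId: number };

function parseUserEvent(message: string): UserEvent | null {
  try {
    const data = JSON.parse(message);
    // Validate the expected shape before trusting it
    if (typeof data?.type === "string" && typeof data?.userId === "number") {
      return data as UserEvent;
    }
    return null;
  } catch {
    return null; // malformed JSON should not crash the subscriber
  }
}
```

In the subscriber: `const event = parseUserEvent(message); if (event) { /* handle */ }` instead of parsing directly.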
Session Storage
import session from "express-session";
import RedisStore from "connect-redis";
const sessionStore = new RedisStore({ client });
app.use(
session({
store: sessionStore,
secret: "your-secret",
resave: false,
saveUninitialized: false,
cookie: {
secure: true,
httpOnly: true,
maxAge: 1000 * 60 * 60 * 24, // 24 hours
},
})
);
Rate Limiting with Redis
async function checkRateLimit(userId: string, limit = 100, window = 3600) {
const key = `rate:${userId}`;
const current = await client.incr(key);
if (current === 1) {
await client.expire(key, window);
}
return current <= limit;
}
// Express middleware
const rateLimitMiddleware = async (req: any, res: any, next: any) => {
const allowed = await checkRateLimit(req.userId);
if (!allowed) {
return res.status(429).json({ error: "Rate limited" });
}
next();
};
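The fixed-window counter above allows bursts at window boundaries (up to 2x the limit across two adjacent windows). A sliding window fixes this by tracking individual request timestamps; with Redis this maps to ZADD (record the timestamp), ZREMRANGEBYSCORE (prune entries older than the window), and ZCARD (count what remains) on a per-user sorted set. The decision logic, sketched here as a pure function with an illustrative name:

```typescript
// Illustrative sliding-window check over in-memory timestamps.
// With Redis, the same steps map to ZADD, ZREMRANGEBYSCORE, and ZCARD.
function slidingWindowAllowed(
  timestamps: number[], // prior request times, in ms
  now: number,
  limit: number,
  windowMs: number
): { allowed: boolean; timestamps: number[] } {
  // Prune entries that have fallen out of the window
  const recent = timestamps.filter((t) => t > now - windowMs);
  if (recent.length >= limit) return { allowed: false, timestamps: recent };
  // Record this request and allow it
  return { allowed: true, timestamps: [...recent, now] };
}
```

The trade-off is memory: one sorted-set entry per request instead of a single counter, so it suits lower limits or shorter windows.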
Job Queue Pattern
// Producer
async function addJob(jobType: string, data: any) {
await client.lPush(
"jobs",
JSON.stringify({ type: jobType, data, createdAt: new Date() })
);
}
// Consumer
async function processJobs() {
while (true) {
const job = await client.rPop("jobs");
if (!job) {
await new Promise((resolve) => setTimeout(resolve, 1000));
continue;
}
const parsed = JSON.parse(job);
console.log(`Processing ${parsed.type}`);
// Handle job
}
}
processJobs().catch(console.error);
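The consumer above sleeps a fixed second between empty polls. When the queue stays empty, a capped exponential backoff cuts wasted round trips; alternatively, `brPop` blocks on the server until a job arrives (in node-redis v4, blocking commands are best run on a dedicated connection, e.g. one from `client.duplicate()`). The `nextDelayMs` helper below is an illustrative sketch for the backoff variant.

```typescript
// Illustrative capped exponential backoff for the empty-queue sleep.
function nextDelayMs(emptyPolls: number, baseMs = 250, capMs = 5000): number {
  // 250, 500, 1000, 2000, 4000, then capped at 5000
  return Math.min(baseMs * 2 ** emptyPolls, capMs);
}
```

In the loop, track consecutive empty polls, sleep `nextDelayMs(emptyPolls)`, and reset the counter to zero whenever a job is found.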
Transactions
// Pipeline (batch operations)
const pipeline = client.multi();
pipeline.set("key1", "value1");
pipeline.set("key2", "value2");
pipeline.get("key1");
const results = await pipeline.exec();
// WATCH for optimistic locking: if "balance" changes before exec(),
// node-redis v4 aborts the transaction and exec() rejects with a WatchError,
// so callers should catch it and retry. On a shared client, run WATCH/MULTI
// inside client.executeIsolated() to get a dedicated connection.
await client.watch("balance");
const balance = await client.get("balance");
const newBalance = parseInt(balance ?? "0", 10) + 100;
const transaction = client.multi();
transaction.set("balance", newBalance.toString());
const result = await transaction.exec(); // rejects with WatchError on conflict
Connection Management
const client = createClient({
socket: {
host: "localhost",
port: 6379,
reconnectStrategy: (retries) => {
if (retries > 10) {
return new Error("Max retries exceeded");
}
return retries * 100;
},
},
});
client.on("error", (err) => console.error("Redis error:", err));
client.on("connect", () => console.log("Redis connected"));
client.on("end", () => console.log("Redis disconnected")); // node-redis emits "end", not "disconnect"
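The inline reconnect strategy above can be factored into a named function, which makes the backoff easy to test in isolation; the `reconnectDelay` name is an illustrative assumption.

```typescript
// Illustrative: the linear backoff from the config above, as a named function.
// Returning an Error tells node-redis to stop reconnecting.
function reconnectDelay(retries: number, stepMs = 100, maxRetries = 10): number | Error {
  if (retries > maxRetries) return new Error("Max retries exceeded");
  return retries * stepMs; // 0ms, 100ms, 200ms, ... then give up
}
```

Wire it in with `reconnectStrategy: reconnectDelay`. For shutdown, `client.quit()` waits for pending replies before closing, so a `process.on("SIGTERM", () => client.quit())` handler pairs well with it.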
Best Practices
- Use appropriate data structures: sets for membership, hashes for objects
- Set TTLs to prevent memory bloat
- Use pipelines to batch operations for efficiency
- Monitor memory: use INFO memory to check usage
- Implement retry logic to handle connection failures gracefully
FAQ
Q: Can I use Redis as a primary database? A: It has persistence options (RDB snapshots and AOF), but most applications treat it as a cache or session store alongside a durable primary database.
Q: What's the maximum size? A: Limited by available RAM; production instances commonly run from a few gigabytes to tens of gigabytes.
Q: Should I use Redis for everything? A: No. Cache hot data; operations that are already fast or rarely repeated won't benefit.
Redis is essential for high-performance applications. Master it for caching, sessions, and real-time messaging.