Serverless Computing Guide 2026: AWS Lambda, Cloudflare Workers, and Vercel Edge
Serverless 2026: Functions Everywhere
Serverless means no servers for you to manage: your code runs in response to events, scales to zero when idle and to millions of requests under load, and you pay only for what you use.
- AWS Lambda: Event-Driven Functions
- Serverless Framework Setup
- Cloudflare Workers: Edge Computing
- Cold Starts: The Serverless Trade-off
- When to Use Serverless
- Serverless Cost Comparison
AWS Lambda: Event-Driven Functions
// handler.ts — Typed Lambda handler
// getUsers, createUser, processMessage, processS3Object, and runDailyReport
// are app-specific helpers assumed to be defined elsewhere
import type { Handler, APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from 'aws-lambda'

interface User {
  id: string
  name: string
  email: string
}

export const handler: Handler<APIGatewayProxyEventV2, APIGatewayProxyResultV2> = async (event) => {
  const headers = {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE',
  }
  try {
    const method = event.requestContext.http.method
    const path = event.requestContext.http.path

    if (method === 'GET' && path === '/users') {
      const users = await getUsers()
      return { statusCode: 200, headers, body: JSON.stringify(users) }
    }
    if (method === 'POST' && path === '/users') {
      const body = JSON.parse(event.body || '{}')
      const user = await createUser(body)
      return { statusCode: 201, headers, body: JSON.stringify(user) }
    }
    return { statusCode: 404, headers, body: JSON.stringify({ error: 'Not found' }) }
  } catch (error) {
    console.error('Lambda error:', error)
    return { statusCode: 500, headers, body: JSON.stringify({ error: 'Internal error' }) }
  }
}

// Event handlers — Lambda triggers beyond HTTP
export const sqsHandler: Handler = async (event) => {
  for (const record of event.Records) {
    const message = JSON.parse(record.body)
    await processMessage(message)
  }
}

export const s3Handler: Handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name
    const key = record.s3.object.key
    await processS3Object(bucket, key)
  }
}

export const scheduledHandler: Handler = async () => {
  // Runs on a schedule (EventBridge)
  await runDailyReport()
}
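Because handlers are plain async functions, you can invoke them locally with a hand-built event before deploying anything. A minimal sketch — the types are trimmed to only the fields the routing logic reads, and the stand-in handler and its inline data are illustrative, not the full handler above:

```typescript
// Local-invoke sketch: build a fake API Gateway v2 event and call the
// handler directly — no AWS account, no emulator.
type MiniEvent = {
  requestContext: { http: { method: string; path: string } }
  body?: string
}
type MiniResult = { statusCode: number; body: string }

// Stand-in handler with the same routing shape as the real one
const handler = async (event: MiniEvent): Promise<MiniResult> => {
  const { method, path } = event.requestContext.http
  if (method === 'GET' && path === '/users') {
    return { statusCode: 200, body: JSON.stringify([{ id: '1', name: 'Ada' }]) }
  }
  return { statusCode: 404, body: JSON.stringify({ error: 'Not found' }) }
}

async function main() {
  const ok = await handler({ requestContext: { http: { method: 'GET', path: '/users' } } })
  console.log(ok.statusCode) // 200
  const miss = await handler({ requestContext: { http: { method: 'GET', path: '/nope' } } })
  console.log(miss.statusCode) // 404
}
main()
```

The same trick works for the SQS and S3 handlers: construct a `{ Records: [...] }` object by hand and assert on the side effects.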
Serverless Framework Setup
# serverless.yml
service: webcoderspeed-api
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs20.x
  region: us-east-1
  memorySize: 512
  timeout: 10
  architecture: arm64 # Graviton: roughly 20% cheaper per GB-second than x86
  environment:
    DATABASE_URL: ${ssm:/webcoderspeed/database-url}
    REDIS_URL: ${ssm:/webcoderspeed/redis-url}
  iam:
    role:
      statements:
        - Effect: Allow
          Action:
            - s3:GetObject
            - s3:PutObject
          Resource: arn:aws:s3:::webcoderspeed-assets/*

functions:
  api:
    handler: dist/handler.handler
    events:
      - httpApi:
          path: /{proxy+}
          method: ANY
    reservedConcurrency: 100 # Max 100 concurrent executions
  processQueue:
    handler: dist/queue.sqsHandler
    events:
      - sqs:
          arn: !GetAtt ProcessingQueue.Arn
          batchSize: 10
  dailyReport:
    handler: dist/cron.scheduledHandler
    events:
      - schedule: cron(0 9 * * ? *) # 9 AM UTC daily

plugins:
  - serverless-offline # Local development
  - serverless-esbuild # Fast TypeScript bundling
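One caveat with `batchSize: 10`: by default, one bad message fails the whole batch and all ten return to the queue. Lambda's partial-batch response avoids the re-processing. A sketch, assuming the SQS event source is configured to report batch item failures (check the framework docs for the exact key); `processMessage` here is a hypothetical stand-in:

```typescript
// Partial-batch failure sketch: only failed messages go back to the queue.
// Assumes the SQS event source mapping enables ReportBatchItemFailures.
interface SQSRecord { messageId: string; body: string }
interface SQSEvent { Records: SQSRecord[] }

// Hypothetical worker: rejects messages flagged as bad
async function processMessage(msg: { bad?: boolean }): Promise<void> {
  if (msg.bad) throw new Error('cannot process')
}

export const sqsHandler = async (event: SQSEvent) => {
  const batchItemFailures: { itemIdentifier: string }[] = []
  for (const record of event.Records) {
    try {
      await processMessage(JSON.parse(record.body))
    } catch {
      // Report only this message; successfully processed ones are deleted
      batchItemFailures.push({ itemIdentifier: record.messageId })
    }
  }
  return { batchItemFailures }
}
```

Without this response shape, a single poison message can cause the other nine to be retried repeatedly until they land in a dead-letter queue.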
Cloudflare Workers: Edge Computing
// worker.ts — runs in Cloudflare's 250+ edge locations
// fetchFromOrigin and syncData are app-specific helpers assumed to be defined elsewhere
interface Env {
  KV_STORE: KVNamespace
  API_KEY: string
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url)

    // Geo-personalization
    const country = request.cf?.country
    const city = request.cf?.city

    // Route handling
    if (url.pathname === '/api/geo') {
      return Response.json({
        country,
        city,
        timezone: request.cf?.timezone,
        latitude: request.cf?.latitude,
        longitude: request.cf?.longitude,
      })
    }

    // Cache expensive operations
    if (url.pathname.startsWith('/api/posts')) {
      const cacheKey = new Request(url.toString())
      const cache = caches.default
      const cached = await cache.match(cacheKey)
      if (cached) return cached

      const data = await fetchFromOrigin(url.pathname, env)
      const response = Response.json(data, {
        headers: { 'Cache-Control': 'public, max-age=300' },
      })
      await cache.put(cacheKey, response.clone())
      return response
    }

    // KV storage (globally replicated)
    if (url.pathname === '/api/config') {
      const config = await env.KV_STORE.get('site-config', 'json')
      return Response.json(config)
    }

    return new Response('Not found', { status: 404 })
  },

  // Scheduled trigger (cron)
  async scheduled(controller: ScheduledController, env: Env): Promise<void> {
    console.log('Cron triggered:', controller.cron)
    await syncData(env)
  },
}
# Deploy Cloudflare Worker
npm install -D wrangler
wrangler dev # Local development
wrangler deploy # Publish to Cloudflare's edge
# wrangler.toml
name = "webcoderspeed-api"
main = "worker.ts"
compatibility_date = "2026-03-26"
[[kv_namespaces]]
binding = "KV_STORE"
id = "abc123..."
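The worker's `env.KV_STORE.get('site-config', 'json')` read has a write-side counterpart, `put`, which accepts an `expirationTtl` in seconds. A sketch of the round trip using an in-memory stub with the same `get`/`put` surface, so it runs outside the Workers runtime (the key, value, and TTL are illustrative):

```typescript
// KV-shaped stub: the get/put surface the worker code uses, backed by a
// Map so the round trip runs anywhere (TTL is accepted but not enforced).
interface KVLike {
  get(key: string, type: 'json'): Promise<unknown | null>
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>
}

function memoryKV(): KVLike {
  const store = new Map<string, string>()
  return {
    async get(key, _type) {
      const raw = store.get(key)
      return raw === undefined ? null : JSON.parse(raw)
    },
    async put(key, value, _opts) {
      store.set(key, value)
    },
  }
}

async function demo(kv: KVLike) {
  // Real Workers KV takes expirationTtl (seconds) in the same position
  await kv.put('site-config', JSON.stringify({ theme: 'dark' }), { expirationTtl: 3600 })
  return kv.get('site-config', 'json')
}

demo(memoryKV()).then(cfg => console.log(cfg)) // { theme: 'dark' }
```

The same stub doubles as a test harness for any worker logic that only touches KV through `get` and `put`.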
Cold Starts: The Serverless Trade-off
Cold start = initialization time (runtime boot plus your module's top-level code) before the first request is handled
AWS Lambda (Node.js): ~100-500ms typical cold start
Cloudflare Workers: single-digit ms (V8 isolates, no container to boot)
Vercel Edge Functions: similar; isolate-based edge runtimes avoid container cold starts
Strategies to minimize cold starts:
1. Use arm64 (Graviton) — mainly cheaper per GB-second; startup is comparable to x86
2. Minimize bundle size (tree-shake, bundle with esbuild, exclude dev deps)
3. Use Provisioned Concurrency (keeps N instances initialized; costs more)
4. Prefer lightweight frameworks (Hono over Express)
5. Avoid heavy top-level initialization (create SDK clients lazily)
// Workaround: ping the function on a schedule (EventBridge Scheduler),
// but that keeps only ONE instance warm
// More reliable: Provisioned Concurrency in serverless.yml
functions:
  api:
    provisionedConcurrency: 5 # Always 5 warm instances
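Before paying for Provisioned Concurrency, it's worth measuring how often users actually hit a cold start. Module-scope code runs once per container init, so a flag set there distinguishes the first invocation from warm ones. A sketch (the handler's return shape is illustrative):

```typescript
// Cold-start detection sketch: module scope executes once per container,
// so the first invocation in each container sees coldStart === true.
const containerStartedAt = Date.now()
let invoked = false

export const handler = async () => {
  const coldStart = !invoked
  invoked = true
  return {
    coldStart,
    containerAgeMs: Date.now() - containerStartedAt,
  }
}
```

Log `coldStart` on each invocation and aggregate in CloudWatch; if cold starts are under a few percent of traffic, warming strategies may not be worth the cost.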
When to Use Serverless
✓ Use serverless for:
- Irregular/unpredictable traffic
- Background jobs and queues
- Webhooks
- Event processors
- Cron jobs
- Short-lived tasks
✗ Avoid serverless for:
- Long-running processes (>15min, Lambda limit)
- Stateful applications
- Constant high traffic (EC2 is cheaper)
- WebSocket servers
- Apps requiring persistent connections
Serverless Cost Comparison
| Scenario | EC2 t3.small | Lambda | Cloudflare Workers |
|---|---|---|---|
| 1M requests/month | $15/month | ~$0.20 | Free tier |
| 10M requests/month | $15/month | ~$2.00 | $5/month |
| 100M requests/month | $15/month | ~$20 | $50/month |
The Lambda figures assume short invocations at modest memory; compute time (GB-seconds) is billed on top of the per-request charge. Lambda becomes expensive above roughly 50M requests/month of steady traffic. For constant high traffic, EC2 or containers win; for bursty or unpredictable traffic, Lambda wins.
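Lambda's two billing dimensions are easy to estimate yourself. A back-of-envelope sketch using AWS's published us-east-1 x86 list prices at the time of writing ($0.20 per million requests, ~$0.0000167 per GB-second; arm64 is about 20% less), ignoring the free tier:

```typescript
// Back-of-envelope Lambda cost: per-request charge + compute (GB-seconds).
// Prices are us-east-1 x86 list prices (assumption; check current pricing).
const PRICE_PER_REQUEST = 0.20 / 1_000_000
const PRICE_PER_GB_SECOND = 0.0000166667

function estimateLambdaCost(requests: number, memoryMB: number, avgDurationMs: number): number {
  const gbSeconds = requests * (memoryMB / 1024) * (avgDurationMs / 1000)
  return requests * PRICE_PER_REQUEST + gbSeconds * PRICE_PER_GB_SECOND
}

// 10M requests/month at 512MB, 100ms average duration:
console.log(estimateLambdaCost(10_000_000, 512, 100).toFixed(2)) // "10.33"
```

Note how compute dominates: at 512MB/100ms the GB-second charge is roughly 4x the request charge, so headline "per-request" comparisons only hold for very short, small-memory invocations.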