Edge Functions Architecture — Running Code at 300+ Locations Without a Server

Introduction

Edge functions execute your code at 300+ geographic locations, reducing latency from 100-200ms to 10-50ms. Cloudflare Workers, Vercel Edge Functions, and Lambda@Edge each have different constraints and cost models. This post covers edge architecture, state management at the edge, database choices, and the operational tradeoffs.

Cloudflare Workers vs Vercel Edge vs Lambda@Edge Comparison

Cloudflare Workers execute globally on every request. They run on Cloudflare's network (not AWS), with unlimited invocations. Pricing is straightforward: a flat $5/month covers 10M requests, then $0.50 per additional 1M.

Vercel Edge Functions deploy alongside your Next.js app. They execute in Vercel's global network and integrate tightly with the framework. Pricing is generous for hobby/pro plans, then $0.65 per 1M invocations.

Lambda@Edge runs on AWS CloudFront CDN. It's powerful but cold starts are slower (100-500ms vs Cloudflare's 10ms). Pricing is complex: per-request + data transfer fees.
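To make the flat plan concrete, here is a back-of-the-envelope bill calculator for the Cloudflare numbers above (billing in whole-million increments is my assumption for illustration, not a vendor statement):

```typescript
// Sketch: monthly Workers bill under the flat plan described above.
// $5 covers the first 10M requests; overage billed at $0.50 per 1M,
// rounded up to whole millions (an assumption, not vendor billing rules).
function workersMonthlyCost(requests: number): number {
  const included = 10_000_000;
  const overage = Math.max(0, requests - included);
  return 5 + Math.ceil(overage / 1_000_000) * 0.5;
}
```

At 25M requests/month that works out to $12.50; the same volume at $0.65 per 1M would be $16.25.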

// Feature comparison table
const platforms = {
  cloudflare: {
    coldStartMs: 10,
    cpuMs: 50,
    memoryMB: 128,
    timeout: '30 seconds',
    durableObjects: 'Yes',
    price: 'flat $5/month',
  },
  vercelEdge: {
    coldStartMs: 50,
    cpuMs: 100,
    memoryMB: 512,
    timeout: '10 seconds',
    durableObjects: 'No',
    price: '$0.65 per 1M requests',
  },
  lambda: {
    coldStartMs: 100,
    cpuMs: 100,
    memoryMB: 3008,
    timeout: '30 seconds',
    durableObjects: 'No',
    price: '$0.000002 per request + transfer',
  },
};

Choose Cloudflare for simple middleware (auth, redirect, personalization). Choose Vercel Edge for Next.js SSR rendering at the edge. Choose Lambda@Edge for complex logic requiring high memory/CPU.
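That decision tree can be sketched as a function. The inputs and thresholds below are illustrative, derived from the comparison table, not an official sizing guide:

```typescript
// Illustrative platform picker; field names are made up for this sketch.
type Platform = 'cloudflare-workers' | 'vercel-edge' | 'lambda-at-edge';

interface Workload {
  needsDurableObjects: boolean; // strongly consistent edge state
  memoryMB: number;             // peak working set
  nextjsSSR: boolean;           // rendering inside a Next.js app
}

function choosePlatform(w: Workload): Platform {
  if (w.needsDurableObjects) return 'cloudflare-workers'; // only CF offers them
  if (w.memoryMB > 512) return 'lambda-at-edge';          // up to 3008MB
  if (w.nextjsSSR) return 'vercel-edge';                  // framework integration
  return 'cloudflare-workers';                            // default: fastest cold start
}
```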

Edge-Compatible Constraints

Edge runtimes restrict your code to prevent abuse. There is no filesystem, CPU time is capped (roughly 50-100ms depending on platform), and memory is tight (128-512MB on Workers and Vercel Edge).

You cannot:

  • Import fs, net, or child_process modules
  • Make unbounded loops (timeout kills execution)
  • Read from POSIX filesystem
  • Use Node.js APIs that require syscalls

You can:

  • Make HTTP requests (fetch, WebSocket)
  • Use KV storage or Durable Objects (state)
  • Deploy TypeScript, bundled as single file
  • Access request headers, cookies, geolocation

// cloudflare-worker.ts - Production middleware
import { Router } from 'itty-router';

interface Env {
  KV: KVNamespace;
  DURABLE_OBJECT: DurableObjectNamespace;
}

const router = Router();

// Auth middleware
router.all('*', async (request: Request, env: Env) => {
  const token = request.headers.get('Authorization')?.replace('Bearer ', '');

  if (!token) {
    return new Response('Unauthorized', { status: 401 });
  }

  // Validate token with short TTL cache
  const cached = await env.KV.get(`token:${token}`);
  if (cached === 'invalid') {
    return new Response('Unauthorized', { status: 401 });
  }

  if (!cached) {
    const isValid = await validateTokenRemote(token);
    if (!isValid) {
      await env.KV.put(`token:${token}`, 'invalid', { expirationTtl: 300 });
      return new Response('Unauthorized', { status: 401 });
    }
    await env.KV.put(`token:${token}`, 'valid', { expirationTtl: 3600 });
  }
});

// Geolocation routing
router.get('/api/*', async (request: Request) => {
  const country = request.headers.get('cf-ipcountry') || 'US';

  if (country === 'CN') {
    return fetch('https://cdn-china.example.com' + new URL(request.url).pathname);
  }

  return fetch('https://cdn-global.example.com' + new URL(request.url).pathname);
});

// Rate limiting (approximate: KV is eventually consistent, so counts may lag across locations)
router.post('/api/limited', async (request: Request, env: Env) => {
  const ip = request.headers.get('cf-connecting-ip') || 'unknown';
  const key = `ratelimit:${ip}`;

  const current = await env.KV.get(key);
  const count = current ? parseInt(current) : 0;

  if (count >= 100) {
    return new Response('Rate limited', { status: 429 });
  }

  await env.KV.put(key, String(count + 1), { expirationTtl: 60 });

  return new Response('OK');
});

async function validateTokenRemote(token: string): Promise<boolean> {
  const response = await fetch('https://auth.example.com/validate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ token }),
  });

  return response.ok;
}

export default { fetch: router.handle };

KV Storage and Durable Objects for State at Edge

Cloudflare KV stores strings globally with eventual consistency. Perfect for:

  • Session tokens (TTL 1 hour)
  • User preferences (LRU cache)
  • Feature flags (refresh on deploy)

For strongly-consistent state, use Durable Objects. Each object is pinned to one location and handles all requests for that key, guaranteeing serialized access.

// durable-object.ts - Rate limiter state machine
import { Router } from 'itty-router';

interface RateLimitState {
  count: number;
  resetAt: number;
}

export class RateLimiter {
  state: DurableObjectState;
  storage: DurableObjectStorage;
  internal: Map<string, RateLimitState> = new Map();

  constructor(state: DurableObjectState) {
    this.state = state;
    this.storage = state.storage;

    // Restore state from storage on init
    this.state.blockConcurrencyWhile(async () => {
      const stored = await this.storage.get<string>('rate-limits');
      if (stored) {
        this.internal = new Map(JSON.parse(stored));
      }
    });
  }

  async fetch(request: Request): Promise<Response> {
    const router = Router();

    router.post('/check', async (req: Request) => {
      const { identifier, limit, windowSeconds } = await req.json<{
        identifier: string;
        limit: number;
        windowSeconds: number;
      }>();

      const now = Date.now() / 1000;
      const entry = this.internal.get(identifier);

      if (!entry || entry.resetAt < now) {
        // New window
        this.internal.set(identifier, { count: 1, resetAt: now + windowSeconds });
        await this.persist();
        return new Response(JSON.stringify({ allowed: true, remaining: limit - 1 }));
      }

      if (entry.count >= limit) {
        const retryAfter = Math.ceil(entry.resetAt - now);
        return new Response(JSON.stringify({ allowed: false, retryAfter }), { status: 429 });
      }

      entry.count++;
      await this.persist();
      return new Response(JSON.stringify({ allowed: true, remaining: limit - entry.count }));
    });

    return router.handle(request);
  }

  private async persist(): Promise<void> {
    await this.storage.put('rate-limits', JSON.stringify(Array.from(this.internal.entries())));
  }
}


Edge Middleware for Auth and Personalization

Authenticate at the edge before hitting origin. This blocks attackers early and personalizes responses without origin latency.

// vercel-edge-auth.ts - Vercel Edge Function
import { NextRequest, NextResponse } from 'next/server';
import { jwtVerify } from 'jose';

// Fail fast rather than falling back to a guessable default secret
if (!process.env.JWT_SECRET) throw new Error('JWT_SECRET is not set');
const secret = new TextEncoder().encode(process.env.JWT_SECRET);

export async function middleware(request: NextRequest) {
  const pathname = request.nextUrl.pathname;

  // Skip auth for public routes
  if (pathname === '/login' || pathname.startsWith('/public/')) {
    return NextResponse.next();
  }

  // Extract JWT from Authorization header
  const token = request.headers.get('authorization')?.replace('Bearer ', '');

  if (!token) {
    return NextResponse.redirect(new URL('/login', request.url));
  }

  try {
    const verified = await jwtVerify(token, secret);
    const requestHeaders = new Headers(request.headers);

    // Attach user info for origin to read
    requestHeaders.set('x-user-id', String(verified.payload.sub));
    requestHeaders.set('x-user-email', String(verified.payload.email));

    // Personalize cache key for CDN
    requestHeaders.set('x-cache-key-suffix', `user-${verified.payload.sub}`);

    return NextResponse.next({ request: { headers: requestHeaders } });
  } catch (err) {
    return NextResponse.redirect(new URL('/login', request.url));
  }
}

export const config = {
  matcher: ['/((?!public|login).*)'],
};

Geolocation-Based Routing

Route requests to origin based on user location. Cloudflare sets the CF-IPCountry request header; richer fields such as continent, region, and metro code are exposed on the request.cf object rather than as headers.

// geolocation-routing.ts
export async function onRequest(context: {
  request: Request;
  env: { API_ENDPOINT: string };
}): Promise<Response> {
  const { request, env } = context;
  const country = request.headers.get('cf-ipcountry') || 'US';
  // Continent is not a header; it lives on request.cf in the Workers runtime
  const continent = (request as Request & { cf?: { continent?: string } }).cf?.continent || 'NA';

  // Route to regional API
  const regionMap: Record<string, string> = {
    EU: 'https://api-eu.example.com',
    AS: 'https://api-asia.example.com',
    NA: 'https://api-us.example.com',
  };

  const originUrl = regionMap[continent] || 'https://api.example.com';
  const proxyUrl = new URL(originUrl);
  proxyUrl.pathname = new URL(request.url).pathname;
  proxyUrl.search = new URL(request.url).search;

  // Forward request with user location header
  const headers = new Headers(request.headers);
  headers.set('x-user-country', country);
  headers.set('x-user-continent', continent);

  return fetch(new Request(proxyUrl.toString(), {
    method: request.method,
    headers,
    body: request.body,
  }));
}

Edge-Compatible Databases (Turso, D1, Neon Serverless)

Traditional databases expect persistent connections, but an edge function lives for milliseconds. Use serverless databases that offer connection pooling or HTTP APIs instead.

Turso (SQLite replica + HTTP API) offers instant connections, global replicas, and embedded JS SDK.

Neon Serverless provides connection pooling, bypassing connection overhead.

D1 (Cloudflare's native SQLite database) is queried through a Worker binding: there is no connection handshake at all, though queries still travel to wherever the database lives.

// edge-db-query.ts - Turso example
import { createClient } from '@libsql/client/web';

export interface Env {
  DB_URL: string;
  DB_TOKEN: string;
}

export async function onRequest(context: {
  request: Request;
  env: Env;
}): Promise<Response> {
  const { request, env } = context;

  const client = createClient({
    url: env.DB_URL,
    authToken: env.DB_TOKEN,
  });

  try {
    // HTTP-based query (no persistent connection)
    const result = await client.execute({
      sql: `
        SELECT id, email, created_at FROM users
        WHERE country = ?
        ORDER BY created_at DESC
        LIMIT 50
      `,
      args: [request.headers.get('cf-ipcountry') || 'US'],
    });

    return Response.json(result.rows);
  } catch (error) {
    return new Response(`Query failed: ${error}`, { status: 500 });
  }
}

When Edge Adds Complexity, Not Performance

Edge is not always faster. If your logic requires:

  • Large data processing (>1MB)
  • Multiple sequential API calls
  • Complex business logic with side effects
  • Database transactions (use origin)

Then keep code on origin. Edge adds value for:

  • Auth/token validation
  • Simple redirects and rewrites
  • Request filtering
  • Geolocation routing

Cost trap: Paying for edge execution when origin is the bottleneck. Profile first.
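One way to formalize "profile first" is to weigh the latency saved per request against the monthly edge bill. The numbers and the 50ms threshold below are illustrative assumptions, not a universal rule:

```typescript
// Back-of-the-envelope: is moving a code path to the edge worth it?
interface Estimate {
  requestsPerMonth: number;
  originLatencyMs: number;    // measured round trip via origin
  edgeLatencyMs: number;      // expected latency served from the edge
  edgeCostPerMillion: number; // e.g. $0.65 on some plans
}

function edgeWorthIt(e: Estimate): {
  savedMsPerRequest: number;
  monthlyCost: number;
  worthIt: boolean;
} {
  const savedMsPerRequest = e.originLatencyMs - e.edgeLatencyMs;
  const monthlyCost = (e.requestsPerMonth / 1_000_000) * e.edgeCostPerMillion;
  // Rule of thumb used here (an assumption): pay only if you shave >= 50ms
  return { savedMsPerRequest, monthlyCost, worthIt: savedMsPerRequest >= 50 };
}
```

If the saving is a few milliseconds because origin processing dominates, the edge bill buys you nothing.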

Cold Start: Edge vs Lambda Comparison

Platform             Cold start   CPU time    Memory     Concurrency
Cloudflare Workers   10ms         50ms        128MB      Unlimited
Vercel Edge          50ms         100ms       512MB      Per-region limit
Lambda@Edge          100-500ms    100ms       3008MB     Queued
EC2 (always warm)    0ms          Unlimited   Unlimited  Paid 24/7

Cloudflare wins on cold start. Lambda wins on resources. For 99% of edge use cases, Cloudflare Workers' 10ms cold start is imperceptible.
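Cold starts only matter in proportion to how often they happen. A one-line expectation makes that concrete (the cold-hit fraction is workload-dependent and assumed here):

```typescript
// Expected per-request latency when a fraction of requests land on a
// cold instance. coldFraction is an assumption; measure it for your traffic.
function expectedLatencyMs(warmMs: number, coldStartMs: number, coldFraction: number): number {
  return warmMs + coldFraction * coldStartMs;
}
```

With 1% cold hits, a 10ms cold start adds 0.1ms on average, while a 500ms Lambda@Edge cold start adds 5ms: still small in the mean, but visible at p99.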

Checklist

  • Choose platform based on logic complexity and geography (Cloudflare/Vercel/Lambda)
  • Identify edge-compatible constraints (no fs, no persistent connections)
  • Cache tokens and feature flags in KV with short TTL
  • Use Durable Objects for strongly-consistent rate limiting
  • Validate JWT at edge before proxying to origin
  • Route requests by geolocation (cf-ipcountry) for regional APIs
  • Use Turso/D1 for edge-friendly databases (HTTP API)
  • Benchmark: measure origin latency vs edge latency gains
  • Keep business logic on origin; move auth/routing to edge
  • Monitor edge execution time and error rates

Conclusion

Edge functions collapse latency for authenticated users and simple middleware. Cloudflare Workers excel at global auth and request routing with zero cold starts. Vercel Edge integrates cleanly with Next.js SSR. Lambda@Edge suits heavy computation. Start with edge for authentication; measure if you need more. The best edge deployment is the one you don't overengineer.