Edge Runtime

Key idea:

Edge Runtime — a JavaScript runtime that executes on CDN edge locations, close to the user. Not Node.js: V8 Isolates (Cloudflare Workers, Vercel Edge), Deno Deploy, or Fastly Compute@Edge. Near-zero cold starts and <50 ms global latency. Limits: a subset of the Node API (no fs, worker_threads), typically 128 MB RAM and ~50 ms CPU time per request.

Below: details, example, FAQ.

Details

  • Cloudflare Workers: 300+ locations, V8 Isolates (not containers)
  • Vercel Edge: Cloudflare backend, Next.js integration
  • Deno Deploy: V8 + native Deno APIs
  • Fastly Compute@Edge: WASM-based, any language
  • Limits: 50ms CPU, 128 MB RAM, no native modules, limited fs
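The isolate model above can be sketched as a minimal Cloudflare Worker. The handler shape (an object with a `fetch` method, normally the module's default export) is Workers' module syntax; the `city` query parameter is illustrative:

```javascript
// A minimal Cloudflare Worker sketch (module syntax).
// Each request runs in a V8 isolate: no container, no Node.js process.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    // 'city' is an illustrative query parameter, not a Workers API.
    const city = url.searchParams.get('city') ?? 'unknown';
    // Only Web-standard APIs (Request, Response, URL, fetch) are available;
    // Node modules such as fs or worker_threads are not.
    return new Response(JSON.stringify({ city }), {
      headers: { 'content-type': 'application/json' },
    });
  },
};

// In a real Worker, this object is the module's default export:
// export default worker;
```

Because the handler only touches Web-standard APIs, the same function runs unchanged on any V8-isolate platform.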

Example

// Next.js Edge Function (App Router route handler)
export const runtime = 'edge';

export async function GET(request) {
  const { searchParams } = new URL(request.url);
  const city = searchParams.get('city');
  // Runs in the edge location nearest the caller; total latency is
  // dominated by the upstream fetch, not by cold start.
  const weather = await fetch(`https://api.weather.gov/`);
  if (!weather.ok) {
    return Response.json({ error: 'upstream failure', city }, { status: 502 });
  }
  return Response.json({ city, data: await weather.json() });
}

Frequently Asked Questions

Edge vs serverless?

Serverless (e.g., AWS Lambda): full Node.js, deployed to one region, 100-500ms cold starts. Edge: V8 isolates in 300+ locations, near-zero cold starts. Edge wins on latency; Lambda wins on heavy compute and full API access.
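In Next.js, that trade-off is a per-route setting: the `runtime` segment config selects the runtime, so each route can opt into either (the file paths below are illustrative):

```javascript
// app/api/geo/route.js: latency-sensitive, edge-friendly
export const runtime = 'edge';

// app/api/report/route.js: heavy compute, needs full Node.js
// export const runtime = 'nodejs';
```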

What is not allowed on edge?

Native modules (sharp, bcrypt), filesystem access, and long-running tasks (>50ms CPU). Use a serverless function (Lambda) for those.

Good for APIs?

Yes, for simple APIs: auth middleware, redirects, geolocation. For complex DB operations, use a regular Node.js serverless function.
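One of those edge-friendly tasks, a geolocation redirect, can be sketched framework-free as a plain Request-to-Response function. The `x-vercel-ip-country` header is Vercel-specific (Cloudflare exposes the same data as `request.cf.country`), so treat the header name as an assumption of this sketch:

```javascript
// Geolocation redirect at the edge: send German visitors to /de/*.
// Uses only Web-standard APIs, so it fits edge runtime constraints.
function geoRedirect(request) {
  // Header name is platform-specific (Vercel); an assumption here.
  const country = request.headers.get('x-vercel-ip-country') ?? 'US';
  const url = new URL(request.url);
  if (country === 'DE' && !url.pathname.startsWith('/de')) {
    url.pathname = '/de' + url.pathname;
    return Response.redirect(url, 307); // temporary redirect, keeps method
  }
  return null; // no redirect: fall through to the normal route handler
}
```

Because it is just a function of a `Request`, the same logic drops into Next.js middleware or a Cloudflare Worker with a thin wrapper.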