Edge Runtime — a JavaScript runtime that executes on CDN edge locations, close to the user. It is not Node.js: providers run V8 isolates (Cloudflare Workers, Vercel Edge), Deno Deploy, or Fastly Compute@Edge. Typical figures: near-zero cold start, <50ms global latency. Limits: only a subset of Node APIs (no fs, no worker_threads), ~128 MB RAM, ~50ms CPU time per request.
Below: details, example, related terms, FAQ.
// Next.js Edge Function (route handler)
export const runtime = 'edge';

export async function GET(request) {
  const { searchParams } = new URL(request.url);
  const city = searchParams.get('city');
  // Served from the edge region nearest the caller: ~10ms network latency
  // (api.weather.gov keys forecasts on coordinates, not city names;
  // the city-to-endpoint lookup is elided here)
  const weather = await fetch(`https://api.weather.gov/`);
  return Response.json(await weather.json());
}

Serverless (Lambda): Node.js, one region, 100-500ms cold start. Edge: V8 isolates, 300+ regions, near-zero cold start. Edge wins on latency; Lambda wins on heavy compute.
Not supported: native modules (sharp, bcrypt), unbounded file I/O, long-running tasks (>50ms CPU). Use a serverless Lambda for those workloads.
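Because only the Web-standard API subset (URL, Headers, Request, Response, fetch) is available, edge handlers avoid Node built-ins entirely. A minimal sketch of that style, runnable in any V8 isolate or in Node 18+ (handle is a hypothetical handler name, not a framework API):

```javascript
// Edge-compatible request handling: Web-standard APIs only, no Node built-ins.
// URL and Response are globals in V8 isolates and in Node 18+.
function handle(url) {
  const { searchParams } = new URL(url);
  const city = searchParams.get('city') ?? 'unknown';
  // Echo the parsed query parameter back as JSON
  return new Response(JSON.stringify({ city }), {
    headers: { 'content-type': 'application/json' },
  });
}

const res = handle('https://example.com/api/weather?city=Berlin');
res.json().then((body) => console.log(body.city)); // prints "Berlin"
```

The same function runs unchanged on Cloudflare Workers, Vercel Edge, and plain Node, which is exactly the portability the Web-standard subset buys.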
A good fit for simple APIs (auth middleware, redirects, geolocation). For complex DB operations, use a regular Node.js lambda.
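The geolocation case needs nothing beyond headers the platform injects into each request. A sketch assuming the Vercel-style x-vercel-ip-country header (Cloudflare exposes cf-ipcountry instead); countryRedirect is a hypothetical helper, not a framework API:

```javascript
// Edge middleware sketch: redirect German visitors to /de based on the
// country header injected by the platform (x-vercel-ip-country assumed here).
function countryRedirect(request) {
  const country = request.headers.get('x-vercel-ip-country') ?? 'US';
  if (country === 'DE' && !new URL(request.url).pathname.startsWith('/de')) {
    return new Response(null, { status: 307, headers: { location: '/de' } });
  }
  return null; // no redirect: let the request continue to the origin
}

const req = new Request('https://example.com/pricing', {
  headers: { 'x-vercel-ip-country': 'DE' },
});
console.log(countryRedirect(req).status); // prints 307
```

Logic like this runs per request in well under the 50ms CPU budget, which is why routing and geolocation are the canonical edge use cases.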