Serverless Containers — run Docker images in a managed environment with per-request or per-second billing and scale-to-zero. Unlike Lambda (short requests, small bundles), they support long-running workloads, more RAM/CPU, and any language or binary. Providers: Google Cloud Run, AWS ECS on Fargate (with Fargate Spot), Azure Container Apps, Fly.io Machines. Ideal for API servers, background jobs, and AI inference (with a GPU).
Below: details, example, related terms, FAQ.
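Any image works as long as the server binds the port the platform injects via $PORT. A minimal Dockerfile sketch — the Python base image, gunicorn, and the `app:app` module path are illustrative assumptions, not a required setup:

```dockerfile
# Minimal sketch — base image, dependencies, and app module are assumed.
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
# Cloud Run sets PORT at runtime; bind to it, defaulting to 8080 for local runs.
CMD ["sh", "-c", "exec gunicorn -b 0.0.0.0:${PORT:-8080} app:app"]
```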
# Deploy a Docker image to Cloud Run
$ gcloud run deploy api \
--image=gcr.io/project/app:v1 \
--min-instances=0 \
--max-instances=100 \
--memory=1Gi \
--cpu=1 \
--timeout=300 \
--concurrency=80

Container: long-running, large bundle, GPU, exotic language. Lambda: fast HTTP APIs (<30 s), tight bundle, native Node/Python/Go.
Do cold starts matter? Low-traffic APIs — no (a ~2 s cold start is fine). Real-time workloads — keep min-instances >= 1 to avoid them.
Fly.io Machines — flat ~$2/month for 256 MB / shared CPU. Cloud Run — pay-per-request, cheapest when idle (scale-to-zero means an idle service costs nothing).
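A back-of-envelope comparison of the two billing models. The request volume and the per-million-request rate below are illustrative assumptions, not current list prices, and Cloud Run's CPU/memory-time charges are omitted:

```shell
# Flat-rate machine vs request-based billing — rough sketch.
# All prices are assumptions, not current list prices.
reqs_per_month=100000
price_per_million="0.40"   # USD per 1M requests (assumed)
cloud_run=$(awk -v r="$reqs_per_month" -v p="$price_per_million" \
  'BEGIN { printf "%.2f", r / 1000000 * p }')
fly_flat="2.00"            # flat monthly rate (assumed)
echo "Cloud Run: \$$cloud_run/month"   # low traffic: pennies
echo "Fly.io:    \$$fly_flat/month"    # constant, even when idle
```

At these assumed rates the two models cross at roughly 5M requests/month; below that, pay-per-request wins, above it the flat-rate machine does.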