# RunPod GPU Pricing, Specs & Review
Last updated: 2026-03-17
## Overview
RunPod is a developer-friendly GPU cloud platform offering affordable on-demand and spot GPU instances alongside serverless GPU endpoints. With entry-level GPUs from $0.17/hr and H100 instances at $2.79/hr (roughly 77% below the comparable AWS on-demand rate), it is popular among AI developers, researchers, and startups. The platform supports a wide range of NVIDIA GPUs, including the H100, H200, B200, A100, RTX 4090, and RTX 3090, in both community and secure cloud configurations.
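The 77% figure is simple arithmetic on the H100 rates cited in this review ($2.79/hr on RunPod vs. $12.29/hr on AWS); a quick check:

```python
# Savings of RunPod's H100 rate relative to the AWS on-demand rate
# cited later in this review.
runpod_h100 = 2.79   # $/hr
aws_h100 = 12.29     # $/hr

savings = 1 - runpod_h100 / aws_h100
print(f"{savings:.0%}")  # → 77%
```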
## GPU Pricing
| GPU | VRAM | On-Demand ($/hr) | Spot ($/hr) | Monthly (~720 hrs) |
|---|---|---|---|---|
| B200 | 192 GB | $5.98 | $4.19 | ~$4,306 |
| H200 | 141 GB | $3.59 | $2.51 | ~$2,585 |
| H100 SXM | 80 GB | $2.79 | $1.95 | ~$2,009 |
| A100 SXM 80GB | 80 GB | $1.64 | $1.15 | ~$1,181 |
| A100 PCIe 40GB | 40 GB | $0.60 | $0.42 | ~$432 |
| RTX A6000 | 48 GB | $0.49 | $0.34 | ~$353 |
| RTX 4090 | 24 GB | $0.44 | $0.31 | ~$317 |
| RTX 3090 | 24 GB | $0.22 | $0.15 | ~$158 |
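The Monthly column assumes roughly 720 billable hours per month (30 days × 24 hours); a minimal sketch reproducing the on-demand estimates above:

```python
# Estimate monthly cost from the on-demand hourly rates in the table above,
# assuming ~720 billable hours per month (30 days x 24 hrs).
HOURS_PER_MONTH = 720

on_demand_rates = {          # GPU -> $/hr, from the pricing table
    "B200": 5.98,
    "H100 SXM": 2.79,
    "RTX 4090": 0.44,
    "RTX 3090": 0.22,
}

for gpu, rate in on_demand_rates.items():
    print(f"{gpu}: ~${round(rate * HOURS_PER_MONTH):,}/month")
# → B200: ~$4,306/month ... RTX 3090: ~$158/month
```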
## Key Features
- On-demand and spot GPU instances
- Serverless GPU endpoints for inference
- Wide GPU selection: H100, H200, B200, A100, RTX 4090, RTX 3090
- Community and secure cloud deployment options
- Per-second billing with no minimum commitment
- Pre-built templates for popular ML frameworks
- API and CLI for programmatic deployment
- Up to 77% cheaper than hyperscale providers
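Programmatic deployment via the API can be sketched as assembling a pod request and submitting it with RunPod's Python SDK. The field names below are illustrative rather than RunPod's exact schema, and the SDK call shown in the comment should be verified against the current documentation:

```python
# Illustrative sketch: build a GPU pod request. Field names here are
# assumptions for illustration, not RunPod's exact API schema.
def make_pod_config(name: str, gpu_type: str, image: str, gpu_count: int = 1) -> dict:
    """Assemble deployment parameters for a GPU pod."""
    if gpu_count < 1:
        raise ValueError("gpu_count must be >= 1")
    return {
        "name": name,
        "gpu_type_id": gpu_type,
        "image_name": image,
        "gpu_count": gpu_count,
    }

config = make_pod_config("train-job", "NVIDIA GeForce RTX 4090", "runpod/pytorch")
print(config["gpu_type_id"])  # → NVIDIA GeForce RTX 4090

# Submitting would look roughly like this (requires an API key; check the
# runpod SDK docs for the exact create_pod signature):
#   import runpod
#   runpod.api_key = "YOUR_API_KEY"
#   pod = runpod.create_pod(**config)
```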
## Pros & Cons
### Pros
- Very affordable pricing (H100 at $2.79/hr vs $12.29 on AWS)
- Spot instances with additional 30-60% savings
- Serverless GPU endpoints for auto-scaling inference
- Wide range of GPU types including consumer GPUs
- Fast provisioning with pre-built ML templates
### Cons
- Community cloud may have variable reliability
- Less enterprise support compared to hyperscalers
- Spot instances can be interrupted
- Fewer compliance certifications
## FAQ
### What GPU models does RunPod offer?
RunPod offers NVIDIA H100, H200, B200, A100 80GB/40GB, A40, RTX 4090, RTX 3090, and RTX A6000 GPUs in both secure cloud and community cloud configurations.
### What is the cheapest GPU option on RunPod?
RunPod's entry-level GPUs start at $0.17/hr. Among the GPUs listed above, the RTX 3090 is cheapest at $0.22/hr on-demand ($0.15/hr spot), with the A100 PCIe 40GB from $0.60/hr and the H100 from $2.79/hr.
### Does RunPod offer spot pricing?
Yes. RunPod offers spot (interruptible) instances with 30-60% savings over on-demand pricing. Spot instances are ideal for fault-tolerant training workloads.
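Because spot pods can be reclaimed at any time, fault-tolerant training typically checkpoints progress periodically and resumes from the last checkpoint on restart. A minimal framework-agnostic sketch (the checkpoint path and step counts are arbitrary; on RunPod the checkpoint should live on a persistent volume):

```python
import json
import os

CKPT = "checkpoint.json"  # arbitrary path; use a persistent volume in practice

def load_checkpoint() -> int:
    """Return the last completed step, or 0 on a fresh start."""
    if os.path.exists(CKPT):
        with open(CKPT) as f:
            return json.load(f)["step"]
    return 0

def save_checkpoint(step: int) -> None:
    """Persist progress so an interruption only loses work since the last save."""
    with open(CKPT, "w") as f:
        json.dump({"step": step}, f)

start = load_checkpoint()
for step in range(start, 100):
    # ... one training step would go here ...
    if step % 10 == 0:       # checkpoint every 10 steps
        save_checkpoint(step)
save_checkpoint(100)         # mark the run complete
```

If the pod is interrupted mid-run, restarting the same script picks up from the most recent saved step instead of step 0.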