RunPod GPU Pricing, Specs & Review

Last updated: 2026-03-17

We earn commissions when you shop through the links on this page.

Overview

RunPod is a developer-friendly GPU cloud platform offering affordable on-demand and spot GPU instances alongside serverless GPU endpoints. Pricing starts at $0.17/hr for entry-level GPUs, with H100 instances at $2.79/hr (roughly 77% cheaper than comparable AWS on-demand rates), which has made RunPod popular among AI developers, researchers, and startups. It supports a wide range of NVIDIA GPUs, including the H100, H200, B200, A100, RTX 4090, and RTX 3090, with both community and secure cloud options.

GPU Pricing

| GPU             | VRAM   | On-Demand $/hr | Spot $/hr | Monthly (~720 hrs) |
|-----------------|--------|----------------|-----------|--------------------|
| B200            | 192 GB | $5.98          | $4.19     | ~$4,306            |
| H200            | 141 GB | $3.59          | $2.51     | ~$2,585            |
| H100 SXM        | 80 GB  | $2.79          | $1.95     | ~$2,009            |
| A100 SXM 80GB   | 80 GB  | $1.64          | $1.15     | ~$1,181            |
| A100 PCIe 40GB  | 40 GB  | $0.60          | $0.42     | ~$432              |
| RTX A6000       | 48 GB  | $0.49          | $0.34     | ~$353              |
| RTX 4090        | 24 GB  | $0.44          | $0.31     | ~$317              |
| RTX 3090        | 24 GB  | $0.22          | $0.15     | ~$158              |

View GPU instances on RunPod →
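The Monthly figures in the table are consistent with a ~720-hour month (30 days of round-the-clock use). A minimal sketch of that arithmetic, using a few illustrative rates copied from the table above:

```python
# Estimate monthly cost from RunPod's hourly on-demand rates.
# Assumption: the table's "Monthly" column is hourly rate x 720 hours
# (30 days x 24 hours of continuous use). Rates are illustrative,
# copied from the pricing table above and subject to change.

HOURS_PER_MONTH = 720  # 30 days x 24 hours

on_demand = {
    "H100 SXM": 2.79,
    "A100 PCIe 40GB": 0.60,
    "RTX 3090": 0.22,
}

for gpu, rate in on_demand.items():
    monthly = rate * HOURS_PER_MONTH
    print(f"{gpu}: ${rate:.2f}/hr -> ~${monthly:,.0f}/month")
```

At partial utilization the cost scales linearly with hours used, which is why per-hour billing tends to favor bursty workloads over always-on reserved capacity.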

Key Features

Pros & Cons

Pros

Cons

FAQ

What GPU models does RunPod offer?

RunPod offers NVIDIA H100, H200, B200, A100 80GB/40GB, A40, RTX 4090, RTX 3090, and RTX A6000 GPUs in both secure cloud and community cloud configurations.

What is the cheapest GPU option on RunPod?

RunPod instances start at $0.17/hr for basic GPUs. RTX 3090 on-demand is $0.22/hr, A100 PCIe 40GB from $0.60/hr, and H100 from $2.79/hr.

Does RunPod offer spot pricing?

Yes. RunPod offers spot (interruptible) instances with 30-60% savings over on-demand pricing. Spot instances are ideal for fault-tolerant training workloads.
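The quoted savings can be sanity-checked against the pricing table; a quick sketch (rates copied from the table above, so exact figures may drift as RunPod updates pricing). Note that the table's current spot rates land near the 30% end of the quoted range:

```python
# Compute the spot discount relative to on-demand pricing.
# Assumption: rates are the illustrative $/hr figures from the
# pricing table above, not values fetched from any RunPod API.

prices = {  # gpu: (on_demand $/hr, spot $/hr)
    "H100 SXM": (2.79, 1.95),
    "A100 SXM 80GB": (1.64, 1.15),
    "RTX 3090": (0.22, 0.15),
}

for gpu, (od, spot) in prices.items():
    discount = (1 - spot / od) * 100
    print(f"{gpu}: spot ${spot}/hr vs on-demand ${od}/hr ({discount:.0f}% off)")
```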

Compare Providers

GPU specifications comparison
Cloud GPU pricing comparison
AI accelerators comparison
AI inference benchmarks