NVIDIA A100 PCIe: Specs, Pricing & Cloud Availability
Last updated: 2026-03-19
Technical Specifications
| Specification | Value |
|---|---|
| Architecture | Ampere |
| VRAM | 80 GB HBM2e |
| Memory Bandwidth | 1.94 TB/s (1,935 GB/s) |
| FP16 Tensor Core Performance | 312 TFLOPS dense (624 TFLOPS with 2:4 structured sparsity) |
| FP8 Performance | Not supported (FP8 requires Hopper or newer) |
| TDP | 300W |
| Interconnect | PCIe Gen4 |
Cloud Pricing
| Provider | On-Demand ($/hr) | Spot ($/hr) | Availability |
|---|---|---|---|
| Microsoft Azure | $1.20-$2.50 | N/A | Available |
| RunPod | $1.20-$2.50 | N/A | Available |
| Lambda Labs | $1.20-$2.50 | N/A | Available |
| CoreWeave | $1.20-$2.50 | N/A | Available |
| Together AI | $1.20-$2.50 | N/A | Available |
| Vast.ai | $1.20-$2.50 | N/A | Available |
| Vultr | $1.20-$2.50 | N/A | Available |
| Oracle Cloud (OCI) | $1.20-$2.50 | N/A | Available |
| Cudo Compute | $1.20-$2.50 | N/A | Available |
| FluidStack | $1.20-$2.50 | N/A | Available |
| Paperspace (DigitalOcean) | $1.20-$2.50 | N/A | Available |
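To turn hourly rates into a budget, multiply the rate by wall-clock hours and GPU count. A minimal sketch, using illustrative values from the range in the table above rather than quotes from any specific provider:

```python
# Rough on-demand cost estimate for a job on cloud A100 PCIe instances.
# Rates are illustrative examples from the $1.20-$2.50/hr range above,
# not current pricing from any one provider.

def job_cost(hours: float, rate_per_hr: float, num_gpus: int = 1) -> float:
    """Total on-demand cost in USD: hours x hourly rate x GPU count."""
    return hours * rate_per_hr * num_gpus

# Example: a 48-hour fine-tuning run on 4 GPUs.
low = job_cost(48, 1.20, num_gpus=4)   # low end of the quoted range
high = job_cost(48, 2.50, num_gpus=4)  # high end of the quoted range
print(f"Estimated cost: ${low:.2f}-${high:.2f}")
```

At these assumed rates, the same 4-GPU job can cost roughly twice as much at the top of the range as at the bottom, which is why provider comparison matters for long training runs.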
Benchmarks
The NVIDIA A100 PCIe delivers up to 312 TFLOPS of dense FP16 Tensor Core performance (624 TFLOPS with 2:4 structured sparsity), 19.5 TFLOPS of standard FP32, and roughly 1.94 TB/s of HBM2e memory bandwidth.
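The ratio of compute throughput to memory bandwidth determines which workloads these numbers actually help. A minimal roofline sketch, assuming the commonly quoted dense FP16 Tensor Core figure of 312 TFLOPS and ~1.94 TB/s bandwidth for the A100 80GB PCIe:

```python
# Roofline "ridge point": the arithmetic intensity (FLOPs per byte moved)
# above which a kernel is compute-bound rather than memory-bound.
# Both figures are spec-sheet assumptions for the A100 80GB PCIe.

PEAK_FP16_FLOPS = 312e12     # dense FP16 Tensor Core (624e12 with sparsity)
MEM_BANDWIDTH_BPS = 1.94e12  # ~1.94 TB/s HBM2e

ridge = PEAK_FP16_FLOPS / MEM_BANDWIDTH_BPS
print(f"Ridge point: {ridge:.0f} FLOPs/byte")
```

This lands around 160 FLOPs/byte: large matrix multiplies (training, prefill) can saturate the Tensor Cores, while low-intensity workloads such as batch-1 token generation are limited by memory bandwidth, not TFLOPS.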
Best Use Cases
The NVIDIA A100 PCIe is best suited to cost-effective ML workloads: fine-tuning and serving mid-sized models, large-batch inference, and HPC jobs that benefit from its 80 GB of HBM2e without paying the premium for newer Hopper-class hardware.
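A quick way to judge whether a model is a fit for a single A100 is to check whether its weights fit in 80 GB. The sketch below counts FP16 weight memory only (no activations, optimizer state, or KV cache, so real headroom is smaller), and the parameter counts are illustrative:

```python
# Does a model's weight memory fit in the A100's 80 GB of HBM2e?
# Weights only -- training and long-context inference need extra room
# for optimizer state, activations, and KV cache.

VRAM_GB = 80

def weights_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Weight memory in GB for a given parameter count (FP16 = 2 bytes)."""
    return num_params * bytes_per_param / 1e9

for params in (7e9, 30e9, 70e9):  # illustrative model sizes
    gb = weights_gb(params)
    verdict = "fits" if gb <= VRAM_GB else "does not fit"
    print(f"{params / 1e9:.0f}B params @ FP16: {gb:.0f} GB -> {verdict}")
```

By this rough measure a 30B-parameter FP16 model (60 GB of weights) fits on one card, while a 70B model (140 GB) needs quantization or a second GPU.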