NVIDIA A100 PCIe: Specs, Pricing & Cloud Availability

Last updated: 2026-03-19

We earn commissions when you shop through the links on this page.

Technical Specifications

| Specification | Value |
|---|---|
| Architecture | Ampere |
| VRAM | 80 GB HBM2e |
| Memory Bandwidth | 2.0 TB/s |
| FP16 Tensor Performance | 312 TFLOPS (624 TFLOPS with structured sparsity) |
| FP8 Performance | Not supported (FP8 requires Hopper or newer) |
| TDP | 300W |
| Interconnect | PCIe Gen4 |
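When renting cloud GPUs, it is worth confirming that the instance actually exposes the card you paid for. One way is to parse the CSV output of `nvidia-smi`; the sketch below does that (the sample string is illustrative, not captured from a live node, and the helper names are ours):

```python
import csv
import io
import subprocess

def parse_gpu_info(csv_text: str) -> list[dict]:
    """Parse `nvidia-smi --query-gpu=... --format=csv` output into dicts."""
    reader = csv.reader(io.StringIO(csv_text), skipinitialspace=True)
    header = next(reader)
    return [dict(zip(header, row)) for row in reader]

def query_gpus() -> list[dict]:
    # Ask the driver for name, total VRAM, and power limit.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total,power.limit",
         "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_gpu_info(out)

# Illustrative output for an 80 GB A100 PCIe (hypothetical sample, not a
# capture from a real instance):
sample = ("name, memory.total [MiB], power.limit [W]\n"
          "NVIDIA A100 80GB PCIe, 81920 MiB, 300.00 W\n")
gpus = parse_gpu_info(sample)
print(gpus[0]["name"])  # NVIDIA A100 80GB PCIe
```

On a real node you would call `query_gpus()` instead of parsing the sample string; 81920 MiB corresponds to the 80 GB of HBM2e in the spec table.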

Cloud Pricing

| Provider | On-Demand $/hr | Spot $/hr | Availability |
|---|---|---|---|
| Microsoft Azure | $1.20-$2.50 | N/A | Available |
| RunPod | $1.20-$2.50 | N/A | Available |
| Lambda Labs | $1.20-$2.50 | N/A | Available |
| CoreWeave | $1.20-$2.50 | N/A | Available |
| Together AI | $1.20-$2.50 | N/A | Available |
| Vast.ai | $1.20-$2.50 | N/A | Available |
| Vultr | $1.20-$2.50 | N/A | Available |
| Oracle Cloud (OCI) | $1.20-$2.50 | N/A | Available |
| Cudo Compute | $1.20-$2.50 | N/A | Available |
| FluidStack | $1.20-$2.50 | N/A | Available |
| Paperspace (DigitalOcean) | $1.20-$2.50 | N/A | Available |
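Hourly rates add up quickly for always-on workloads. A simple way to budget is hours × rate × GPU count; the helper below (our own illustration, using the $1.20-$2.50/hr range from the table) estimates monthly spend:

```python
def monthly_cost(rate_per_hr: float, gpus: int = 1, hours: float = 730.0) -> float:
    """Estimate monthly on-demand spend (730 h is roughly an average month)."""
    return rate_per_hr * gpus * hours

# Bounds of the $1.20-$2.50/hr range quoted above, for a single GPU:
low = monthly_cost(1.20)
high = monthly_cost(2.50)
print(f"${low:,.0f} - ${high:,.0f} per month")  # $876 - $1,825 per month
```

At these rates, a single A100 PCIe running around the clock costs roughly $900-$1,800 per month on-demand, which is the figure to compare against a purchase or reserved-capacity quote.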

Benchmarks

The NVIDIA A100 PCIe delivers 312 TFLOPS of dense FP16 Tensor Core throughput (624 TFLOPS with structured sparsity) and 19.5 TFLOPS FP32, backed by 2.0 TB/s of memory bandwidth.

Best Use Cases

The NVIDIA A100 PCIe is optimized for cost-effective ML workloads, such as fine-tuning and batch inference jobs that need its 80 GB of HBM2e without the price of newer Hopper-class hardware.
