NVIDIA A100 SXM: Specs, Pricing & Cloud Availability

Last updated: 2026-03-22

Technical Specifications

Architecture          Ampere
VRAM                  80 GB HBM2e
Memory Bandwidth      2.0 TB/s
FP16 Performance      624 TFLOPS (Tensor Core, with 2:4 sparsity; 312 TFLOPS dense)
FP8 Performance       Not supported (FP8 Tensor Cores were introduced with Hopper)
TDP                   400 W
Interconnect          NVLink 3.0 (600 GB/s)
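As a quick sanity check on what these figures imply, the memory bandwidth sets a hard floor on any operation that must touch all of VRAM once (e.g. a single decode step over 80 GB of weights). A minimal sketch using only the spec-sheet numbers above:

```python
# Lower bound on one full pass over VRAM, using the spec-sheet
# figures above: 80 GB HBM2e at 2.0 TB/s sustained bandwidth.
# Real kernels rarely reach peak bandwidth, so this is optimistic.
VRAM_BYTES = 80e9
BANDWIDTH_BYTES_PER_S = 2.0e12

sweep_ms = VRAM_BYTES / BANDWIDTH_BYTES_PER_S * 1e3
print(f"Full VRAM sweep: {sweep_ms:.0f} ms")  # 40 ms
```

This 40 ms floor is why memory bandwidth, not TFLOPS, usually bounds large-model inference throughput on this card.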

Cloud Pricing

Provider                     On-Demand $/hr   Spot $/hr   Availability
Microsoft Azure              $3.67            N/A         Available
RunPod                       $1.99            N/A         Available
Lambda Labs                  $1.29            N/A         Available
CoreWeave                    $2.21            N/A         Available
Together AI                  $2.00            N/A         Available
Vast.ai                      $1.60            N/A         Available
Vultr                        $2.85            N/A         Available
Oracle Cloud (OCI)           $3.50            N/A         Available
Cudo Compute                 $1.80            N/A         Available
FluidStack                   $1.75            N/A         Available
Paperspace (DigitalOcean)    $3.18            N/A         Available
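For budgeting, hourly rates are easier to compare as monthly totals. A small sketch that converts the on-demand rates from the table above into approximate monthly costs (assuming 730 hours/month and continuous use; actual bills vary with commitments, storage, and egress):

```python
# Approximate monthly on-demand cost per provider, from the hourly
# rates in the pricing table above. 730 h ~ one month of 24/7 use.
rates = {
    "Microsoft Azure": 3.67,
    "RunPod": 1.99,
    "Lambda Labs": 1.29,
    "CoreWeave": 2.21,
    "Together AI": 2.00,
    "Vast.ai": 1.60,
    "Vultr": 2.85,
    "Oracle Cloud (OCI)": 3.50,
    "Cudo Compute": 1.80,
    "FluidStack": 1.75,
    "Paperspace (DigitalOcean)": 3.18,
}
HOURS_PER_MONTH = 730

for provider, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{provider:28s} ${rate * HOURS_PER_MONTH:9,.2f}/mo")
```

At these rates the spread is large: roughly $942/month at the cheapest listed provider versus about $2,679/month at the most expensive, for the same GPU.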

Benchmarks

The NVIDIA A100 SXM delivers up to 624 TFLOPS of FP16 Tensor Core throughput (with 2:4 structured sparsity; 312 TFLOPS dense) and 19.5 TFLOPS of standard FP32, backed by 2.0 TB/s of memory bandwidth.
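Whether a workload can actually exploit that compute depends on its arithmetic intensity. A minimal roofline calculation from the two peak figures above (dense FP16 Tensor Core throughput and memory bandwidth):

```python
# Roofline break-even point: a kernel is compute-bound only if its
# arithmetic intensity (FLOPs per byte moved to/from HBM) exceeds
# peak_flops / peak_bandwidth. Figures from the spec sheet above.
PEAK_FP16_DENSE = 312e12  # FLOP/s, dense FP16 Tensor Core
PEAK_BW = 2.0e12          # bytes/s, HBM2e

break_even = PEAK_FP16_DENSE / PEAK_BW
print(f"Break-even intensity: {break_even:.0f} FLOPs/byte")  # 156
```

Large dense matrix multiplies sit well above 156 FLOPs/byte and run compute-bound; memory-heavy operations like attention over long KV caches typically sit below it and are bandwidth-bound instead.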

Best Use Cases

The NVIDIA A100 SXM is best suited to production ML workloads: large-scale model training, fine-tuning, and high-throughput inference.

FAQ

{{FAQ_SECTION}}