NVIDIA A100 40GB SXM
Ampere Architecture · 40GB HBM2 · SXM
- VRAM: 40GB
- FP16: 312.0 TFLOPS
- TDP: 400W
- Hardware Price: - (MSRP: $10k)
- Cloud from: $1.29/hr (1 provider, cheapest at Lambda Labs)
Quick Insights
- Performance/Dollar: N/A (FP16 performance per $1000)
- VRAM/Dollar: N/A (VRAM per $1000)
- vs Data Center Average: +74% perf (FP16 TFLOPS comparison)
- Cloud Availability: 1 provider, from $1.29/hr
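The per-$1000 insight metrics show N/A because no hardware street price is listed. As a hedged sketch, here is how they would be derived if the $10k MSRP above were used as the price (variable names are illustrative, not from any real API):

```python
# Hypothetical derivation of the "per $1000" metrics, assuming the $10k
# MSRP stands in for a street price (the page lists none, hence the N/A).
fp16_tflops = 312.0      # FP16 TFLOPS, from the spec table
vram_gb = 40             # VRAM, from the spec table
msrp_usd = 10_000        # MSRP shown above

perf_per_1000 = fp16_tflops / (msrp_usd / 1000)   # TFLOPS per $1000
vram_per_1000 = vram_gb / (msrp_usd / 1000)       # GB per $1000

print(f"{perf_per_1000:.1f} FP16 TFLOPS per $1000")  # 31.2
print(f"{vram_per_1000:.1f} GB VRAM per $1000")      # 4.0
```

At MSRP the card would score 31.2 FP16 TFLOPS and 4.0 GB of VRAM per $1000 spent.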
Specifications
| Spec | Value |
|---|---|
| VRAM | 40GB HBM2 |
| Memory Bandwidth | 1.6 TB/s |
| FP16 TFLOPS | 312.0 |
| Tensor TFLOPS | 624.0 |
| FP32 TFLOPS | 19.5 |
| TDP | 400W |
| Form Factor | - |
| Architecture | Ampere |
| NVLink | Yes (NVLink 3.0) |
| Release Date | 2020-05-14 |
Cloud GPU Pricing
Rent the NVIDIA A100 40GB SXM from 1 cloud provider. Prices shown per GPU per hour.
| Provider | Type | Instance | GPUs | On-Demand | Per GPU | Spot | Availability |
|---|---|---|---|---|---|---|---|
| Lambda Labs | gpu-cloud | lambda-a100-40gb-sxm | 1x | $1.29/hr | $1.29/hr | - | - |
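A rough rent-vs-buy break-even can be sketched from the numbers above, assuming the $10k MSRP and the cheapest on-demand rate in this table (ignoring power, hosting, depreciation, and resale value):

```python
# Rough rent-vs-buy break-even: $10k MSRP vs. the cheapest on-demand
# rate listed above. Assumption-laden back-of-envelope only.
msrp_usd = 10_000
rate_per_hr = 1.29                       # Lambda Labs on-demand, per GPU

breakeven_hours = msrp_usd / rate_per_hr
print(f"break-even after ~{breakeven_hours:,.0f} GPU-hours")   # ~7,752
print(f"≈ {breakeven_hours / 24 / 30:.1f} months of 24/7 use")
```

Under these assumptions, renting stays cheaper until roughly 7,750 GPU-hours, i.e. close to a year of continuous use.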
NVIDIA A100 40GB SXM vs Alternatives
Compare NVIDIA A100 40GB SXM with similar GPUs from other brands.
| GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price |
|---|---|---|---|---|---|
| NVIDIA A100 40GB SXM | 40GB | 312.0 | 1.6 TB/s | - | - |
| AMD Instinct MI100 | 32GB (-20%) | 184.6 (-41%) | 1.2 TB/s | - | - |
| AMD Radeon RX 7900 XTX | 24GB (-40%) | 122.0 (-61%) | 960 GB/s | - | - |
| AMD Radeon RX 7900 XT | 20GB (-50%) | 104.0 (-67%) | 800 GB/s | - | - |
| AMD Instinct MI210 | 64GB (+60%) | 181.0 (-42%) | 1.6 TB/s | - | - |
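The percentage deltas in the table are each alternative's spec relative to the A100 40GB SXM baseline, rounded to the nearest whole percent. A minimal sketch (GPU names and values taken from the table above):

```python
# How the comparison table's percentage deltas are computed: each
# alternative vs. the A100 40GB SXM baseline, rounded to whole percent.
baseline_vram = 40       # GB
baseline_fp16 = 312.0    # FP16 TFLOPS

alternatives = {
    "AMD Instinct MI100": (32, 184.6),
    "AMD Instinct MI210": (64, 181.0),
}

for name, (vram, fp16) in alternatives.items():
    vram_delta = round((vram - baseline_vram) / baseline_vram * 100)
    fp16_delta = round((fp16 - baseline_fp16) / baseline_fp16 * 100)
    print(f"{name}: VRAM {vram_delta:+d}%, FP16 {fp16_delta:+d}%")
# AMD Instinct MI100: VRAM -20%, FP16 -41%
# AMD Instinct MI210: VRAM +60%, FP16 -42%
```

The same arithmetic reproduces every delta shown in the table.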
Best Use Cases
No specific use case recommendations for NVIDIA A100 40GB SXM yet.
Alternatives
- AMD Instinct MI100 · 32GB
- AMD Radeon RX 7900 XTX · 24GB
- AMD Radeon RX 7900 XT · 20GB
- AMD Instinct MI210 · 64GB