NVIDIA A100 80GB PCIe
Ampere Architecture · 80GB HBM2E · PCIe
- VRAM: 80GB
- FP16: 312.0 TFLOPS
- TDP: 300W
- Hardware Price: - (MSRP: $11k)
- Cloud from: - (0 providers)
Quick Insights
- Performance/Dollar: N/A (FP16 performance per $1000)
- VRAM/Dollar: N/A (VRAM per $1000)
- vs Data Center Average: +74% perf (FP16 TFLOPS comparison)
- Cloud Availability: Not available (check providers)
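The per-dollar insights above show N/A because no tracked hardware price is on file, but a rough figure can be derived from the $11k MSRP listed above. A minimal sketch (variable names are illustrative, not from the site):

```python
# Rough per-$1,000 value metrics for the A100 80GB PCIe,
# using the $11k MSRP listed above (street prices vary widely).
fp16_tflops = 312.0   # FP16 TFLOPS from the spec table
vram_gb = 80.0        # VRAM in GB
msrp_k_usd = 11.0     # MSRP in thousands of USD

perf_per_1000 = fp16_tflops / msrp_k_usd   # TFLOPS per $1,000
vram_per_1000 = vram_gb / msrp_k_usd       # GB per $1,000

print(f"{perf_per_1000:.1f} TFLOPS per $1,000")  # ~28.4
print(f"{vram_per_1000:.1f} GB per $1,000")      # ~7.3
```

At MSRP that works out to roughly 28 FP16 TFLOPS and 7 GB of HBM2E per $1,000 spent.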
Specifications
| Spec | Value |
|---|---|
| VRAM | 80GB HBM2E |
| Memory Bandwidth | 2.0 TB/s |
| FP16 TFLOPS | 312.0 |
| Tensor TFLOPS | 624.0 |
| FP32 TFLOPS | 19.5 |
| TDP | 300W |
| Form Factor | - |
| Architecture | Ampere |
| NVLink | No |
| Release Date | 2021-06-28 |
Cloud GPU Pricing
No cloud pricing data available for NVIDIA A100 80GB PCIe yet.
NVIDIA A100 80GB PCIe vs Alternatives
Compare NVIDIA A100 80GB PCIe with similar GPUs from other brands.
| GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price |
|---|---|---|---|---|---|
| NVIDIA A100 80GB PCIe (this card) | 80GB | 312.0 | 2.0 TB/s | - | - |
| AMD Instinct MI210 | 64GB (-20%) | 181.0 (-42%) | 1.6 TB/s | - | - |
| MI300 | 128GB (+60%) | 490.3 (+57%) | 5.3 TB/s | $15k | - |
| AMD Instinct MI300A | 128GB (+60%) | 980.0 (+214%) | 5.3 TB/s | - | - |
| AMD Instinct MI250X | 128GB (+60%) | 383.0 (+23%) | 3.3 TB/s | - | - |
| AMD Instinct MI250 | 128GB (+60%) | 362.0 (+16%) | 3.3 TB/s | - | - |
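The signed percentages in the table are each alternative's spec relative to the A100 80GB PCIe baseline. A minimal sketch of that calculation (function name is illustrative):

```python
# How the (+/-%) deltas in the comparison table are derived:
# signed percent difference vs. the A100 80GB PCIe baseline.
BASELINE_FP16 = 312.0  # A100 80GB PCIe FP16 TFLOPS

def delta_pct(value: float, baseline: float = BASELINE_FP16) -> int:
    """Signed percent difference vs. the baseline, rounded to an int."""
    return round((value - baseline) / baseline * 100)

print(delta_pct(181.0))  # MI210  -> -42
print(delta_pct(490.3))  # MI300  -> 57
print(delta_pct(980.0))  # MI300A -> 214
```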
Best Use Cases
No specific use case recommendations for NVIDIA A100 80GB PCIe yet.