# AMD Instinct MI100

CDNA Architecture · 32GB HBM2 · PCIe
- VRAM: 32GB
- FP16: 184.6 TFLOPS
- TDP: 300W
- Hardware Price: - (MSRP: $5.0k)
- Cloud from: - (0 providers)
## Quick Insights

- Performance/Dollar: N/A (FP16 performance per $1000)
- VRAM/Dollar: N/A (VRAM per $1000)
- vs Data Center Average: +3% perf (FP16 TFLOPS comparison)
- Cloud Availability: Not available
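The Performance/Dollar and VRAM/Dollar metrics are simple per-$1000 ratios; they show N/A here because no current hardware price is listed. As a minimal sketch (the function name is illustrative, not from the site), here is the same calculation using the $5.0k MSRP instead:

```python
def per_thousand_dollars(value: float, price_usd: float) -> float:
    """Normalize a spec value per $1000 of price."""
    return value / (price_usd / 1000)

MSRP_USD = 5000       # AMD Instinct MI100 MSRP from the spec card
FP16_TFLOPS = 184.6
VRAM_GB = 32

perf_per_k = per_thousand_dollars(FP16_TFLOPS, MSRP_USD)
vram_per_k = per_thousand_dollars(VRAM_GB, MSRP_USD)
print(f"{perf_per_k:.1f} TFLOPS per $1000, {vram_per_k:.1f} GB per $1000")
```

At MSRP this works out to roughly 36.9 FP16 TFLOPS and 6.4 GB of VRAM per $1000.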
## Specifications

| Spec | Value |
|---|---|
| VRAM | 32GB HBM2 |
| Memory Bandwidth | 1.2 TB/s |
| FP16 TFLOPS | 184.6 |
| Tensor TFLOPS | 184.6 |
| FP32 TFLOPS | 23.1 |
| TDP | 300W |
| Form Factor | - |
| Architecture | CDNA |
| NVLink | No |
| Release Date | 2020-11-16 |
## Cloud GPU Pricing

No cloud pricing data is available for the AMD Instinct MI100 yet.

## AMD Instinct MI100 vs Alternatives

Compare the AMD Instinct MI100 with similar GPUs from other brands.
| GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price |
|---|---|---|---|---|---|
| AMD Instinct MI100 (this GPU) | 32GB | 184.6 | 1.2 TB/s | - | - |
| NVIDIA V100 | 32GB (+0%) | 31.4 (-83%) | 900 GB/s | $2.5k | $0.140/hr |
| NVIDIA RTX 5090 | 32GB (+0%) | - | - | - | $0.890/hr |
| NVIDIA RTX 5000 Ada | 32GB (+0%) | - | - | - | $0.830/hr |
| NVIDIA V100 SXM2 32GB | 32GB (+0%) | - | - | - | $0.490/hr |
| NVIDIA A100 40GB | 40GB (+25%) | 78.0 (-58%) | 1.6 TB/s | $8.0k | $0.720/hr |
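The parenthesized deltas in the table are each alternative's spec expressed as a rounded percent change against the MI100 baseline. A minimal sketch of that calculation (the helper name is illustrative):

```python
def pct_vs_baseline(value: float, baseline: float) -> int:
    """Rounded percent difference of `value` relative to `baseline`."""
    return round((value / baseline - 1) * 100)

MI100_FP16 = 184.6  # FP16 TFLOPS baseline
MI100_VRAM = 32     # VRAM (GB) baseline

# Deltas as shown in the comparison table:
assert pct_vs_baseline(31.4, MI100_FP16) == -83   # V100 FP16
assert pct_vs_baseline(78.0, MI100_FP16) == -58   # A100 40GB FP16
assert pct_vs_baseline(40, MI100_VRAM) == 25      # A100 40GB VRAM
```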
## Best Use Cases

No specific use case recommendations are available for the AMD Instinct MI100 yet.

## Compare AMD Instinct MI100
### Other AMD GPUs
- MI355X 288GB · $25k
- MI300X 192GB · $18k
- MI300 128GB · $15k
- AMD Radeon RX 7900 XT 20GB · -
- AMD Radeon RX 7900 XTX 24GB · -
### Alternatives
- V100 32GB · $2.5k
- RTX 5090 32GB · -
- RTX 5000 Ada 32GB · -
- V100 SXM2 32GB · -
- A100 40GB · $8.0k