
AMD Instinct MI100

CDNA Architecture · 32GB HBM2 · PCIe

VRAM: 32GB
FP16: 184.6 TFLOPS
TDP: 300W
Hardware Price: - (MSRP: $5.0k)
Cloud from: - (0 providers)

Quick Insights

Performance/Dollar: N/A (FP16 performance per $1,000)
VRAM/Dollar: N/A (VRAM per $1,000)
vs Data Center Average: +3% perf (FP16 TFLOPS comparison)
Cloud Availability: Not available
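The two dollar-normalized metrics above read N/A because no current hardware price is tracked for the MI100. A minimal sketch of how such metrics are computed, using the $5.0k launch MSRP as a stand-in price (an assumption, not a tracked street price):

```python
def per_1000_dollars(value: float, price_usd: float) -> float:
    """Normalize a spec (TFLOPS or GB) to 'per $1,000 of hardware price'."""
    return value / (price_usd / 1000)

# MI100 figures from this page; the $5,000 MSRP stands in for a
# current street price, which the site does not track (hence "N/A").
fp16_tflops = 184.6
vram_gb = 32
msrp_usd = 5000

perf_per_k = per_1000_dollars(fp16_tflops, msrp_usd)  # FP16 TFLOPS per $1k
vram_per_k = per_1000_dollars(vram_gb, msrp_usd)      # GB of VRAM per $1k
print(f"{perf_per_k:.2f} TFLOPS/$1k, {vram_per_k:.1f} GB/$1k")
```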

Specifications

VRAM: 32GB HBM2
Memory Bandwidth: 1.2 TB/s
FP16 TFLOPS: 184.6
Tensor TFLOPS: 184.6
FP32 TFLOPS: 23.1
TDP: 300W
Form Factor: -
Architecture: CDNA
NVLink: No
Release Date: 2020-11-16
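The spec sheet pairs 184.6 FP16 TFLOPS with 1.2 TB/s of memory bandwidth; dividing the two gives a rough machine-balance figure (a roofline-style quantity derived here, not shown on the page):

```python
# Machine balance: peak FP16 FLOPs the MI100 can issue per byte of
# HBM2 traffic, from the spec-sheet peaks (184.6 TFLOPS, 1.2 TB/s).
peak_fp16_flops = 184.6e12   # FLOP/s
peak_bandwidth = 1.2e12      # bytes/s

flops_per_byte = peak_fp16_flops / peak_bandwidth
# Kernels with lower arithmetic intensity than this are bandwidth-bound;
# higher, compute-bound (in the simple roofline model).
print(f"{flops_per_byte:.0f} FP16 FLOPs per byte")
```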

Cloud GPU Pricing

No cloud pricing data available for AMD Instinct MI100 yet.


AMD Instinct MI100 vs Alternatives

Compare AMD Instinct MI100 with similar GPUs from other brands.

GPU                            VRAM          FP16 TFLOPS   Bandwidth   Hardware Price   Cloud Price
AMD Instinct MI100 (current)   32GB          184.6         1.2 TB/s    -                -
NVIDIA V100                    32GB (+0%)    31.4 (-83%)   900 GB/s    $2.5k            $0.140/hr
NVIDIA RTX 5090                32GB (+0%)    -             -           -                $0.890/hr
NVIDIA RTX 5000 Ada            32GB (+0%)    -             -           -                $0.830/hr
NVIDIA V100 SXM2 32GB          32GB (+0%)    -             -           -                $0.490/hr
NVIDIA A100 40GB               40GB (+25%)   78.0 (-58%)   1.6 TB/s    $8.0k            $0.720/hr
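The percentages in the table are each alternative's spec relative to the MI100's own numbers. A sketch of that normalization (rounding to a whole percent, as the table does):

```python
def delta_pct(other: float, baseline: float) -> int:
    """Percent difference of a competitor spec vs. the MI100 baseline."""
    return round((other - baseline) / baseline * 100)

mi100_fp16, mi100_vram = 184.6, 32

print(delta_pct(31.4, mi100_fp16))  # V100 FP16 TFLOPS -> -83
print(delta_pct(78.0, mi100_fp16))  # A100 FP16 TFLOPS -> -58
print(delta_pct(40, mi100_vram))    # A100 VRAM        -> 25
```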

Best Use Cases

No specific use case recommendations for AMD Instinct MI100 yet.



Frequently Asked Questions about AMD Instinct MI100

How much does the AMD Instinct MI100 cost?
Pricing for the AMD Instinct MI100 varies. Its launch MSRP was $5.0k, and no cloud provider currently lists it; check the cloud pricing section above for rental options as they appear.

Is the AMD Instinct MI100 good for AI/ML?
Yes, the AMD Instinct MI100 with 32GB VRAM is suitable for many AI/ML workloads. For large language models, you may need multiple GPUs, or consider higher-VRAM options like the A100 or H100.

Should I buy an AMD Instinct MI100 or rent it in the cloud?
Consider buying for long-term, heavy usage (more than ~4 hrs/day). Rent from cloud providers for short-term projects, experimentation, or when you need to scale quickly.
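The buy-vs-rent trade-off can be sanity-checked with a break-even calculation. This sketch assumes the $5.0k MSRP, a 3-year horizon, and the $0.490/hr V100 SXM2 rate from the comparison table as a stand-in rental rate, since no MI100 cloud rate is tracked:

```python
# Daily usage at which buying beats renting over a chosen horizon.
# Assumed inputs: $5,000 MSRP, $0.49/hr (V100 SXM2 cloud rate used
# as a proxy), 3-year horizon. All three materially shift the result.
price_usd = 5000
rate_per_hr = 0.49
horizon_days = 3 * 365

breakeven_hrs_per_day = price_usd / (rate_per_hr * horizon_days)
print(f"{breakeven_hrs_per_day:.1f} hrs/day to break even")
```

Note that this ignores power, cooling, and resale value; a shorter horizon or cheaper cloud rate pushes the break-even point higher, favoring renting.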

What can the AMD Instinct MI100 run?
With 32GB VRAM and 184.6 FP16 TFLOPS, the AMD Instinct MI100 can run large language models (7B-13B parameters), Stable Diffusion XL, video AI, and professional 3D rendering.
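A rough way to check whether a model fits in the MI100's 32GB is weights (parameters times bytes per parameter) plus an overhead allowance; `fits_in_vram` and its 20% overhead factor are illustrative rules of thumb, not measurements:

```python
def fits_in_vram(params_billions: float, vram_gb: float = 32,
                 bytes_per_param: int = 2, overhead: float = 1.2) -> bool:
    """Rule of thumb: FP16 weights (2 bytes/param) plus ~20% headroom
    for activations/KV cache must fit in VRAM. Inference only."""
    needed_gb = params_billions * bytes_per_param * overhead
    return needed_gb <= vram_gb

print(fits_in_vram(7))    # 7B in FP16: ~16.8GB -> True
print(fits_in_vram(13))   # 13B in FP16: ~31.2GB -> True (tight)
print(fits_in_vram(70))   # 70B in FP16: ~168GB -> False
```

This matches the page's 7B-13B guidance; quantizing to 8- or 4-bit weights (1 or 0.5 bytes per parameter) raises the ceiling accordingly.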

Is the AMD Instinct MI100 good value?
The AMD Instinct MI100 offers 32GB VRAM and 184.6 TFLOPS of FP16 performance at its price point. Compare it with similar GPUs using the comparison table above. Key factors: VRAM for model size, TFLOPS for speed, and price for budget.