Quick Insights
- Performance/Dollar: 3.83 FP16 TFLOPS per $1,000
- VRAM/Dollar: 2.7 GB per $1,000
- vs Data Center Average: -25% FP16 TFLOPS
- Cloud Availability: 2 providers, from $1.38/hr
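The two per-dollar figures above follow directly from the spec table and the $35k hardware price quoted in the Buy vs Rent section. A minimal sketch, using only values from this page (rounding to match the displayed figures is assumed):

```python
# Derive the Quick Insights ratios from the spec table and the $35k
# hardware price listed on this page.
FP16_TFLOPS = 134.0   # from the Specifications table
VRAM_GB = 94
PRICE_USD = 35_000    # hardware price from the Buy vs Rent section

perf_per_k = FP16_TFLOPS / (PRICE_USD / 1000)  # TFLOPS per $1,000
vram_per_k = VRAM_GB / (PRICE_USD / 1000)      # GB per $1,000

print(f"{perf_per_k:.2f} TFLOPS/$k")  # -> 3.83 TFLOPS/$k
print(f"{vram_per_k:.1f} GB/$k")      # -> 2.7 GB/$k
```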
Specifications
| Specification | Value |
|---|---|
| VRAM | 94GB HBM3 |
| Memory Bandwidth | 3.9 TB/s |
| FP16 TFLOPS | 134.0 |
| Tensor TFLOPS | 2.0k |
| FP32 TFLOPS | 67.0 |
| TDP | 400W |
| Form Factor | NVL |
| Architecture | Hopper |
| NVLink | Yes (900GB/s) |
| Release Date | 2023-03 |
Buy vs Rent Analysis

Buy Hardware: $35k
- One-time cost, unlimited usage
- Full control over hardware
- Electricity & cooling costs extra
- Depreciation over 2-3 years
Best if total usage exceeds 25,315 hours.

Rent Cloud GPU: $1.38/hr
- Pay only for what you use
- No upfront investment
- Scale up/down instantly
- No maintenance required
Best for under 25,315 hours or variable usage.
Breakeven Point: 25,315 hours of usage
At $1.38/hr cloud pricing, buying the hardware pays off after 25,315 hours (~1,055 days, or 35.2 months of 24/7 usage).
| Usage | Monthly Cloud Cost | Months to Breakeven |
|---|---|---|
| 100 hrs/month | $138.26 | 254 months |
| 200 hrs/month | $276.52 | 127 months |
| 500 hrs/month | $691.30 | 51 months |
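The breakeven figures can be reproduced from this page's numbers. The effective cloud rate of $1.3826/hr is inferred here from the $138.26 monthly cost at 100 hrs/month (the headline "$1.38/hr" appears to be that rate rounded to cents):

```python
import math

# Breakeven math behind the table above, using the $35k hardware price
# and the $1.3826/hr effective cloud rate implied by $138.26 / 100 hrs.
HW_PRICE = 35_000     # hardware price
CLOUD_RATE = 1.3826   # $/hr (inferred from the monthly-cost table)

breakeven_hours = HW_PRICE / CLOUD_RATE       # ~25,315 hours
months_of_24_7 = breakeven_hours / (24 * 30)  # ~35.2 months of 24/7 use

for hrs in (100, 200, 500):
    cost = hrs * CLOUD_RATE
    months = math.ceil(breakeven_hours / hrs)
    print(f"{hrs} hrs/month: ${cost:.2f}/month, {months} months to breakeven")
```

Months to breakeven are rounded up, since the hardware is not paid off until the breakeven hour count is actually reached.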
Cloud GPU Pricing
Rent the NVIDIA H100 NVL from 2 cloud providers. Prices shown are per GPU per hour.
| Provider | Type | Instance | GPUs | On-Demand | Per GPU | Spot | Availability |
|---|---|---|---|---|---|---|---|
| Vast.ai | marketplace | vastai-h100-nvl | 1x | $1.38/hr | $1.38/hr (cheapest) | $1.07/hr (-23%) | - |
| RunPod | gpu-cloud | NVIDIA H100 NVL | 1x | $3.07/hr | $3.07/hr | $1.65/hr (-46%) | - |
Best Spot Deal: RunPod offers the largest spot discount at $1.65/hr (46% off on-demand); note that Vast.ai's spot rate is lower in absolute terms at $1.07/hr.
H100 NVL vs Alternatives
Compare NVIDIA H100 NVL with similar GPUs from other brands.
| GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price |
|---|---|---|---|---|---|
| H100 NVL (this GPU) | 94GB | 134.0 | 3.9 TB/s | $35k | - |
| AMD Instinct MI210 | 64GB (-32%) | 181.0 (+35%) | 1.6 TB/s | - | - |
| AMD MI300 | 128GB (+36%) | 490.3 (+266%) | 5.3 TB/s | $15k (-57%) | - |
| AMD Instinct MI300A | 128GB (+36%) | 980.0 (+631%) | 5.3 TB/s | - | - |
| AMD Instinct MI250X | 128GB (+36%) | 383.0 (+186%) | 3.3 TB/s | - | - |
| AMD Instinct MI250 | 128GB (+36%) | 362.0 (+170%) | 3.3 TB/s | - | - |
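The percentage deltas in the comparison table are relative to the H100 NVL baseline (94GB VRAM, 134.0 FP16 TFLOPS). A short sketch reproducing a few of them, with spec values taken from the table:

```python
# Reproduce the percentage deltas in the comparison table, relative to
# the H100 NVL baseline (94GB VRAM, 134.0 FP16 TFLOPS).
BASE_VRAM, BASE_FP16 = 94, 134.0

alternatives = {  # name: (VRAM GB, FP16 TFLOPS), as listed on this page
    "AMD Instinct MI210": (64, 181.0),
    "AMD MI300": (128, 490.3),
    "AMD Instinct MI300A": (128, 980.0),
}

for name, (vram, fp16) in alternatives.items():
    vram_delta = round((vram / BASE_VRAM - 1) * 100)
    fp16_delta = round((fp16 / BASE_FP16 - 1) * 100)
    print(f"{name}: VRAM {vram_delta:+d}%, FP16 {fp16_delta:+d}%")
```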
Best Use Cases
No specific use case recommendations for NVIDIA H100 NVL yet.