Quick Insights

- Performance/Dollar: 3.64 TFLOPS/$k (FP16 performance per $1,000)
- VRAM/Dollar: 2.9 GB/$k (VRAM per $1,000)
- vs Data Center Average: -43% (FP16 TFLOPS comparison)
- Cloud Availability: 3 providers, from $2.39/hr
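The per-dollar figures above follow directly from the spec sheet and the $28k hardware price listed below. A minimal sketch of that arithmetic, assuming those two inputs:

```python
# Derive the "Quick Insights" per-dollar metrics from the page's own numbers.
HARDWARE_PRICE_USD = 28_000   # hardware price listed in Buy vs Rent
FP16_TFLOPS = 102.0           # from the Specifications table
VRAM_GB = 80                  # from the Specifications table

price_in_thousands = HARDWARE_PRICE_USD / 1000

perf_per_1k = FP16_TFLOPS / price_in_thousands  # TFLOPS per $1,000
vram_per_1k = VRAM_GB / price_in_thousands      # GB per $1,000

print(f"{perf_per_1k:.2f} TFLOPS/$k")  # -> 3.64 TFLOPS/$k
print(f"{vram_per_1k:.1f} GB/$k")      # -> 2.9 GB/$k
```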
Specifications

| Specification | Value |
|---|---|
| VRAM | 80GB HBM2e |
| Memory Bandwidth | 2.0 TB/s |
| FP16 TFLOPS | 102.0 |
| Tensor TFLOPS | 1.5k |
| FP32 TFLOPS | 51.0 |
| TDP | 350W |
| Form Factor | PCIe |
| Architecture | Hopper |
| NVLink | No |
| Release Date | 2022-09 |
Buy vs Rent Analysis
Buy Hardware: $28k

- One-time cost, unlimited usage
- Full control over hardware
- Electricity & cooling costs extra
- Depreciation over 2-3 years

Best if using more than 11,715 hours total.

Rent Cloud GPU: $2.39/hr

- Pay only for what you use
- No upfront investment
- Scale up/down instantly
- No maintenance required

Best for fewer than 11,715 hours or variable usage.
Breakeven Point: 11,715 hours of usage
At $2.39/hr cloud pricing, buying the hardware pays off after 11,715 hours (~488 days or 16.3 months of 24/7 usage).
| Usage | Monthly Cloud Cost | Months to Breakeven |
|---|---|---|
| 100 hrs/month | $239.00 | 118 months |
| 200 hrs/month | $478.00 | 59 months |
| 500 hrs/month | $1,195.00 | 24 months |
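The breakeven figure and the table above come from dividing the purchase price by the hourly cloud rate. A short sketch of that math, assuming the $28k price and $2.39/hr rate stated on this page:

```python
import math

HARDWARE_PRICE_USD = 28_000
CLOUD_RATE_USD_PER_HR = 2.39

# Hours at which cumulative cloud spend equals the purchase price.
breakeven_hours = HARDWARE_PRICE_USD / CLOUD_RATE_USD_PER_HR
print(round(breakeven_hours))  # -> 11715

# Months to breakeven at a few usage levels (rounded up to whole months).
for hrs_per_month in (100, 200, 500):
    monthly_cost = hrs_per_month * CLOUD_RATE_USD_PER_HR
    months = math.ceil(breakeven_hours / hrs_per_month)
    print(f"{hrs_per_month} hrs/month -> ${monthly_cost:,.2f}/mo, "
          f"{months} months to breakeven")
```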
Cloud GPU Pricing
Rent NVIDIA H100 PCIe from 3 cloud providers. Prices shown per GPU per hour.
| Provider | Type | Instance | GPUs | On-Demand | Per GPU | Spot | Availability |
|---|---|---|---|---|---|---|---|
| RunPod | gpu-cloud | NVIDIA H100 PCIe | 1x | $2.39/hr | $2.39/hr (cheapest) | $1.25/hr (-48%) | - |
| Lambda Labs | gpu-cloud | lambda-h100-pcie | 1x | $2.49/hr | $2.49/hr | - | - |
| CoreWeave | gpu-cloud | coreweave-h100-pcie | 1x | $4.25/hr | $4.25/hr | - | - |
Best Spot Deal: RunPod offers spot pricing at $1.25/hr (48% off on-demand).
H100 PCIe vs Alternatives
Compare NVIDIA H100 PCIe with similar GPUs from other brands.
| GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price |
|---|---|---|---|---|---|
| H100 PCIe (this GPU) | 80GB | 102.0 | 2.0 TB/s | $28k | - |
| AMD Instinct MI210 | 64GB (-20%) | 181.0 (+77%) | 1.6 TB/s | - | - |
| AMD Instinct MI300 | 128GB (+60%) | 490.3 (+381%) | 5.3 TB/s | $15k (-46%) | - |
| AMD Instinct MI300A | 128GB (+60%) | 980.0 (+861%) | 5.3 TB/s | - | - |
| AMD Instinct MI250X | 128GB (+60%) | 383.0 (+275%) | 3.3 TB/s | - | - |
| AMD Instinct MI250 | 128GB (+60%) | 362.0 (+255%) | 3.3 TB/s | - | - |
Best Use Cases
No specific use case recommendations for NVIDIA H100 PCIe yet.