
NVIDIA H100 NVL

Hopper Architecture · 94GB HBM3 · NVL

VRAM: 94GB · FP16: 134.0 TFLOPS · TDP: 400W
Hardware Price: $35k · Cloud from: $1.38/hr (2 providers; cheapest at Vast.ai)

Quick Insights

Performance/Dollar: 3.83 TFLOPS/$k (FP16 performance per $1,000 of hardware price)
VRAM/Dollar: 2.7 GB/$k (VRAM per $1,000 of hardware price)
vs Data Center Average: -25% FP16 performance
Cloud Availability: 2 providers, from $1.38/hr
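
The two efficiency figures above are simple ratios of the headline specs. A minimal sketch (the 2.7 GB/$k figure is 2.69 rounded):

```python
def per_thousand_dollars(metric: float, price_usd: float) -> float:
    """Express a spec per $1,000 of hardware price."""
    return metric / (price_usd / 1_000)

# H100 NVL headline figures from above
fp16_tflops, vram_gb, price_usd = 134.0, 94.0, 35_000

perf_per_k = per_thousand_dollars(fp16_tflops, price_usd)  # ≈ 3.83 TFLOPS/$k
vram_per_k = per_thousand_dollars(vram_gb, price_usd)      # ≈ 2.69 GB/$k (shown as 2.7)
```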

Specifications

VRAM: 94GB HBM3
Memory Bandwidth: 3.9 TB/s
FP16 TFLOPS: 134.0
Tensor TFLOPS: 2.0k
FP32 TFLOPS: 67.0
TDP: 400W
Form Factor: NVL
Architecture: Hopper
NVLink: Yes (900GB/s)
Release Date: 2023-03

Buy vs Rent Analysis

Buy Hardware: $35k
  • One-time cost, unlimited usage
  • Full control over hardware
  • Electricity & cooling costs extra
  • Depreciation over 2-3 years
  Best if total usage exceeds 25,315 hours.

Rent Cloud GPU: $1.38/hr
  • Pay only for what you use
  • No upfront investment
  • Scale up/down instantly
  • No maintenance required
  Best for under 25,315 hours or variable usage.

Breakeven Point: 25,315 hours of usage

At $1.38/hr cloud pricing, buying the hardware pays off after 25,315 hours (~1055 days or 35.2 months of 24/7 usage).

Usage           Monthly Cloud Cost   Months to Breakeven
100 hrs/month   $138.26              254 months
200 hrs/month   $276.52              127 months
500 hrs/month   $691.30              51 months
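
The months-to-breakeven column follows directly from the 25,315-hour breakeven figure, rounded up to whole months. A quick sketch (the table's monthly costs use the provider's unrounded hourly rate, so they differ slightly from 100 × $1.38):

```python
import math

breakeven_hours = 25_315  # breakeven figure from the analysis above

for monthly_hours in (100, 200, 500):
    months = math.ceil(breakeven_hours / monthly_hours)
    print(f"{monthly_hours} hrs/month -> breakeven in {months} months")
# 100 -> 254, 200 -> 127, 500 -> 51
```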

Cloud GPU Pricing

Rent NVIDIA H100 NVL from 2 cloud providers. Prices shown per GPU per hour.

Provider   Type          Instance          GPUs   On-Demand (per GPU)   Spot
Vast.ai    marketplace   vastai-h100-nvl   1x     $1.38/hr (cheapest)   $1.07/hr (-23%)
RunPod     gpu-cloud     NVIDIA H100 NVL   1x     $3.07/hr              $1.65/hr (-46%)

Best Spot Deal: RunPod offers the steepest spot discount at $1.65/hr (46% off its on-demand rate), though Vast.ai's $1.07/hr spot price is the cheapest absolute rate.
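
The "cheapest" and "best spot deal" labels fall out of a simple min/max over the table rows. A sketch using the prices above (the dict structure is illustrative only, not a real API):

```python
# Prices from the table above; structure is illustrative only.
providers = [
    {"name": "Vast.ai", "on_demand": 1.38, "spot": 1.07},
    {"name": "RunPod",  "on_demand": 3.07, "spot": 1.65},
]

cheapest = min(providers, key=lambda p: p["on_demand"])
best_spot = max(providers, key=lambda p: 1 - p["spot"] / p["on_demand"])

print(cheapest["name"])   # Vast.ai ($1.38/hr on-demand)
print(best_spot["name"])  # RunPod (46% spot discount vs Vast.ai's ~23%)
```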

H100 NVL vs Alternatives

Compare NVIDIA H100 NVL with similar GPUs from other brands.

GPU                      VRAM           FP16 TFLOPS     Bandwidth   Hardware Price   Cloud Price
H100 NVL (this GPU)      94GB           134.0           3.9 TB/s    $35k             -
AMD Instinct MI210       64GB (-32%)    181.0 (+35%)    1.6 TB/s    -                -
AMD MI300                128GB (+36%)   490.3 (+266%)   5.3 TB/s    $15k (-57%)      -
AMD Instinct MI300A      128GB (+36%)   980.0 (+631%)   5.3 TB/s    -                -
AMD Instinct MI250X      128GB (+36%)   383.0 (+186%)   3.3 TB/s    -                -
AMD Instinct MI250       128GB (+36%)   362.0 (+170%)   3.3 TB/s    -                -
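
The percentage deltas in the table are plain relative differences against the H100 NVL's own specs. A sketch:

```python
def pct_diff(value: float, baseline: float) -> int:
    """Signed percent difference vs the H100 NVL, rounded to whole percent."""
    return round((value / baseline - 1) * 100)

H100_VRAM, H100_FP16 = 94.0, 134.0

print(pct_diff(128, H100_VRAM))    # +36  (MI300-series VRAM)
print(pct_diff(64, H100_VRAM))     # -32  (MI210 VRAM)
print(pct_diff(490.3, H100_FP16))  # +266 (MI300 FP16 TFLOPS)
print(pct_diff(980.0, H100_FP16))  # +631 (MI300A FP16 TFLOPS)
```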

Best Use Cases

No specific use case recommendations for NVIDIA H100 NVL yet.



Frequently Asked Questions about H100 NVL

Q: How much does the NVIDIA H100 NVL cost?
A: The NVIDIA H100 NVL has a market price of approximately $35k; cloud rental starts at $1.38/hr. Prices vary by retailer, region, and availability.

Q: Is the NVIDIA H100 NVL good for AI/ML?
A: Yes, with 94GB of VRAM the NVIDIA H100 NVL is suitable for most AI/ML workloads. The largest language models may still require multiple GPUs or higher-VRAM accelerators such as AMD's MI300 series.

Q: Should I buy or rent an H100 NVL?
A: The breakeven point is approximately 25,315 hours of usage. Buy if you expect to exceed that; rent for shorter projects or variable workloads. Cloud rental from Vast.ai starts at $1.38/hr.

Q: What can the NVIDIA H100 NVL run?
A: With 94GB VRAM and 134.0 FP16 TFLOPS, the NVIDIA H100 NVL can run large language models (7B-13B with ample headroom; larger models fit with quantization), Stable Diffusion XL, video AI, and professional 3D rendering.
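
A rough way to check whether a model fits in the 94GB: weights take about bytes-per-parameter × parameter count, plus headroom for activations and KV cache. The 1.2 overhead multiplier below is an assumed illustrative value, not a measured one:

```python
def fits_in_vram(params_billion: float, vram_gb: float = 94.0,
                 bytes_per_param: float = 2.0, overhead: float = 1.2) -> bool:
    """Crude fit check: weights (≈ params × bytes/param) × overhead vs VRAM.

    bytes_per_param: 2.0 for FP16, 1.0 for INT8, 0.5 for 4-bit quantization.
    overhead: assumed headroom multiplier for activations / KV cache.
    """
    weights_gb = params_billion * bytes_per_param  # 1e9 params × N bytes ≈ N GB
    return weights_gb * overhead <= vram_gb

fits_in_vram(13)                       # True:  13B FP16 ≈ 31 GB
fits_in_vram(70)                       # False: 70B FP16 ≈ 168 GB
fits_in_vram(70, bytes_per_param=0.5)  # True:  70B 4-bit ≈ 42 GB
```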

Q: How does the H100 NVL compare with alternatives?
A: The NVIDIA H100 NVL offers 94GB VRAM and 134.0 FP16 TFLOPS at $35k. Compare with similar GPUs using the comparison table above; key factors are VRAM for model size, TFLOPS for speed, and price for budget.