H100 80GB

Hopper · 80GB · PCIe

VRAM
80GB
FP16
-
TDP
-
Hardware Price
-
Cloud from
$12.29/hr
1 provider

Quick Insights

Performance/Dollar
N/A
FP16 performance per $1000
VRAM/Dollar
N/A
VRAM per $1000
vs NVIDIA Average
N/A
FP16 TFLOPS comparison
Cloud Availability
1 provider
from $12.29/hr
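The per-$1000 metrics above are both shown as N/A because the page is missing FP16 and hardware-price data, but the way they are computed can be sketched as below. The input numbers are hypothetical placeholders, not H100 specs:

```python
def perf_per_kusd(fp16_tflops: float, price_usd: float) -> float:
    """FP16 TFLOPS delivered per $1000 of hardware price."""
    return fp16_tflops / (price_usd / 1000)

def vram_per_kusd(vram_gb: float, price_usd: float) -> float:
    """GB of VRAM per $1000 of hardware price."""
    return vram_gb / (price_usd / 1000)

# Hypothetical example (the page lists FP16 TFLOPS and price as "-"):
# a GPU with 100 TFLOPS FP16 and 80GB VRAM at a $10,000 street price.
print(perf_per_kusd(100, 10_000))  # 10.0 TFLOPS per $1000
print(vram_per_kusd(80, 10_000))   # 8.0 GB per $1000
```

Once real FP16 and price figures are available for this card, the N/A cells follow directly from these two ratios.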

Specifications

VRAM 80GB
Memory Bandwidth -
FP16 TFLOPS -
Tensor TFLOPS -
FP32 TFLOPS -
TDP -
Form Factor PCIe
Architecture Hopper
NVLink Yes
Release Date 2022

Cloud GPU Pricing

Rent the H100 80GB from 1 cloud provider. Prices shown per GPU per hour.

Provider: Amazon Web Services (hyperscaler)
Instance: p5.48xlarge (8× GPUs)
On-Demand: $98.32/hr ($12.29/hr per GPU, cheapest listed)
Spot: $57.76/hr (-41%)
Best Spot Deal: Amazon Web Services offers spot pricing at $57.76/hr (41% off on-demand).
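The per-GPU and discount figures in the AWS row above derive from the instance-level prices. A quick check of the arithmetic:

```python
# Instance-level prices from the AWS p5.48xlarge listing above.
on_demand_instance = 98.32  # $/hr for the 8-GPU instance
spot_instance = 57.76       # $/hr spot price for the same instance
gpus = 8

per_gpu_on_demand = on_demand_instance / gpus   # 12.29 $/GPU/hr
per_gpu_spot = spot_instance / gpus             # 7.22 $/GPU/hr
discount = 1 - spot_instance / on_demand_instance

print(f"${per_gpu_on_demand:.2f}/GPU/hr on-demand")
print(f"${per_gpu_spot:.2f}/GPU/hr spot")
print(f"{discount:.0%} off on-demand")  # prints "41% off on-demand"
```

Note the spot discount makes the effective per-GPU rate $7.22/hr, well below the $12.29/hr on-demand figure quoted at the top of the page.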

Best Use Cases

No specific use case recommendations for H100 80GB yet.


Compare H100 80GB

Other NVIDIA GPUs

Frequently Asked Questions about H100 80GB

How much does the H100 80GB cost?
Pricing for the H100 80GB varies. Check the cloud pricing section above for rental options starting at $12.29/hr per GPU.

Is the H100 80GB good for AI/ML workloads?
Yes, the H100 80GB with 80GB of VRAM is suitable for most AI/ML workloads. For the largest language models, you may need a multi-GPU configuration.

Should I buy an H100 80GB or rent it from the cloud?
Consider buying for long-term, heavy usage (more than roughly 4 hrs/day). Rent from cloud providers for short-term projects, experimentation, or when you need to scale quickly.
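The buy-vs-rent rule of thumb above is a breakeven calculation. Since the page lists the hardware price as "-", the purchase price below is a hypothetical placeholder; the cloud rate is the $12.29/hr per-GPU figure quoted above:

```python
def breakeven_hours(purchase_price: float, cloud_rate_per_hr: float) -> float:
    """Hours of cloud rental whose cost equals the purchase price."""
    return purchase_price / cloud_rate_per_hr

# Hypothetical $30,000 purchase price (page lists hardware price as "-").
hours = breakeven_hours(30_000, 12.29)
print(round(hours))                # ~2441 hours of rental = purchase price
print(round(hours / 4 / 365, 1))   # ~1.7 years at 4 hrs/day of use
```

At 4 hrs/day the breakeven sits well over a year out, which is why heavy sustained usage favors buying while bursty usage favors renting.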

What can the H100 80GB run?
With 80GB of VRAM, the H100 80GB can run large language models (13B comfortably in FP16, larger with quantization), Stable Diffusion XL, video AI, and professional 3D rendering.
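Whether a given model fits in 80GB comes down to a rough weights-plus-overhead estimate. The 20% overhead factor below is a loose rule of thumb for activations and KV cache, not a measured figure:

```python
def llm_vram_gb(params_billion: float, bytes_per_param: float = 2.0,
                overhead: float = 1.2) -> float:
    """Rough VRAM (GB) needed to serve a model for inference.

    bytes_per_param: 2.0 for FP16, 1.0 for INT8, 0.5 for 4-bit.
    overhead: loose ~20% allowance for activations and KV cache.
    """
    return params_billion * bytes_per_param * overhead

print(llm_vram_gb(13))       # ~31 GB: a 13B FP16 model fits in 80GB
print(llm_vram_gb(70))       # ~168 GB: 70B FP16 needs multiple GPUs
print(llm_vram_gb(70, 0.5))  # ~42 GB: 70B at 4-bit fits on one card
```

This is why the page's 7B-13B FP16 guidance is comfortable for a single 80GB card, while 70B-class models call for quantization or multi-GPU setups.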

Is the H100 80GB worth it?
The H100 80GB offers 80GB of VRAM and strong FP16 performance at its price point. Compare it with similar GPUs using the comparison tool above. Key factors: VRAM for model size, TFLOPS for speed, and price for budget.