
NVIDIA A100 40GB PCIe

Ampere Architecture · 40GB HBM2 · PCIe

VRAM: 40GB · FP16: 78.0 TFLOPS · TDP: 250W
Hardware Price: $8.0k (MSRP: $10k) · Cloud from $0.720/hr (6 providers)

Quick Insights

Performance/Dollar: 9.75 TFLOPS/$k (FP16 TFLOPS per $1,000 of hardware price)
VRAM/Dollar: 5.0 GB/$k (VRAM per $1,000 of hardware price)
vs Data Center Average: -57% FP16 TFLOPS
Cloud Availability: 6 providers, from $0.720/hr
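The per-dollar ratios above follow directly from the headline figures on this page. A minimal sketch of the arithmetic (the variable names are illustrative, not from any pricing API):

```python
# Figures from this page: $8.0k hardware price, 78.0 FP16 TFLOPS, 40GB VRAM.
hw_price_k = 8.0      # hardware price in thousands of dollars
fp16_tflops = 78.0
vram_gb = 40

perf_per_dollar = fp16_tflops / hw_price_k  # TFLOPS per $1,000
vram_per_dollar = vram_gb / hw_price_k      # GB per $1,000

print(f"{perf_per_dollar:.2f} TFLOPS/$k, {vram_per_dollar:.1f} GB/$k")
```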

Specifications

VRAM 40GB HBM2
Memory Bandwidth 1.6 TB/s
FP16 TFLOPS 78.0
Tensor TFLOPS 312.0
FP32 TFLOPS 19.5
TDP 250W
Form Factor PCIe
Architecture Ampere
NVLink Yes (bridge, 2 GPUs)
Release Date 2020-05

Buy vs Rent Analysis

Buy Hardware
$8.0k
  • One-time cost, unlimited usage
  • Full control over hardware
  • Electricity & cooling costs extra
  • Depreciation over 2-3 years
Best if using more than 11,111 hours total
Rent Cloud GPU
$0.720/hr
  • Pay only for what you use
  • No upfront investment
  • Scale up/down instantly
  • No maintenance required
Best for fewer than 11,111 hours or variable usage
Breakeven Point
11,111
hours of usage

At $0.720/hr cloud pricing, buying the hardware pays off after 11,111 hours (~463 days or 15.4 months of 24/7 usage).

Usage         | Monthly Cloud Cost | Months to Breakeven
100 hrs/month | $72.00             | 112 months
200 hrs/month | $144.00            | 56 months
500 hrs/month | $360.00            | 23 months
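The breakeven figures above can be reproduced from just two inputs, the $8.0k hardware price and the $0.720/hr cheapest cloud rate. A sketch of that calculation (months to breakeven are rounded up to the next whole month, which matches the table):

```python
import math

# Inputs from this page.
hw_cost = 8000.0   # buy price, $
cloud_rate = 0.72  # cheapest on-demand cloud rate, $/hr

breakeven_hours = hw_cost / cloud_rate  # ~11,111 hours
days_24_7 = breakeven_hours / 24        # ~463 days of continuous usage

for hrs_per_month in (100, 200, 500):
    monthly_cost = hrs_per_month * cloud_rate
    months = math.ceil(breakeven_hours / hrs_per_month)
    print(f"{hrs_per_month} hrs/month: ${monthly_cost:.2f}/month, "
          f"{months} months to breakeven")
```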

Cloud GPU Pricing

Rent NVIDIA A100 40GB PCIe from 6 cloud providers. Prices shown per GPU per hour.

Provider              | Type        | Instance                  | GPUs | On-Demand (per GPU)  | Spot
Datacrunch (Verda)    | gpu-cloud   | datacrunch-a100-40gb-pcie | 1x   | $0.720/hr (cheapest) | -
Oracle Cloud          | hyperscaler | oracle-gpu-a100           | 1x   | $1.00/hr             | -
CoreWeave             | gpu-cloud   | coreweave-a100-40gb-pcie  | 1x   | $1.21/hr             | -
Lambda Labs           | gpu-cloud   | lambda-a100-40gb-pcie     | 1x   | $1.29/hr             | -
Fluidstack            | gpu-cloud   | fluidstack-a100-40gb-pcie | 1x   | $1.80/hr             | -
Google Cloud Platform | hyperscaler | gcp-a100                  | 1x   | $2.93/hr             | $1.15/hr (-61%)
Best Spot Deal: Google Cloud Platform offers spot pricing at $1.15/hr (61% off on-demand).
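The -61% figure quoted for the GCP spot deal is the discount of the spot rate relative to on-demand. A quick sketch of that calculation, using the two rates from the table:

```python
# GCP rates from the pricing table above.
on_demand = 2.93  # $/hr
spot = 1.15       # $/hr

# Discount of spot relative to on-demand, as a whole percentage.
discount_pct = round((1 - spot / on_demand) * 100)
print(f"Spot discount: {discount_pct}%")  # 61%
```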

A100 40GB vs Alternatives

Compare NVIDIA A100 40GB PCIe with similar GPUs from other brands.

GPU                          | VRAM        | FP16 TFLOPS   | Bandwidth | Hardware Price | Cloud Price
NVIDIA A100 40GB (this page) | 40GB        | 78.0          | 1.6 TB/s  | $8.0k          | -
AMD Instinct MI100           | 32GB (-20%) | 184.6 (+137%) | 1.2 TB/s  | -              | -
AMD Radeon RX 7900 XTX       | 24GB (-40%) | 122.0 (+56%)  | 960 GB/s  | -              | -
AMD Radeon RX 7900 XT        | 20GB (-50%) | 104.0 (+33%)  | 800 GB/s  | -              | -
AMD Instinct MI210           | 64GB (+60%) | 181.0 (+132%) | 1.6 TB/s  | -              | -
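The percentage deltas in the comparison table are each alternative's spec relative to the A100 40GB baseline (40GB VRAM, 78.0 FP16 TFLOPS). A sketch of how they are derived, using two rows from the table as examples:

```python
# Baseline: A100 40GB specs from this page.
a100 = {"vram_gb": 40, "fp16_tflops": 78.0}

# Two rivals from the comparison table.
rivals = {
    "AMD Instinct MI100": {"vram_gb": 32, "fp16_tflops": 184.6},
    "AMD Instinct MI210": {"vram_gb": 64, "fp16_tflops": 181.0},
}

def delta_pct(value, baseline):
    """Percentage difference vs the baseline, rounded to a whole percent."""
    return round((value - baseline) / baseline * 100)

for name, spec in rivals.items():
    print(name,
          f"VRAM {delta_pct(spec['vram_gb'], a100['vram_gb']):+d}%",
          f"FP16 {delta_pct(spec['fp16_tflops'], a100['fp16_tflops']):+d}%")
```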

Best Use Cases

No specific use case recommendations for NVIDIA A100 40GB PCIe yet.


Frequently Asked Questions about A100 40GB

How much does the NVIDIA A100 40GB PCIe cost?
The NVIDIA A100 40GB PCIe has a market price of approximately $8.0k (MSRP: $10k). Cloud rental starts at $0.720/hr. Prices vary by retailer, region, and availability.

Is the A100 40GB good for AI and machine learning?
Yes, the NVIDIA A100 40GB PCIe with 40GB of VRAM is suitable for many AI/ML workloads. For large language models, you may need multiple GPUs or a higher-VRAM option such as the A100 80GB or H100.

Should I buy or rent the A100 40GB?
The breakeven point is approximately 11,111 hours of usage. Buy if you expect to use it longer than that; rent for shorter projects or variable workloads. Cloud rental starts at $0.720/hr from Datacrunch (Verda).

What workloads can the A100 40GB run?
With 40GB VRAM and 78.0 FP16 TFLOPS, the NVIDIA A100 40GB PCIe can run large language models (7B-13B parameters), Stable Diffusion XL, video AI models, and professional 3D rendering workloads.

How does the A100 40GB compare with similar GPUs?
The NVIDIA A100 40GB PCIe offers 40GB VRAM and 78.0 FP16 TFLOPS at $8.0k. Compare it with similar GPUs using the comparison tool above; the key factors are VRAM for model size, TFLOPS for speed, and price for budget.