NVIDIA A100 40GB SXM

Ampere Architecture · 40GB HBM2 · SXM

VRAM
40GB
FP16
312.0 TFLOPS
TDP
400W
Hardware Price
-
MSRP: $10k
Cloud from
$1.29/hr
1 provider
Cheapest at Lambda Labs →

Quick Insights

Performance/Dollar
N/A
FP16 performance per $1000
VRAM/Dollar
N/A
VRAM per $1000
vs Data Center Average
+74% perf
FP16 TFLOPS comparison
Cloud Availability
1 provider
from $1.29/hr
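
The Performance/Dollar and VRAM/Dollar figures above show N/A because they key off a tracked street price, which is missing. Using the listed $10k MSRP instead, a quick sketch of how these ratios would be computed (MSRP-based numbers are illustrative, not site data):

```python
# Illustrative perf/$ and VRAM/$ math using the $10k MSRP listed above.
# The page shows N/A because it uses tracked street price, not MSRP.
MSRP_USD = 10_000        # listed MSRP (assumed exact)
FP16_TFLOPS = 312.0      # from the spec table
VRAM_GB = 40

perf_per_1k = FP16_TFLOPS / (MSRP_USD / 1000)   # TFLOPS per $1000
vram_per_1k = VRAM_GB / (MSRP_USD / 1000)       # GB per $1000

print(f"{perf_per_1k:.1f} TFLOPS/$1000")  # 31.2 TFLOPS/$1000
print(f"{vram_per_1k:.1f} GB/$1000")      # 4.0 GB/$1000
```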

Specifications

VRAM 40GB HBM2
Memory Bandwidth 1.6 TB/s
FP16 TFLOPS 312.0
Tensor TFLOPS 624.0
FP32 TFLOPS 19.5
TDP 400W
Form Factor SXM4
Architecture Ampere
NVLink Yes (3rd-gen, 600 GB/s)
Release Date 2020-05-14
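
A useful number derived from the specs above is the compute-to-bandwidth ratio: how many FP16 operations a kernel must perform per byte fetched before it stops being memory-bound. A rough roofline sketch (derived from the table, not site data):

```python
# Rough roofline balance point from the spec table above.
fp16_flops = 312.0e12   # FP16 tensor throughput, FLOP/s
mem_bw = 1.6e12         # memory bandwidth, bytes/s (1.6 TB/s)

balance = fp16_flops / mem_bw  # FLOPs per byte at the ridge point
print(f"~{balance:.0f} FLOPs/byte")  # kernels below this are bandwidth-bound
```

Low-arithmetic-intensity workloads (e.g. LLM token generation) sit far below this ridge, which is why memory bandwidth matters as much as TFLOPS for inference.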

Cloud GPU Pricing

Rent the NVIDIA A100 40GB SXM from 1 cloud provider. Prices shown per GPU per hour.

Provider | Type | Instance | GPUs | On-Demand | Per GPU | Spot | Availability
Lambda Labs | gpu-cloud | lambda-a100-40gb-sxm | 1x | $1.29/hr | $1.29/hr (cheapest) | - | -

NVIDIA A100 40GB SXM vs Alternatives

Compare NVIDIA A100 40GB SXM with similar GPUs from other brands.

GPU | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price
NVIDIA A100 40GB SXM (current) | 40GB | 312.0 | 1.6 TB/s | - | -
AMD Instinct MI100 | 32GB (-20%) | 184.6 (-41%) | 1.2 TB/s | - | -
AMD Radeon RX 7900 XTX | 24GB (-40%) | 122.0 (-61%) | 960 GB/s | - | -
AMD Radeon RX 7900 XT | 20GB (-50%) | 104.0 (-67%) | 800 GB/s | - | -
AMD Instinct MI210 | 64GB (+60%) | 181.0 (-42%) | 1.6 TB/s | - | -

Best Use Cases

No specific use case recommendations for NVIDIA A100 40GB SXM yet.

Browse All Use Cases →


Frequently Asked Questions about NVIDIA A100 40GB SXM

How much does the NVIDIA A100 40GB SXM cost?

Pricing for the NVIDIA A100 40GB SXM varies. Check our cloud pricing section for rental options starting at $1.29/hr.

Is 40GB VRAM enough for AI/ML workloads?

Yes, the NVIDIA A100 40GB SXM with 40GB VRAM is suitable for many AI/ML workloads. For large language models, you may need multiple GPUs, or consider higher-VRAM options like the A100 80GB or H100.

Should I buy or rent the NVIDIA A100 40GB SXM?

Consider buying for long-term, heavy usage (>4 hrs/day). Rent from cloud providers for short-term projects, experimentation, or when you need to scale quickly.
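
One way to sanity-check the buy-vs-rent threshold is to divide the $10k MSRP by the $1.29/hr cloud rate to get breakeven GPU-hours. A simplified sketch that ignores power, hosting, and resale value:

```python
# Simplified buy-vs-rent breakeven, ignoring power, hosting, and resale.
hardware_cost = 10_000.0   # MSRP listed above
cloud_rate = 1.29          # $/hr at the cheapest listed provider

breakeven_hours = hardware_cost / cloud_rate
years_at_4h_per_day = breakeven_hours / (4 * 365)

print(f"breakeven: {breakeven_hours:.0f} GPU-hours")
print(f"at 4 hrs/day: {years_at_4h_per_day:.1f} years")
```

At 4 hrs/day the payback period runs to several years, so the ">4 hrs/day" guideline really only favors buying at sustained, near-continuous utilization or lower purchase prices.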

What can the NVIDIA A100 40GB SXM run?

With 40GB VRAM and 312.0 FP16 TFLOPS, the NVIDIA A100 40GB SXM can run large language models (7B-13B), Stable Diffusion XL, video AI, and professional 3D rendering.
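
The 7B-13B figure follows from the usual rule of thumb of 2 bytes per parameter for FP16 weights, plus overhead for KV cache and activations. A rough estimate (the 20% overhead factor is an assumption and varies by workload):

```python
# Rough FP16 inference memory check against 40GB VRAM.
# Rule of thumb: 2 bytes per parameter for weights, plus ~20%
# overhead for KV cache and activations (assumed, workload-dependent).
VRAM_GB = 40

def fits(params_billions: float, overhead: float = 1.2) -> bool:
    weights_gb = params_billions * 2  # 2 bytes per FP16 parameter
    return weights_gb * overhead <= VRAM_GB

print(fits(13))   # 13B: 26GB weights * 1.2 = 31.2GB -> True
print(fits(30))   # 30B: 60GB weights * 1.2 = 72GB  -> False
```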

Is the NVIDIA A100 40GB SXM a good value?

The NVIDIA A100 40GB SXM offers 40GB VRAM and 312.0 FP16 TFLOPS at its price point. Compare it with similar GPUs using our comparison tool above. Key factors: VRAM for model size, TFLOPS for speed, and price for budget.