GPU Prices 2026

Compare hardware prices for 73 GPUs from NVIDIA and AMD. Find MSRP and current market prices for data center GPUs (H100, A100, H200) and consumer GPUs (RTX 4090, RTX 4080).

Updated Apr 23 · 73 GPUs

Price Insights

💰 Best Value ($/TFLOPS): RTX 4080 ($900 · 97.5 TFLOPS)
🏢 Cheapest Data Center: V100 ($2.5k · 32GB)
🧠 Best $/GB VRAM: RTX 3090 ($33/GB · 24GB)
📊 Most VRAM: H200 (141GB HBM3e)
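The value metrics above are simple ratios over the table data. A minimal sketch in Python, using market prices and specs copied from this page (a small illustrative subset, not the full 73-GPU dataset):

```python
# Reproduce the Price Insights value metrics from table data.
# Prices and specs are taken from the consumer GPU table on this page.
gpus = {
    "RTX 4080": {"price": 900, "tflops": 97.5, "vram_gb": 16},
    "RTX 3090": {"price": 800, "tflops": 71.2, "vram_gb": 24},
    "RTX 4090": {"price": 1800, "tflops": 165.2, "vram_gb": 24},
}

def dollars_per_tflops(g):
    # Lower is better: price paid per unit of compute.
    return g["price"] / g["tflops"]

def dollars_per_gb(g):
    # Lower is better: price paid per GB of VRAM.
    return g["price"] / g["vram_gb"]

best_value = min(gpus, key=lambda n: dollars_per_tflops(gpus[n]))
best_vram = min(gpus, key=lambda n: dollars_per_gb(gpus[n]))

print(best_value, round(dollars_per_tflops(gpus[best_value]), 2))  # RTX 4080 9.23
print(best_vram, round(dollars_per_gb(gpus[best_vram])))           # RTX 3090 33
```

The RTX 4080 wins on $/TFLOPS (~$9.23) and the RTX 3090 on $/GB VRAM ($33/GB), matching the insight cards above.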

GPU by Budget

🎮 Under $1,000 · 3 GPUs

Entry-level AI, gaming, Stable Diffusion

⚡ $1,000 - $5,000 · 6 GPUs

Serious AI development, fine-tuning

🚀 $5,000 - $15,000 · 4 GPUs

Professional AI training, multi-GPU

🏢 $15,000+ · 8 GPUs

Enterprise, large model training

Data Center GPUs 22 GPUs

For AI training, inference, and enterprise workloads

| GPU | Form Factor | Architecture | VRAM | Bandwidth | TFLOPS | TDP | MSRP | Market Price |
|---|---|---|---|---|---|---|---|---|
| B200 (Popular) | SXM | Blackwell | 192GB HBM3e | 8.0 TB/s | 4.5k | 1000W | $40k | $45k (+13%) |
| H200 (Popular) | SXM | Hopper | 141GB HBM3e | 4.8 TB/s | 2.0k | 700W | $35k | $38k |
| MI355X (Popular) | OAM | CDNA 4 | 288GB HBM3e | 8.0 TB/s | - | 500W | - | $25k |
| H100 SXM (Popular) | SXM | Hopper | 80GB HBM3 | 3.4 TB/s | 2.0k | 700W | $30k | $32k |
| H100 PCIe (Popular) | PCIe | Hopper | 80GB HBM2e | 2.0 TB/s | 1.5k | 350W | $25k | $28k (+12%) |
| H100 NVL | NVL | Hopper | 94GB HBM3 | 3.9 TB/s | 2.0k | 400W | - | $35k |
| MI300X (Popular) | OAM | CDNA 3 | 192GB HBM3 | 5.3 TB/s | 653.7 | 750W | $15k | $18k (+20%) |
| MI300 | OAM | CDNA 3 | 128GB HBM3 | 5.3 TB/s | 490.3 | 750W | - | $15k |
| A100 80GB (Popular) | SXM | Ampere | 80GB HBM2e | 2.0 TB/s | 312.0 | 400W | $15k | $12k (-20%) |
| A100 40GB | PCIe | Ampere | 40GB HBM2e | 1.6 TB/s | 312.0 | 250W | $10k | $8.0k (-20%) |
| L40S (Popular) | PCIe | Ada Lovelace | 48GB GDDR6 | 864 GB/s | 733.0 | 350W | $8.0k | $9.0k (+13%) |
| L4 | PCIe | Ada Lovelace | 24GB GDDR6 | 300 GB/s | 242.0 | 72W | $2.5k | $2.8k (+12%) |
| A40 | PCIe | Ampere | 48GB GDDR6 | 696 GB/s | 150.0 | 300W | $5.0k | $4.0k (-20%) |
| V100 | SXM | Volta | 32GB HBM2 | 900 GB/s | 125.0 | 300W | - | $2.5k |
| AMD Instinct MI300A | - | CDNA 3 | 128GB HBM3 | 5.3 TB/s | 2.0k | 760W | $12k | - |
| AMD Instinct MI250X | - | CDNA 2 | 128GB HBM2e | 3.3 TB/s | 766.0 | 560W | $12k | - |
| AMD Instinct MI250 | - | CDNA 2 | 128GB HBM2e | 3.3 TB/s | 724.0 | 500W | $10k | - |
| AMD Instinct MI210 | - | CDNA 2 | 64GB HBM2e | 1.6 TB/s | 362.0 | 300W | $6.0k | - |
| AMD Instinct MI100 | - | CDNA | 32GB HBM2 | 1.2 TB/s | 184.6 | 300W | $5.0k | - |
| NVIDIA A100 80GB PCIe | PCIe | Ampere | 80GB HBM2e | 2.0 TB/s | 624.0 | 300W | $11k | - |
| NVIDIA A100 40GB SXM | SXM | Ampere | 40GB HBM2 | 1.6 TB/s | 624.0 | 400W | $10k | - |
| NVIDIA V100 16GB | - | Volta | 16GB HBM2 | 900 GB/s | 125.0 | 300W | $6.0k | - |

Consumer GPUs 11 GPUs

For gaming, content creation, and personal AI projects

| GPU | Form Factor | Architecture | VRAM | Bandwidth | TFLOPS | TDP | MSRP | Market Price |
|---|---|---|---|---|---|---|---|---|
| RTX 4090 (Popular) | PCIe | Ada Lovelace | 24GB GDDR6X | 1.0 TB/s | 165.2 | 450W | $1.6k | $1.8k (+13%) |
| RTX 4080 Super (Popular) | PCIe | Ada Lovelace | 16GB GDDR6X | 736 GB/s | 104.4 | 320W | $999 | $1.1k |
| RTX 4080 | PCIe | Ada Lovelace | 16GB GDDR6X | 717 GB/s | 97.5 | 320W | $1.2k | $900 (-25%) |
| RTX 3090 | PCIe | Ampere | 24GB GDDR6X | 936 GB/s | 71.2 | 350W | $1.5k | $800 (-47%) |
| AMD Radeon RX 7900 XTX | PCIe | RDNA 3 | 24GB GDDR6 | 960 GB/s | 122.0 | 355W | $999 | - |
| AMD Radeon RX 7900 XT | PCIe | RDNA 3 | 20GB GDDR6 | 800 GB/s | 104.0 | 315W | $899 | - |
| NVIDIA GeForce RTX 4070 Ti Super | PCIe | Ada Lovelace | 16GB GDDR6X | 672 GB/s | 353.0 | 285W | $799 | - |
| NVIDIA GeForce RTX 4070 Ti | PCIe | Ada Lovelace | 12GB GDDR6X | 504 GB/s | 321.0 | 285W | $799 | - |
| NVIDIA GeForce RTX 3090 Ti | PCIe | Ampere | 24GB GDDR6X | 1.0 TB/s | 320.0 | 450W | $2.0k | - |
| NVIDIA GeForce RTX 3080 Ti | PCIe | Ampere | 12GB GDDR6X | 912 GB/s | 273.0 | 350W | $1.2k | - |
| NVIDIA GeForce RTX 3080 | PCIe | Ampere | 10GB GDDR6X | 760 GB/s | 238.0 | 320W | $699 | - |

Workstation GPUs 3 GPUs

For professional visualization and rendering

| GPU | Form Factor | Architecture | VRAM | Bandwidth | TFLOPS | TDP | MSRP | Market Price |
|---|---|---|---|---|---|---|---|---|
| RTX 6000 Ada | PCIe | Ada Lovelace | 48GB GDDR6 | 960 GB/s | 182.2 | 300W | $6.8k | $7.0k |
| RTX A6000 | PCIe | Ampere | 48GB GDDR6 | 768 GB/s | 77.4 | 300W | $4.7k | $3.5k (-25%) |
| RTX A4000 | PCIe | Ampere | 16GB GDDR6 | 448 GB/s | 38.4 | 140W | $1.0k | $900 |

GPU Buying Guide

For AI Training

Large model training requires high VRAM (80GB+) and fast interconnects. Good options from the tables above:

  • H100 / H200 - The current standard for large-model training
  • A100 80GB - Proven Ampere-generation workhorse at a lower price
  • MI300X - 192GB VRAM for very large models

For AI Inference

Inference prioritizes throughput and cost efficiency. Good options:

  • L40S - Optimized for inference
  • A10 - Cost-effective choice
  • RTX 4090 - Best consumer option

For Stable Diffusion

Image generation needs 12GB+ VRAM. Budget-friendly picks from the tables above:

  • RTX 3090 - 24GB VRAM at ~$800 market price
  • RTX 4070 Ti - 12GB, solid mid-range option
  • RTX 4080 - 16GB with strong performance

For Gaming + AI

Want gaming and AI capabilities? Consumer GPUs work great:

  • RTX 4090 - Top-end gaming plus 24GB for local AI
  • RTX 4080 Super - Strong gaming at $999 MSRP
  • RX 7900 XTX - AMD alternative with 24GB VRAM

GPU Pricing FAQ

Why are GPU prices so high?

GPU prices remain elevated due to strong AI demand, limited supply of advanced chips (especially HBM memory), and the complexity of manufacturing cutting-edge GPUs. Data center GPUs like the H100 command premium prices because they're essential for training large AI models.

What's the difference between MSRP and market price?

MSRP (Manufacturer's Suggested Retail Price) is the official price set by NVIDIA or AMD. Market price reflects actual selling prices, which can be higher due to demand or lower for older models. For in-demand GPUs like the H100, market prices often exceed MSRP significantly.
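The percentage deltas shown in the tables above are just the market price's relative gap from MSRP. A quick sketch, using the H100 PCIe and A100 80GB figures from the data center table:

```python
def market_premium(msrp, market):
    # Percent gap of market price vs. MSRP, rounded to a whole percent.
    # Positive = selling above MSRP; negative = discounted.
    return round((market - msrp) / msrp * 100)

print(market_premium(25_000, 28_000))  # H100 PCIe: 12  (shown as +12%)
print(market_premium(15_000, 12_000))  # A100 80GB: -20 (shown as -20%)
```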

Should I buy a GPU or rent one in the cloud?

It depends on your usage. If you need GPUs for fewer than ~1,000 hours per year, cloud rental is usually cheaper. For continuous workloads, buying makes sense. For example: an H100 costs ~$30,000 but rents for ~$3/hour, so break-even is around 10,000 hours of usage.
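The break-even arithmetic in that answer is straightforward. A small sketch using the H100 figures quoted above (this ignores power, hosting, and resale value):

```python
def breakeven_hours(purchase_price, rental_rate_per_hour):
    # Hours of use at which buying costs the same as renting.
    # Ignores power, hosting, maintenance, and resale value.
    return purchase_price / rental_rate_per_hour

hours = breakeven_hours(30_000, 3.0)  # H100: ~$30k to buy, ~$3/hr to rent
print(hours)                # 10000.0
print(hours / (365 * 24))   # about 1.14 years of continuous use
```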

Which GPU offers the best value for AI?

For AI workloads, the RTX 4090 offers excellent value at ~$1,600 with 24GB VRAM. For data center use, the A100 40GB provides good performance per dollar. The "best" choice depends on your specific workload and VRAM requirements.

When do GPU prices drop?

GPU prices typically drop when new generations launch. NVIDIA's Blackwell GPUs launched in 2025, which has started to reduce Hopper (H100) prices. Consumer GPUs often see discounts during Black Friday and holiday sales.

Are used GPUs worth buying?

Used data center GPUs can offer significant savings (30-50% off). However, verify the GPU's history: cards used for mining may have a reduced lifespan. For enterprise use, consider certified refurbished options with a warranty.