A10

Architecture · 24GB · PCIe

VRAM
24GB
FP16
-
TDP
-
Hardware Price
-
Cloud from
$0.454/hr
3 providers, 8 instance types

Quick Insights

Performance/Dollar
N/A
FP16 performance per $1000
VRAM/Dollar
N/A
VRAM per $1000
vs Peer Average
N/A
FP16 TFLOPS comparison
Cloud Availability
3 providers, 8 instance types
from $0.454/hr

Specifications

VRAM 24GB
Memory Bandwidth -
FP16 TFLOPS -
Tensor TFLOPS -
FP32 TFLOPS -
TDP -
Form Factor -
Architecture -
NVLink No
Release Date -

Cloud GPU Pricing

Rent the A10 from 3 cloud providers across 8 instance types. Prices shown per GPU per hour.

Provider         Type         Instance                   GPUs  On-Demand  Per GPU     Spot  Availability
Microsoft Azure  hyperscaler  Standard_NV6ads_A10_v5     1x    $0.454/hr  $0.454/hr*  -     -
Lambda Labs      gpu-cloud    lambda-a10                 1x    $0.750/hr  $0.750/hr   -     -
Microsoft Azure  hyperscaler  Standard_NV12ads_A10_v5    1x    $0.908/hr  $0.908/hr   -     -
Microsoft Azure  hyperscaler  Standard_NV18ads_A10_v5    1x    $1.60/hr   $1.60/hr    -     -
Microsoft Azure  hyperscaler  Standard_NV36ads_A10_v5    1x    $3.20/hr   $3.20/hr    -     -
Microsoft Azure  hyperscaler  Standard_NV72ads_A10_v5    2x    $6.52/hr   $3.26/hr    -     -
Microsoft Azure  hyperscaler  Standard_NV36adms_A10_v5   1x    $4.52/hr   $4.52/hr    -     -
Oracle Cloud     hyperscaler  oracle-gpu-a10             1x    $16.00/hr  $16.00/hr   -     -

* Cheapest per-GPU rate.
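The per-GPU column is simply the on-demand instance price divided by GPU count. A minimal sketch of finding the cheapest effective rate, using a subset of the offers from the table above (instance names shortened):

```python
# Offers copied from the pricing table above: (instance, on-demand $/hr, GPU count).
offers = [
    ("Azure Standard_NV6ads_A10_v5", 0.454, 1),
    ("Lambda Labs lambda-a10", 0.750, 1),
    ("Azure Standard_NV72ads_A10_v5", 6.52, 2),
    ("Oracle oracle-gpu-a10", 16.00, 1),
]

def cheapest_per_gpu(offers):
    """Return (instance, per-GPU rate) with the lowest effective hourly cost."""
    return min(((name, price / gpus) for name, price, gpus in offers),
               key=lambda o: o[1])

name, rate = cheapest_per_gpu(offers)
print(f"{name}: ${rate:.3f}/GPU/hr")
```

Note that multi-GPU instances like the 2x NV72ads can still be worth comparing this way, since their per-GPU rate differs from the headline price.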

A10 vs Alternatives

Compare A10 with similar GPUs from other brands.

GPU                     VRAM         FP16 TFLOPS  Bandwidth  Hardware Price  Cloud Price
A10 (this page)         24GB         -            -          -               -
AMD Radeon RX 7900 XTX  24GB (+0%)   122.0        960 GB/s   -               -
AMD Radeon RX 7900 XT   20GB (-17%)  104.0        800 GB/s   -               -
AMD Instinct MI100      32GB (+33%)  184.6        1.2 TB/s   -               -
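The VRAM deltas in the comparison table are percentages relative to the A10's 24GB, rounded to whole numbers. A quick sketch reproducing them:

```python
# VRAM deltas are computed relative to the A10's 24GB and rounded,
# matching the (+0%) / (-17%) / (+33%) annotations in the table above.
A10_VRAM_GB = 24

def vram_delta_pct(other_gb, base_gb=A10_VRAM_GB):
    """Rounded percentage VRAM difference vs. the A10."""
    return round((other_gb - base_gb) / base_gb * 100)

for name, gb in [("RX 7900 XTX", 24), ("RX 7900 XT", 20), ("Instinct MI100", 32)]:
    print(f"{name}: {vram_delta_pct(gb):+d}%")
```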

Best Use Cases

No specific use case recommendations for A10 yet.



Frequently Asked Questions about A10

How much does the A10 cost?
Pricing for the A10 varies. Check the cloud pricing section above for rental options starting at $0.454/hr.

Is the A10 suitable for AI/ML workloads?
Yes, the A10 with 24GB VRAM is suitable for many AI/ML workloads. For large language models, you may need multiple GPUs, or consider higher-VRAM options such as the A100 or H100.

Should I buy an A10 or rent it in the cloud?
Consider buying for long-term, heavy usage (more than about 4 hrs/day). Rent from cloud providers for short-term projects, experimentation, or when you need to scale quickly.
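The buy-vs-rent decision comes down to a break-even point: purchase price divided by the hourly rental rate. A minimal sketch, assuming a hypothetical purchase price since this page lists no hardware price for the A10:

```python
# Buy-vs-rent break-even sketch. The hardware price below is an ASSUMED
# placeholder -- this page lists no purchase price for the A10.
CLOUD_RATE = 0.454           # $/hr, cheapest on-demand rate listed above
ASSUMED_HW_PRICE = 3000.0    # $, hypothetical purchase price

def breakeven_hours(hw_price, hourly_rate):
    """GPU-hours of rental after which buying would have cost less."""
    return hw_price / hourly_rate

hours = breakeven_hours(ASSUMED_HW_PRICE, CLOUD_RATE)
print(f"Break-even after ~{hours:,.0f} GPU-hours "
      f"(~{hours / (4 * 365):.1f} years at 4 hrs/day)")
```

This ignores power, cooling, and depreciation on the buy side and egress or storage fees on the rent side, so treat it as a first-order estimate only.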

What can the A10 run?
With 24GB of VRAM, the A10 can run large language models (7B-13B parameters), Stable Diffusion XL, video AI workloads, and professional 3D rendering.
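Whether a model fits in 24GB can be roughed out with the common rule of thumb of bytes-per-parameter times parameter count. A sketch under that assumption (it ignores KV cache and activation overhead, so real headroom is smaller, and a 13B model typically needs 8-bit quantization to fit):

```python
# Rule-of-thumb weight-memory estimate: params * bytes-per-param,
# ignoring KV cache and activations (an assumption, not a benchmark).
A10_VRAM_GB = 24

def weights_gb(params_billion, bytes_per_param):
    """Approximate model-weight footprint in GB (1e9 params ~= 1 GB per byte)."""
    return params_billion * bytes_per_param

for b, bpp, prec in [(7, 2, "fp16"), (13, 2, "fp16"), (13, 1, "int8")]:
    gb = weights_gb(b, bpp)
    print(f"{b}B @ {prec}: ~{gb} GB -> fits in {A10_VRAM_GB}GB: {gb < A10_VRAM_GB}")
```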

Is the A10 good value?
The A10 offers 24GB of VRAM at its price point. Compare it with similar GPUs using the comparison table above. Key factors: VRAM for model size, TFLOPS for speed, and price for budget.