A10G

Ampere · 24GB · PCIe

VRAM
24GB
FP16
-
TDP
150W
Hardware Price
-
Cloud from
$1.01/hr
8 providers

Quick Insights

Performance/Dollar
N/A
FP16 performance per $1000
VRAM/Dollar
N/A
VRAM per $1000
vs Category Average
N/A
FP16 TFLOPS comparison
Cloud Availability
8 providers
from $1.01/hr
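The N/A entries above follow from missing inputs. A minimal sketch of how per-$1000 metrics like these are typically computed; the $3,000 hardware price in the second call is purely an assumed placeholder, since this page lists no hardware price:

```python
# Sketch: per-$1000 value metrics as shown in Quick Insights.
# "N/A" is rendered whenever an input (here, hardware price) is missing.

def per_1000_dollars(value, price):
    """value (TFLOPS, or GB of VRAM) per $1000 of hardware price."""
    if value is None or price is None:
        return None  # displayed as "N/A"
    return value / (price / 1000)

print(per_1000_dollars(24, None))   # None -> "N/A", as on this page
print(per_1000_dollars(24, 3000))   # 8.0 GB per $1000 (assumed $3,000 price)
```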

Specifications

VRAM 24GB
Memory Bandwidth 600 GB/s
FP16 TFLOPS -
Tensor TFLOPS -
FP32 TFLOPS -
TDP 150W
Form Factor PCIe
Architecture Ampere
NVLink No
Release Date -

Cloud GPU Pricing

Rent the A10G from 8 cloud providers; the AWS instance options are listed below. Prices shown are per GPU per hour.

Provider | Type | Instance | GPUs | On-Demand | Per GPU | Spot
Amazon Web Services | hyperscaler | g5.xlarge | 1x | $1.01/hr | $1.01/hr (cheapest) | $0.501/hr (-50%)
Amazon Web Services | hyperscaler | g5.2xlarge | 1x | $1.21/hr | $1.21/hr | $0.595/hr (-51%)
Amazon Web Services | hyperscaler | g5.12xlarge | 4x | $5.67/hr | $1.42/hr | $2.47/hr (-56%)
Amazon Web Services | hyperscaler | g5.4xlarge | 1x | $1.62/hr | $1.62/hr | $0.715/hr (-56%)
Amazon Web Services | hyperscaler | g5.24xlarge | 4x | $8.14/hr | $2.04/hr | $3.96/hr (-51%)
Amazon Web Services | hyperscaler | g5.48xlarge | 8x | $16.29/hr | $2.04/hr | $6.50/hr (-60%)
Amazon Web Services | hyperscaler | g5.8xlarge | 1x | $2.45/hr | $2.45/hr | $1.28/hr (-48%)
Amazon Web Services | hyperscaler | g5.16xlarge | 1x | $4.10/hr | $4.10/hr | $1.75/hr (-57%)
Best Spot Deal: Amazon Web Services' largest discount is g5.48xlarge spot at $6.50/hr total (about $0.81 per GPU, 60% off on-demand); the lowest per-GPU spot price is g5.xlarge at $0.501/hr.
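The Per GPU and Spot (-%) columns above are simple derived values; a quick sketch using figures from the AWS rows:

```python
# Sketch: derive the Per GPU and Spot (-%) columns from instance totals.

def per_gpu(on_demand_total, gpu_count):
    """On-demand hourly price split across the instance's GPUs."""
    return round(on_demand_total / gpu_count, 2)

def spot_discount_pct(on_demand, spot):
    """Spot savings as a whole-number percentage of on-demand."""
    return round((1 - spot / on_demand) * 100)

# g5.48xlarge: 8x A10G, $16.29/hr on-demand, $6.50/hr spot
print(per_gpu(16.29, 8))               # 2.04 -> the $2.04/hr per-GPU figure
print(spot_discount_pct(16.29, 6.50))  # 60 -> the 60% discount above
```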

A10G vs Alternatives

Compare A10G with similar GPUs from other brands.

GPU | Brand | VRAM | FP16 TFLOPS | Bandwidth | Hardware Price | Cloud Price
A10G (this page) | NVIDIA | 24GB | - | - | - | from $1.01/hr
Radeon RX 7900 XTX | AMD | 24GB (+0%) | 122.0 | 960 GB/s | - | -
Radeon RX 7900 XT | AMD | 20GB (-17%) | 104.0 | 800 GB/s | - | -
Instinct MI100 | AMD | 32GB (+33%) | 184.6 | 1.2 TB/s | - | -

Best Use Cases

No specific use case recommendations for A10G yet.



Frequently Asked Questions about A10G

How much does the A10G cost?
Pricing for the A10G varies. Check our cloud pricing section above for rental options starting at $1.01/hr.

Is the A10G good for AI and machine learning?
Yes, the A10G with 24GB VRAM is suitable for many AI/ML workloads. For large language models, you may need multiple GPUs, or consider higher-VRAM options like the A100 or H100.

Should I buy or rent an A10G?
Consider buying for long-term, heavy usage (more than ~4 hrs/day). Rent from cloud providers for short-term projects, experimentation, or when you need to scale quickly.
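The buy-vs-rent rule of thumb can be made concrete with a break-even estimate. The $3,000 purchase price below is an assumed placeholder, since this page lists no hardware price; the $1.01/hr figure is the page's cloud floor:

```python
# Sketch: hours of cloud rental at which cumulative cost matches buying.
# Ignores power, hosting, and resale value -- a deliberate simplification.

def break_even_hours(purchase_price, rent_per_hour):
    return purchase_price / rent_per_hour

hours = break_even_hours(3000, 1.01)  # assumed $3,000 vs $1.01/hr
print(round(hours))  # ~2970 hours; at 4 hrs/day, roughly 2 years
```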

What can the A10G run?
With 24GB of VRAM, the A10G can run large language models (7B-13B), Stable Diffusion XL, video AI workloads, and professional 3D rendering.
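A rough way to sanity-check the 7B-13B claim is fp16 weight memory, about 2 bytes per parameter (a simplification that ignores activations and KV cache):

```python
# Sketch: fp16 weight footprint of an LLM -- params * 2 bytes.
# Real usage is higher (activations, KV cache), so treat this as a floor.

def fp16_weight_gib(params_billions):
    return params_billions * 1e9 * 2 / 1024**3

print(round(fp16_weight_gib(7), 1))   # 13.0 GiB -> fits comfortably in 24GB
print(round(fp16_weight_gib(13), 1))  # 24.2 GiB -> tight; quantization helps
```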

Is the A10G good value for money?
The A10G offers 24GB of VRAM at its price point. Compare it with similar GPUs using the comparison table above. Key factors: VRAM for model size, TFLOPS for speed, and price for budget.