GPU Marketplace

TensorDock GPU Pricing

Available GPUs: 9
Instance Types: 9
H100 Price: $2.25/hr
A100 Price: $1.50/hr

Price Highlights

Cheapest GPU: RTX A4000 ($0.060/hr)
H100 (per GPU): $2.25/hr
A100 (per GPU): $1.50/hr
Best Spot Discount: none (no spot pricing offered)

About TensorDock

Key Features
  • Cloud GPU access
  • On-demand pricing
Best For
  • AI/ML workloads
  • GPU computing
Considerations
  • Check availability
  • Compare pricing

H100 Price Comparison

Compare TensorDock's H100 pricing with other marketplace providers.

Provider             | H100 Price (per GPU) | vs TensorDock
Vast.ai              | $1.38/hr             | -39%
TensorDock (current) | $2.25/hr             | -
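The "vs TensorDock" column is simply each provider's per-GPU rate expressed as a percentage difference from TensorDock's $2.25/hr baseline. A minimal sketch of the calculation (the function name is illustrative; rates are taken from the table above):

```python
def pct_vs_baseline(price: float, baseline: float) -> int:
    """Percent difference of `price` relative to `baseline`, rounded to a whole percent."""
    return round((price - baseline) / baseline * 100)

TENSORDOCK_H100 = 2.25  # $/hr per GPU, from the table above

print(pct_vs_baseline(1.38, TENSORDOCK_H100))  # Vast.ai: prints -39
```

A negative value means the provider is cheaper than TensorDock; Vast.ai's $1.38/hr works out to about 39% below TensorDock's rate.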

All GPU Pricing

Spot pricing is not offered on any instance type; all rates below are on-demand.

Data Center GPUs (3 instances)

GPU       | VRAM | Instance                 | GPUs | On-Demand | Per GPU
H100 SXM  | 80GB | tensordock-h100-sxm      | 1x   | $2.25/hr  | $2.25/hr
A100 80GB | 80GB | tensordock-a100-80gb-sxm | 1x   | $1.50/hr  | $1.50/hr
V100      | 32GB | tensordock-v100-32gb     | 1x   | $0.170/hr | $0.170/hr (cheapest in category)

Consumer GPUs (2 instances)

GPU      | VRAM | Instance            | GPUs | On-Demand | Per GPU
RTX 4090 | 24GB | tensordock-rtx-4090 | 1x   | $0.350/hr | $0.350/hr
RTX 3090 | 24GB | tensordock-rtx-3090 | 1x   | $0.200/hr | $0.200/hr (cheapest in category)

Workstation GPUs (3 instances)

GPU          | VRAM | Instance                | GPUs | On-Demand | Per GPU
RTX 6000 Ada | 48GB | tensordock-rtx-6000-ada | 1x   | $0.750/hr | $0.750/hr
RTX A6000    | 48GB | tensordock-rtx-a6000    | 1x   | $0.450/hr | $0.450/hr
RTX A4000    | 16GB | tensordock-rtx-a4000    | 1x   | $0.060/hr | $0.060/hr (cheapest in category)

Other GPUs (1 instance)

GPU | VRAM | Instance       | GPUs | On-Demand | Per GPU
L40 | 48GB | tensordock-l40 | 1x   | $0.950/hr | $0.950/hr
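Since all instances bill a flat hourly rate per GPU, estimating a job's compute cost is a straight multiplication of rate, hours, and GPU count. A quick sketch (the `RATES` dict is transcribed from the tables above; the function name is illustrative, and real bills may add storage or bandwidth fees):

```python
# On-demand per-GPU rates ($/hr), transcribed from the pricing tables above.
RATES = {
    "H100 SXM": 2.25,
    "A100 80GB": 1.50,
    "V100": 0.17,
    "RTX 4090": 0.35,
    "RTX 3090": 0.20,
    "RTX 6000 Ada": 0.75,
    "RTX A6000": 0.45,
    "RTX A4000": 0.06,
    "L40": 0.95,
}

def job_cost(gpu: str, hours: float, num_gpus: int = 1) -> float:
    """Estimated on-demand compute cost; ignores storage and egress fees."""
    return RATES[gpu] * hours * num_gpus

print(f"${job_cost('RTX A4000', 100):.2f}")  # 100h on the cheapest GPU: $6.00
print(f"${job_cost('H100 SXM', 100):.2f}")   # same run on an H100: $225.00
```

This makes the spread concrete: the same 100-hour single-GPU run costs $6 on an RTX A4000 versus $225 on an H100 SXM, so matching the GPU to the workload matters far more than small per-hour differences.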

Similar Providers

Other marketplace providers you might consider.

Frequently Asked Questions about TensorDock

How much does an H100 cost on TensorDock?

TensorDock charges $2.25/hr per H100 GPU for on-demand instances. Prices may vary based on instance type, region, and commitment length. Check the official pricing page for the most current rates.

Is TensorDock good for AI/ML training?

TensorDock can be cost-effective for AI/ML training, especially for experimentation and batch processing. However, availability and reliability may vary compared to dedicated GPU clouds.

Does TensorDock offer marketplace pricing?

Yes, TensorDock operates on a marketplace model where pricing is dynamic and often significantly lower than on-demand rates from traditional providers. However, instances may be interrupted with short notice.

How do TensorDock's prices compare to other providers?

TensorDock offers some of the lowest prices in the market through its marketplace model. Trade-offs include variable availability and less guaranteed uptime compared to dedicated providers.

What payment methods does TensorDock accept?

Most cloud GPU providers, including TensorDock, accept credit cards for pay-as-you-go usage. Enterprise customers may have options for invoicing and purchase orders. Check the website for specific payment methods and billing options.