Hopper GPU
NVIDIA H100 PCIe 80GB
80GB HBM3 | 1.5 PFLOPS FP16 | 2.0 TB/s bandwidth
From $1.68 /hr
Use cases: Inference, Fine-tuning, HPC, AI workloads
Specifications
| Spec | Value |
|---|---|
| VRAM | 80 GB HBM3 |
| FP16 | 1.5 PFLOPS |
| FP8 | 3.0 PFLOPS |
| FP4 | N/A |
| Memory BW | 2.0 TB/s |
| TDP | 350W |
| Interconnect | PCIe Gen5 |
| Architecture | Hopper |
Cloud GPU Pricing (5 offers)
| Provider | Instance Type | vCPUs | RAM | Price/hr | Price/mo | Spot Price | Availability | Action |
|---|---|---|---|---|---|---|---|---|
| CoreWeave (cheapest) | h100-pcie-1x | 24 | 240 GB | $1.68/hr | $1,226.40/mo | -- | Available | Deploy on CoreWeave |
| Vast.ai | h100_pcie_80gb | 12 | 96 GB | $1.80/hr | $1,314.00/mo | $1.35 (-25%) | Available | Rent on Vast.ai |
| Lambda Cloud | gpu_1x_h100_pcie | 26 | 200 GB | $1.99/hr | $1,452.70/mo | -- | Available | Deploy on Lambda |
| RunPod | h100-pcie-80gb | 16 | 125 GB | $2.09/hr | $1,525.70/mo | $1.46 (-30.1%) | Available | Deploy on RunPod |
| Amazon Web Services | p5n.24xlarge | 96 | 768 GB | $2.50/hr | $1,825.00/mo | $1.25 (-50%) | Available | Deploy on AWS |
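The derived columns in the table follow directly from the hourly rate. A minimal sketch, assuming the Price/mo figures use a 730-hour month (8,760 hours / 12) and that the spot percentage is the discount relative to the on-demand hourly price:

```python
# Assumption: "Price/mo" in the table above = hourly rate x 730 hours.
HOURS_PER_MONTH = 730

def monthly_price(hourly: float) -> float:
    """On-demand monthly cost at the quoted hourly rate."""
    return round(hourly * HOURS_PER_MONTH, 2)

def spot_discount_pct(on_demand: float, spot: float) -> float:
    """Spot discount relative to the on-demand hourly price, in percent."""
    return round((1 - spot / on_demand) * 100, 1)

# Figures from the table above:
print(monthly_price(1.68))            # CoreWeave: 1226.4
print(spot_discount_pct(1.80, 1.35))  # Vast.ai: 25.0
print(spot_discount_pct(2.09, 1.46))  # RunPod: 30.1
```

The same arithmetic reproduces every Price/mo and spot-discount entry in the table, which is a quick sanity check when comparing offers across providers.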
Last updated 57h ago
Frequently Asked Questions
What is the cheapest NVIDIA H100 PCIe 80GB cloud provider?
The cheapest NVIDIA H100 PCIe 80GB is available on CoreWeave at $1.68/hr (h100-pcie-1x).
How much does NVIDIA H100 PCIe 80GB cost per hour?
NVIDIA H100 PCIe 80GB cloud GPU pricing ranges from $1.68/hr to $2.50/hr depending on the provider and configuration.
What are the specs of NVIDIA H100 PCIe 80GB?
NVIDIA H100 PCIe 80GB features 80GB HBM3 memory, 1.5 PFLOPS FP16 performance, 2.0 TB/s memory bandwidth, and 350W TDP. Architecture: Hopper.
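A practical way to read the 80 GB figure is to check whether a model's weights fit on a single card. A rough sketch, assuming 2 bytes per parameter at FP16 and 1 byte at FP8, and ignoring activation and KV-cache memory (which add real overhead in practice):

```python
# Rough weights-only VRAM estimate; activations and KV cache need extra room.
def weight_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """GB of VRAM needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_param / 1e9

print(weight_gb(70))     # 70B params at FP16 -> 140.0 GB, exceeds 80 GB
print(weight_gb(70, 1))  # 70B params at FP8  -> 70.0 GB, fits in 80 GB
```

This is why FP8 support matters on this card: halving bytes per parameter roughly doubles the model size a single 80 GB GPU can serve.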
Is NVIDIA H100 PCIe 80GB good for AI training?
NVIDIA H100 PCIe 80GB is primarily aimed at inference, fine-tuning, HPC, and general AI workloads. For large-scale training, consider higher-tier GPUs.