B200 SXM 192GB vs H200 SXM 141GB

Compare NVIDIA B200 SXM 192GB and NVIDIA H200 SXM 141GB specs, performance, and cloud pricing

B200 SXM 192GB

192GB

From $6.50/hr

H200 SXM 141GB

141GB

From $3.49/hr

Architecture

Blackwell

vs Hopper

FP16 Gap

4.5x

B200 SXM 192GB leads

| Specification | B200 SXM 192GB | H200 SXM 141GB |
| --- | --- | --- |
| VRAM | 192 GB | 141 GB |
| VRAM Type | HBM3e | HBM3e |
| FP16 | 4.5 PFLOPS | 989.5 TFLOPS |
| FP8 | 9.0 PFLOPS | 2.0 PFLOPS |
| Memory Bandwidth | 8.0 TB/s | 4.8 TB/s |
| TDP | 1000W | 700W |
| Interconnect | NVLink 5 | NVLink 4 |
| Architecture | Blackwell | Hopper |

Price Comparison

| Metric | B200 SXM 192GB | H200 SXM 141GB |
| --- | --- | --- |
| Cheapest On-Demand | $6.50/hr | $3.49/hr |
| Cheapest Spot | $4.32/hr | $2.52/hr |
| Providers Available | 4 | 4 |
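The headline figures above (the 4.5x FP16 gap and the TFLOPS-per-dollar verdict) can be reproduced directly from the spec and price tables. A minimal sketch, using the listed FP16 throughput and cheapest on-demand rates:

```python
# Spec-sheet numbers from the tables above (FP16 in TFLOPS, price in $/hr).
gpus = {
    "B200 SXM 192GB": {"fp16_tflops": 4500.0, "on_demand": 6.50},
    "H200 SXM 141GB": {"fp16_tflops": 989.5, "on_demand": 3.49},
}

# FP16 performance gap between the two cards (~4.5x).
gap = gpus["B200 SXM 192GB"]["fp16_tflops"] / gpus["H200 SXM 141GB"]["fp16_tflops"]
print(f"FP16 gap: {gap:.1f}x")

# FP16 TFLOPS per dollar-hour at the cheapest on-demand rate.
for name, g in gpus.items():
    value = g["fp16_tflops"] / g["on_demand"]
    print(f"{name}: {value:.0f} TFLOPS per $/hr")
```

At these rates the B200 comes out around 692 TFLOPS per $/hr versus roughly 284 for the H200, which is why it wins the value verdict despite the higher hourly price.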

Verdict

Best for Training

NVIDIA B200 SXM 192GB

4.5 PFLOPS FP16 with 192GB VRAM

Best Value

NVIDIA B200 SXM 192GB

692 TFLOPS per $/hr

Best for Inference

NVIDIA B200 SXM 192GB

9.0 PFLOPS FP8

Use-Case Recommendations

Large-Scale Training

Training LLMs and large multi-modal models

Winner

B200 SXM 192GB

4.5 PFLOPS FP16 with 192GB HBM3e provides the best training throughput.

Inference at Scale

Deploying models in production for real-time inference

Winner

B200 SXM 192GB

9.0 PFLOPS FP8 gives superior inference throughput for quantized production models.

Budget-Conscious Workloads

Getting the best performance per dollar

Winner

B200 SXM 192GB

At $6.50/hr, the B200's 4.5 PFLOPS FP16 works out to roughly 692 TFLOPS per $/hr, versus about 284 TFLOPS per $/hr for the H200, so it delivers the best performance per dollar despite the higher hourly rate.
