HPC/AI Server with Liquid-Cooled G593-ZD2-LAX1

5U Liquid-Cooled AI Server Optimized for High-Performance AI Training, Inference, and HPC Workloads

  • CPU+GPU Direct liquid cooling solution
  • Liquid-cooled NVIDIA HGX™ H100 8-GPU
  • 900GB/s GPU-to-GPU bandwidth with NVIDIA® NVLink® and NVSwitch™
  • Dual AMD EPYC™ 9005/9004 Series Processors
  • 12-Channel DDR5 RDIMM, 24 x DIMMs
New AMD EPYC™ Family of Processors - Same Great SP5 Platform
5th Generation AMD EPYC™ Processors for SP5 Socket

The SP5 socket has reached its pinnacle form with 5th Generation AMD EPYC™ processors. The proven platform shared by the AMD EPYC™ 9004 and 8004 series now also hosts the AMD EPYC™ 9005 series. With up to 192 cores, higher frequencies, and larger caches, this top-tier platform targets general-purpose, cloud-native, and technical computing. Building on the strengths of the EPYC™ 9004 series, the new EPYC™ 9005 series adopts a 3nm process with the AMD “Zen5” and “Zen5c” core architectures, excelling in both energy efficiency and cost optimization. GIGABYTE has prepared for the new processors with new servers and with updates to existing products originally built for 4th Gen AMD EPYC™ processors.

Supports NVIDIA HGX™ H100 8-GPU
High Performance

NVIDIA HGX™ H100 brings together the full power of NVIDIA H100 GPUs and fully optimized NVIDIA AI and NVIDIA HPC software stacks. NVIDIA HGX™ H100 is available as a server building block in the form of integrated baseboards in four- or eight-GPU H100 configurations. The four-GPU HGX H100 offers fully interconnected point-to-point NVLink® connections between GPUs, while the eight-GPU configuration offers full GPU-to-GPU bandwidth through NVSwitch. Leveraging the power of H100 multi-precision Tensor Cores, an 8-way HGX H100 provides over 32 petaFLOPS of FP8 deep learning compute performance. NVIDIA HGX™ H100 also supports NVIDIA BlueField®-3 data processing units (DPUs) to enable cloud networking, composable storage, zero-trust security, and GPU compute elasticity in hyperscale AI clouds.
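
As a quick way to confirm the full-bandwidth NVSwitch fabric from the operating system, the sketch below (assuming PyTorch with CUDA support is installed; `nvidia-smi topo -m` remains the authoritative view of the topology) enumerates the installed GPUs and checks that every pair reports peer-to-peer access:

```python
# Minimal sanity check of the 8-GPU HGX fabric from inside the OS.
# Assumes PyTorch with CUDA support; this is an illustrative sketch, not a
# replacement for NVIDIA's own topology tools.
import torch

def check_hgx_topology():
    n = torch.cuda.device_count()
    print(f"Visible GPUs: {n}")
    for i in range(n):
        print(f"  GPU {i}: {torch.cuda.get_device_name(i)}")

    # With NVSwitch, every GPU pair should report peer-to-peer (P2P) access.
    missing = [(i, j) for i in range(n) for j in range(n)
               if i != j and not torch.cuda.can_device_access_peer(i, j)]
    if missing:
        print(f"GPU pairs without P2P access: {missing}")
    else:
        print("All GPU pairs report peer access (full NVLink/NVSwitch fabric).")

if __name__ == "__main__":
    check_hgx_topology()
```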

Optional TPM 2.0 Module
Hardware Security

For hardware-based authentication, passwords, encryption keys, and digital certificates are stored in the TPM module to prevent unauthorized users from gaining access to your data. GIGABYTE TPM modules are available with either a Serial Peripheral Interface (SPI) or Low Pin Count (LPC) bus.
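
A minimal sketch for verifying that the optional TPM module is present and exposed to the operating system, assuming a Linux host with kernel TPM support; device paths and attributes can vary by kernel version, and tpm2-tools gives a fuller picture once the device is detected:

```python
# Check whether a TPM is exposed by the Linux kernel.
# Illustrative only: /sys/class/tpm paths and the tpm_version_major attribute
# depend on the kernel version in use.
from pathlib import Path

def tpm_status() -> str:
    dev = Path("/dev/tpm0")
    sysfs = Path("/sys/class/tpm/tpm0")
    if not dev.exists() and not sysfs.exists():
        return "No TPM device exposed by the kernel."

    version_file = sysfs / "tpm_version_major"  # present on recent kernels
    if version_file.exists():
        major = version_file.read_text().strip()
        return f"TPM detected, major version {major} (2 = TPM 2.0)."
    return "TPM device node found; version attribute not available."

if __name__ == "__main__":
    print(tpm_status())
```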

Automatic Fan Speed Control
Power Efficiency

GIGABYTE servers are enabled with Automatic Fan Speed Control to achieve the best balance of cooling and power efficiency. Individual fan speeds are adjusted automatically according to temperature sensors strategically placed in the server.
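
The sensor readings that drive the fan control loop can be inspected through the server's BMC. The sketch below uses ipmitool (assuming it is installed and the local BMC is accessible; sensor names and counts differ by platform, and a remote BMC would need -H/-U/-P options):

```python
# Read the temperature and fan sensors that automatic fan speed control
# acts on, via the BMC's IPMI interface. Illustrative sketch only.
import subprocess

def read_sensors(sensor_type: str) -> str:
    """Return the raw `ipmitool sdr type <type>` listing for a sensor class."""
    result = subprocess.run(
        ["ipmitool", "sdr", "type", sensor_type],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print("Temperature sensors:")
    print(read_sensors("Temperature"))
    print("Fan sensors:")
    print(read_sensors("Fan"))
```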
