Published 5 months ago by ServeTheHome · 172,299 views

NEW NVIDIA PCIe GPUs for AI and their Systems ft Supermicro

Since we receive many questions about how NVIDIA's PCIe GPUs for AI fit, we visited Supermicro, collected several GPUs, and decided to show which GPUs go in which systems. Importantly, we also go into why you might choose an NVIDIA L4 versus an NVIDIA L40S, an NVIDIA H200 NVL, or an NVIDIA RTX PRO 6000 Blackwell. We have to thank Supermicro and NVIDIA for getting this all together to make this video possible. As such, we also need to say this is sponsored.

STH Main Site Article:

----------------------------------------------------------------------
Timestamps
----------------------------------------------------------------------
00:00 Introduction to PCIe GPUs and Systems
03:00 8x GPU PCIe Systems with NVIDIA H200 NVL and NVIDIA RTX PRO 6000 Blackwell
09:25 NVIDIA MGX PCIe Switch Board with ConnectX-8
11:55 Supermicro 2U NVIDIA MGX-Style Server
13:14 NVIDIA GPUs in Supermicro Hyper 2U Compute Servers
14:35 Supermicro Workstation with NVIDIA RTX PRO 6000 Blackwell
14:58 NVIDIA GPUs in Supermicro SuperBlade High-Density Servers
16:06 Edge AI Inference Example with the NVIDIA L4
17:46 Wrap-up