At the Supercomputing 2024 conference, Nvidia (NVDA) announced the availability of the NVIDIA H200 NVL PCIe GPU, the latest addition to the Hopper family. The H200 NVL is aimed at organizations whose data centers need lower-power, air-cooled enterprise rack designs with flexible configurations, providing acceleration for AI and HPC workloads of any size.

Complementing the raw power of the H200 NVL is NVIDIA NVLink technology. The latest generation of NVLink provides GPU-to-GPU communication 7x faster than fifth-generation PCIe, delivering the higher bandwidth needed for HPC, large language model inference and fine-tuning.

The H200 NVL is paired with software tools that let enterprises accelerate applications from AI to HPC. It comes with a five-year subscription to NVIDIA AI Enterprise, a cloud-native software platform for developing and deploying production AI. NVIDIA AI Enterprise includes NVIDIA NIM microservices for the secure, reliable deployment of high-performance AI model inference.
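For context on the last point: NIM microservices expose an OpenAI-compatible HTTP API, so an inference model deployed this way can be queried with standard client libraries. The sketch below is illustrative only and is not from Nvidia's announcement; the endpoint URL and model name are placeholder assumptions for a locally deployed NIM container.

```python
# Minimal sketch of querying a NIM microservice through its
# OpenAI-compatible API. The base_url and model name below are
# hypothetical placeholders for whatever NIM you have deployed.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local NIM endpoint
    api_key="not-needed-for-local-deployment",
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",    # example model; substitute your own
    messages=[{"role": "user", "content": "Summarize the H200 NVL announcement."}],
    max_tokens=128,
)

print(response.choices[0].message.content)
```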
Read More on NVDA:
- Nvidia CEO Huang says AI will drive scientific breakthroughs
- Nvidia announces NVIDIA Omniverse Blueprint
- Nvidia announces new NIM microservices for climate change modeling simulation
- Intel (NASDAQ:INTC) Readies Battlemage Launch Event For December
- nVent Electric collaborating with Nvidia on AI-ready liquid cooling solutions