NVIDIA has announced the Magnum IO software suite, designed to help data scientists and AI and high-performance computing (HPC) researchers process huge amounts of data quickly. Magnum IO is optimized to eliminate storage and input/output bottlenecks; NVIDIA claims data processing speeds up to 20 times faster for multi-server, multi-GPU computing nodes working with massive datasets. This should allow organizations to run complex financial analysis, climate modeling, and other HPC workloads.
NVIDIA partnered with industry leaders in networking and storage to develop Magnum IO, including DataDirect Networks, Excelero, IBM, Mellanox, and WekaIO. A highlight of the release is GPUDirect Storage, which lets researchers bypass the CPU when accessing storage, moving data files directly into GPU memory for applications such as simulation, analysis, and visualization.
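The announcement doesn't include programming details, but as a rough sketch of what CPU-bypass I/O looks like in practice, the example below reads a file straight into GPU memory using NVIDIA's cuFile API, the C interface through which GPUDirect Storage is exposed to applications (not covered in the announcement itself). The file path and buffer size are placeholders, and error handling is abbreviated.

    #define _GNU_SOURCE                 /* for O_DIRECT */
    #include <cufile.h>                 /* NVIDIA cuFile (GPUDirect Storage) API */
    #include <cuda_runtime.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        const size_t size = 1 << 20;    /* 1 MiB, placeholder size */
        /* Hypothetical data file; O_DIRECT avoids the page cache. */
        int fd = open("/data/sample.bin", O_RDONLY | O_DIRECT);
        if (fd < 0) { perror("open"); return 1; }

        cuFileDriverOpen();             /* bring up the GDS driver */

        CUfileDescr_t descr = {0};
        descr.handle.fd = fd;
        descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;
        CUfileHandle_t fh;
        cuFileHandleRegister(&fh, &descr);  /* register the file with cuFile */

        void *devPtr = NULL;
        cudaMalloc(&devPtr, size);          /* destination buffer in GPU memory */
        cuFileBufRegister(devPtr, size, 0); /* pin the buffer for DMA */

        /* The read DMAs from storage directly into GPU memory,
           with no intermediate CPU bounce buffer. */
        ssize_t n = cuFileRead(fh, devPtr, size,
                               0 /* file offset */, 0 /* device offset */);
        printf("read %zd bytes directly into GPU memory\n", n);

        cuFileBufDeregister(devPtr);
        cudaFree(devPtr);
        cuFileHandleDeregister(fh);
        close(fd);
        cuFileDriverClose();
        return 0;
    }

The point of the pattern is that the storage controller and the GPU exchange data over DMA, so the CPU only orchestrates the transfer rather than copying every byte through system memory.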
Availability
NVIDIA Magnum IO software is available now. GPUDirect Storage, however, remains limited to a select group of early-access customers, with a broader release slated for the first half of 2020.