Computational Storage
Los Alamos National Laboratory
Given its mission of using science to solve national security challenges, it's no surprise that Los Alamos National Laboratory (LANL) is pursuing computational storage, which brings processing power closer to storage devices or even into the storage system itself.
Complex software stacks for scientific storage can't take advantage of emerging high-speed storage devices because of bottlenecks and insufficient compute resources. To address that challenge, LANL developed software elements that use computational storage to achieve orders-of-magnitude performance improvements in data analysis.
“Memory bottlenecks make it impossible to perform critical storage system tasks,” said Brad Settlemyer, a senior scientist at LANL. With computational storage devices, the lab can “offload storage functions onto accelerators, enabling line-rate compression, improving CPU utilization and reducing memory bandwidth pressure.”
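That offload pattern is easy to picture with a short sketch. The Python below is purely illustrative and assumes a hypothetical device interface (the ComputationalStorageDevice class and its compress_and_write method are inventions for this example, not LANL's or SNIA's actual API); on real hardware, the compression step would run on the device's accelerator rather than consuming host CPU cycles and memory bandwidth.

    import zlib

    class ComputationalStorageDevice:
        """Hypothetical device whose accelerator compresses data in-line.

        zlib stands in for the on-device engine; on real hardware the host
        hands over the raw buffer and never touches it again.
        """

        def __init__(self):
            self.blocks = []

        def compress_and_write(self, raw: bytes) -> int:
            compressed = zlib.compress(raw)  # would execute on the device
            self.blocks.append(compressed)
            return len(compressed)

    def host_write_path(device, raw: bytes) -> float:
        # The host only issues the command and gets back the stored size,
        # so its CPUs and memory bus stay free for the application.
        stored = device.compress_and_write(raw)
        return stored / len(raw)

    dev = ComputationalStorageDevice()
    ratio = host_write_path(dev, b"scientific data " * 4096)
    print(f"stored at {ratio:.1%} of original size; host did no compression")

The point of the sketch is the division of labor: the host's write path shrinks to command submission, which is what makes line-rate compression possible without the memory bottlenecks Settlemyer describes.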
Major efforts include extending the ZFS file system to use specialized devices that support the NVMe computational storage standard proposed by the Storage Networking Industry Association. A year ago, LANL began collaborating with Eideticom, a computational storage solutions provider, to create a high-performance Lustre/ZFS-based parallel file system. LANL made this branch publicly available and used it to deploy the world's first storage system with compression enabled by computational storage.
The lab also developed a processing pipeline for computational storage that enables scientists to attach high-value data indexes to scientific data, dramatically increasing the rate of scientific discovery.
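The article doesn't specify the index format, but a common pattern behind this kind of speedup is per-block min/max metadata that lets a query skip blocks that cannot contain matching records. The Python sketch below shows the idea host-side (the block size and index layout are assumptions for illustration, not LANL's actual pipeline); in a computational storage setting, the index could be built or consulted on the device itself.

    import struct

    BLOCK = 1024  # records per indexed block (arbitrary choice for this sketch)

    def write_with_index(values):
        """Pack float records into blocks, recording min/max per block."""
        blocks, index = [], []
        for i in range(0, len(values), BLOCK):
            chunk = values[i:i + BLOCK]
            blocks.append(struct.pack(f"{len(chunk)}d", *chunk))
            index.append((min(chunk), max(chunk)))  # the high-value metadata
        return blocks, index

    def query(blocks, index, lo, hi):
        """Scan only blocks whose min/max range overlaps [lo, hi]."""
        hits, scanned = [], 0
        for blk, (bmin, bmax) in zip(blocks, index):
            if bmax < lo or bmin > hi:
                continue  # skip the block without reading any data
            scanned += 1
            for (v,) in struct.iter_unpack("d", blk):
                if lo <= v <= hi:
                    hits.append(v)
        return hits, scanned

    values = [float(i % 10000) for i in range(100_000)]
    blocks, index = write_with_index(values)
    hits, scanned = query(blocks, index, 9990.0, 9999.0)
    print(f"scanned {scanned}/{len(blocks)} blocks, found {len(hits)} records")

Skipping untouched blocks is where the dramatic gain comes from: the cost of a range query scales with the blocks that might match rather than with the full dataset.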
Finally, LANL last year launched the Efficient Mission Centric Computing Consortium, which more than a dozen companies, universities and federal agencies have joined to explore ways to make supercomputers more efficient, including through the use of computational storage.