Cloudian Speeds Up Object Storage Links to GPUs
Object storage provider Cloudian has released a version of its HyperStore product compatible with NVIDIA's GPUDirect Storage technology, allowing object storage data to feed directly into GPU memory via Remote Direct Memory Access (RDMA). By removing the CPU from the data path, the product speeds up AI training and inferencing workloads, Cloudian said.
NVIDIA GPUDirect Storage is part of the vendor's Magnum IO suite, a set of technologies designed to accelerate data movement in GPU-based applications. By integrating GPUDirect Storage with its HyperStore object storage system, Cloudian has streamlined the connection between its object-based data lake and GPU memory.
Why This Matters
The news is significant because storage throughput is an increasingly common bottleneck in AI processing. Cloudian claims its system reduces CPU utilization by 45% during data transfers and delivers more than 200 GB/s of throughput from object storage to GPUs.
High throughput rates are vital to linking storage to AI workloads as GPU speeds increase. Unless throughput from storage is sufficient, expensive GPUs may end up being underutilized, a problem that puts return on investment in peril.
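The underutilization argument above can be made concrete with simple arithmetic: if a training pipeline is I/O-bound, GPU busy time is capped by the ratio of storage throughput to the rate at which the GPUs consume data. The sketch below uses illustrative numbers (only the ~200 GB/s figure comes from Cloudian's claim; the GPU demand rate is an assumption for the example).

```python
# Back-of-envelope check: how storage throughput bounds GPU utilization
# in an I/O-bound pipeline. Numbers are illustrative, not vendor figures.

def data_bound_utilization(storage_gbps: float, gpu_demand_gbps: float) -> float:
    """Fraction of time GPUs can stay busy when data delivery is the bottleneck.

    storage_gbps:    sustained throughput from storage into GPU memory (GB/s)
    gpu_demand_gbps: aggregate rate at which the GPUs consume data (GB/s)
    """
    if gpu_demand_gbps <= 0:
        raise ValueError("GPU demand must be positive")
    # Utilization cannot exceed 100% even if storage outpaces the GPUs.
    return min(1.0, storage_gbps / gpu_demand_gbps)

# Hypothetical cluster whose GPUs can consume 250 GB/s of training data:
print(data_bound_utilization(200, 250))  # 0.8 -- GPUs busy ~80% of the time
print(data_bound_utilization(100, 250))  # 0.4 -- GPUs idle more than half the time
```

The point of the exercise: doubling storage throughput in the second scenario directly doubles effective GPU utilization, which is why faster storage-to-GPU paths protect the return on expensive accelerator investments.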