Kioxia GP Series Blurs Line Between SSD and Memory
Ulaş Doğru
Kioxia unveiled the GP Series SSD, which uses Storage Class Memory to let GPUs access much larger pools of persistent storage with extremely high IOPS, targeting demanding AI and HPC workloads. The approach aims to bridge the gap between conventional NAND SSDs and system memory to accelerate data-hungry models.
Kioxia has introduced the GP Series, an SSD family that uses Storage Class Memory (SCM) to expand the memory-like capacity available to GPUs for high-performance AI and other compute-heavy tasks. Rather than acting only as traditional block storage, these drives aim to present a faster, near-byte-addressable tier that GPU workflows can tap directly, boosting throughput and delivering millions of IOPS.
The GP Series is positioned for environments where models and datasets outgrow conventional GPU memory. By leveraging SCM, Kioxia aims to soften the performance cliff between volatile DRAM/HBM and NAND SSDs, offering lower latency and more consistent I/O for workloads such as large language models, recommender systems, and real-time inference.
Practically, this means servers can feed GPUs with larger working sets without constantly shuttling data over slow storage interfaces. Kioxia's messaging focuses on enabling GPUs to access persistent storage with a performance profile closer to memory than to legacy SSDs — a potential benefit when training or serving models that need fast random access to massive datasets.
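Kioxia has not published a programming interface for the GP Series, so the following Python sketch is purely conceptual: it simulates the access pattern that makes random-read IOPS the bottleneck, namely sampling scattered records (such as shuffled training examples) from a file standing in for a dataset far larger than GPU memory. The record size, count, and file are all illustrative assumptions.

```python
import os
import random
import tempfile

# Conceptual illustration only: each shuffled batch touches random offsets,
# so every fetch is an independent small read. Throughput is then bounded
# by the drive's random-read IOPS, not its sequential bandwidth.
RECORD_SIZE = 4096          # one "training record" per 4 KiB block (assumed)
NUM_RECORDS = 1024          # stands in for a multi-terabyte dataset

with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    f.write(b"\x00" * (RECORD_SIZE * NUM_RECORDS))

fd = os.open(path, os.O_RDONLY)
try:
    # Sample a shuffled batch of record indices and read each one with a
    # positioned read, mimicking how a data loader feeds a GPU pipeline.
    batch_indices = random.sample(range(NUM_RECORDS), 32)
    batch = [os.pread(fd, RECORD_SIZE, i * RECORD_SIZE) for i in batch_indices]
finally:
    os.close(fd)
    os.unlink(path)
```

The point of the sketch is the shape of the workload: many small reads at unpredictable offsets, which is exactly where a drive advertising millions of IOPS would pull ahead of a bandwidth-oriented SSD.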
It’s worth noting that integrating SCM into AI stacks requires software and system-level support. Frameworks, drivers, and orchestration tools need to be aware of the memory-like characteristics of SCM to exploit it fully. Kioxia’s product thus joins a broader industry push where hardware innovations are only half the story; software adaptations will determine real-world gains.
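One way to picture the software shift SCM implies is the difference between block-style and memory-style access. The Python sketch below uses an ordinary file and `mmap` as a stand-in; it is not how the GP Series is actually programmed, and real SCM paths would typically involve DAX-style mappings or vendor drivers, but it shows the contrast frameworks would need to exploit.

```python
import mmap
import os
import tempfile

# Build a small stand-in file: 4 KiB of padding, a 7-byte payload, more padding.
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    f.write(b"A" * 4096 + b"PAYLOAD" + b"B" * 4096)

# Block-style access: seek to an offset, issue an explicit read() syscall.
with open(path, "rb") as f:
    f.seek(4096)
    block = f.read(7)

# Memory-style access: map the file once, then index arbitrary bytes
# directly, with no per-access syscall. This is the "memory-like"
# consumption model SCM-aware software is built around.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    byte_view = bytes(mm[4096:4103])
    mm.close()

os.unlink(path)
```

Both paths recover the same seven bytes; the design question for frameworks and drivers is which of these two models the storage tier is presented through, and at what granularity.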
For data center teams and AI engineers keeping an eye on infrastructure efficiency, the GP Series is an interesting step. It doesn’t replace DRAM or HBM, but it could soften the limits imposed by those technologies by giving GPUs faster access to much larger persistent working sets. If the economics and ecosystem support it, SCM-backed SSDs may become a key part of future AI platforms.
Original Source: https://www.techradar.com/pro/did-kioxia-just-unveil-the-fastest-ssd-ever-gp-series-uses-storage-class-memory-to-feed-the-hbm-gpu-with-millions-of-iops