SK hynix Inc. and Sandisk Corporation held the ‘HBF Spec. Standardization Consortium Kick-Off’ event at Sandisk Headquarters in Milpitas, California on the 25th (local time), announcing a global standardization strategy for HBF (High Bandwidth Flash), a next-generation memory solution aimed at the AI inference era.
The AI industry is shifting from training, which focuses on creating Large Language Models (LLMs), to inference, which delivers actual AI services to users.
Fast and efficient memory is essential as the number of users of AI services grows rapidly. However, existing memory architectures cannot simultaneously deliver the high-capacity data processing and power efficiency required in the inference stage, and HBF technology is designed to address these limitations.
HBF technology introduces a new memory layer between HBM, an ultra-fast memory, and SSDs, high-capacity storage devices. By filling the gap between HBM’s high performance and SSDs’ high capacity, HBF can provide both the capacity expansion and the power efficiency required for AI inference. While HBM handles high-bandwidth workloads, HBF serves as a supporting layer in the architecture.
In particular, HBF technology is expected to reduce total cost of ownership (TCO) while increasing the scalability of AI systems. The industry forecasts that demand for composite memory solutions, including HBF, will pick up around 2030.
In the AI inference market, the role of a total memory solution provider that can offer both HBM and HBF is becoming more important, as system-level optimization across CPU, GPU, and memory, rather than the performance of any single chip, determines overall competitiveness.
In line with this, SK hynix and Sandisk are proactively pursuing the standardization and commercialization of HBF solutions, drawing on their design, packaging, and mass-production experience in HBM and NAND.