TL;DR: NVIDIA is ramping up production of LPDDR-based SOCAMM memory, targeting 600,000 to 800,000 units in 2025 for AI PC and server products. SOCAMM offers superior power efficiency, modular upgrades ...
TL;DR: SK hynix is advancing AI memory technology with its 12-layer HBM4 and upcoming HBM4E, aiming for mass production this year. The company has shipped 12-layer HBM4 samples, boasting ...
Memory startup d-Matrix claims its 3D stacked memory will run up to 10x faster than HBM. d-Matrix's 3D digital in-memory compute (3DIMC) technology is the ...
Elastic Networked-Memory Solution Delivers Multi-800GB/s Read-Write Throughput Over Ethernet and Up To 50% Lower Cost Per Token Per User in AI Inference Workloads MOUNTAIN VIEW, Calif., July 29, 2025- ...
Micron Technology, Inc. announced its integration of the HBM3E 36GB 12-high memory product into AMD's upcoming Instinct™ MI350 Series solutions, emphasizing both power efficiency and performance ...
TAIPEI -- ChangXin Memory Technologies (CXMT) is racing to produce China's first domestic high bandwidth memory, a critical component in artificial intelligence computing, as the country battles U.S. export restrictions ...
Dublin, Sept. 19, 2025 (GLOBE NEWSWIRE) -- The "Dual In-line Memory Module (DIMM) Market Opportunity, Growth Drivers, Industry Trend Analysis, and Forecast 2025-2034" report has been added to ...
SEOUL/TAIPEI/SAN JOSE, Calif. -- Nvidia CEO Jensen Huang, wearing his trademark black leather jacket and addressing a conference in March in San Jose, did not hide his enthusiasm for high bandwidth ...
The Tokyo-based memory maker Kioxia anticipates demand for NAND storage will grow by roughly 20% each year as AI data center ...
Users of certain advanced AI systems might have noticed their favorite model can remember their preferences regarding tone, formatting, prior topics of interest, and how they like responses structured ...