AI is driving significant investments in computing, networking, storage and memory for training models. Continuing AI investment depends upon monetizing these models, and that means using them in practical applications. At the 2026 CES show and at other events there has been a lot of talk about putting AI to work in consumer and industrial applications. Like training, AI inference requires memory and storage to hold the results of model training and to apply those results in practical applications.
However, while DRAM and SRAM play important roles in AI inference, new non-volatile memory technologies are playing an increasing role for inference at the edge and in battery-powered devices, and they could replace these volatile memories, whose data disappears when the power goes off. Coughlin Associates and Objective Analysis recently wrote a report, New Memories: Not Just for AI. This post explores some results from that report. I am one of the authors.
Nonvolatile memories offer energy savings that appeal to designers of battery-powered and ambient-powered devices, and they also enable energy savings and performance improvements in data centers. Companies are introducing these nonvolatile memory technologies in stand-alone as well as embedded solutions.
A nonvolatile main memory and cache memory in a computing device will reduce power usage directly, enable new power-saving modes, provide faster recovery from power loss, and enable more stable computer architectures that retain their state even when power is off.
The report covers phase change memory (PCM), resistive random-access memory (ReRAM), ferroelectric RAM (FRAM), and various types of magnetic RAM (MRAM), as well as a variety of less mainstream technologies.
New memories such as MRAM and ReRAM are already available as embedded memory in consumer and industrial devices, replacing NOR flash for code storage and supplementing SRAM for data storage. For instance, AI model weights can be stored in the smaller new memory cells, which retain data when the power is off. This is important in energy-constrained applications. Stand-alone new non-volatile memories are also available and are being used in data center and enterprise applications.
As the number of embedded applications increases, the costs of making these new memories will decrease and yields will improve. This creates a virtuous cycle that encourages more and more applications, driving memory capacity shipments and revenues.
The report projects that total baseline new memory annual shipping capacity will increase over 9,000 times from 2024 through 2035, with revenues increasing over 300 times over the same period. The bulk of this rapid revenue growth will be supported by new memory technologies' displacement of SRAM, NOR flash and some DRAM.
The report also projects that capital equipment investments to enable this growth in new memory capacity shipments and revenue will likely grow by 1,200 times from 2024 through 2035.
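To put these multiples in perspective, a back-of-the-envelope calculation (mine, not from the report) converts each total growth multiple into the compound annual growth rate it implies over the eleven compounding years from 2024 through 2035:

```python
# Back-of-the-envelope: the compound annual growth rate (CAGR) implied by
# a total growth multiple over a given number of years.
# Illustrative arithmetic only; the multiples are the report's projections,
# the annual rates are derived here.

def implied_cagr(multiple: float, years: int) -> float:
    """Annual growth rate that compounds to the given total multiple."""
    return multiple ** (1 / years) - 1

# 2024 through 2035 spans 11 compounding years.
YEARS = 11
for label, multiple in [("capacity", 9_000), ("revenue", 300), ("capex", 1_200)]:
    print(f"{label}: ~{implied_cagr(multiple, YEARS):.0%} per year")
```

A 9,000× increase in shipped capacity works out to roughly 129% annual growth, the 300× revenue increase to roughly 68% per year, and the 1,200× capital equipment increase to roughly 91% per year, which conveys how steep these projected ramps are.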
The recent New Memories: Not Just for AI report explores the role that new non-volatile memories will play in monetizing AI, leading to significant revenue growth for memory manufacturers and equipment suppliers.