Micron is taking over the US: chipmaker to invest $150 billion in factories due to AI memory shortage
The company is building mega-factories in New York and Idaho to close the gap between memory chip supply and the demands of AI infrastructure

Micron, the largest U.S. memory chip maker, will commit $150 billion to build new factories in two U.S. cities amid the AI boom. Photo: bluestork/Shutterstock
Micron, the largest memory chip maker in the US, will allocate $150 billion to expand capacity and build new plants in two US cities amid the AI boom, The Wall Street Journal (WSJ) writes. As large language models grow more complex and major AI companies announce sweeping plans to build data centers, demand in the memory market is far outstripping supply, the newspaper points out.
Details
Micron will set aside $50 billion to more than double the size of its campus in Boise, Idaho, where the company is headquartered, the WSJ reports. The project includes two new memory chip fabrication plants that are expected to begin operations by the end of 2028. The fabs will produce DRAM, the type of memory chip that is stacked to build High Bandwidth Memory (HBM), which is becoming critical for advanced AI computing and is widely used in data centers, WSJ notes.
In addition, near Syracuse, N.Y., Micron broke ground on a $100 billion complex, the largest private investment in the state's history, the newspaper points out.
This is not Micron's first such investment, the WSJ points out. In late 2025, the company also announced a $9.6 billion investment in an advanced memory chip plant in Hiroshima, Japan, while its competitor SK Hynix said in January that it plans to build a $13 billion fab in South Korea on top of a $4 billion production facility in Indiana.
Why Micron decided to expand
The WSJ attributes Micron's expansion in the US to the fact that the market for AI accelerators, such as the graphics processing units (GPUs) made by major chipmakers like Nvidia, AMD and Broadcom, is growing faster than the memory chip market, and each new generation of those chips requires more, and faster, memory. They need that memory both to train AI models and to run inference, the stage at which a trained model processes user queries and returns answers.
When AI moves from the training stage to inference, the amount of data that has to be pumped through the memory system every second becomes enormous, WSJ points out. Because of this, current architectures simply cannot "feed" the powerful processors with data fast enough, leaving expensive hardware idle.
Micron's $150 billion investment is aimed at building out infrastructure for HBM production. Unlike conventional memory modules, HBM chips can radically increase the speed of data exchange and reduce power consumption, explains WSJ.
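To get a feel for the bottleneck the WSJ describes, here is a minimal back-of-envelope sketch in Python. All of the figures in it are illustrative assumptions (a hypothetical 70-billion-parameter model and round-number peak-bandwidth estimates), not numbers from the article, Micron or the WSJ.

```python
# Back-of-envelope sketch of why inference is memory-bandwidth-bound.
# All figures below are illustrative assumptions, not from the article.

PARAMS = 70e9          # assumed model size: 70 billion parameters
BYTES_PER_PARAM = 2    # 16-bit weights -> 2 bytes each
weights_bytes = PARAMS * BYTES_PER_PARAM   # ~140 GB of weights

# For a single user, generating one token requires streaming roughly
# the full set of weights through the processor once.
tokens_per_second_target = 20              # a modest interactive speed

required_bandwidth = weights_bytes * tokens_per_second_target  # bytes/s

# Rough peak-bandwidth assumptions for comparison (orders of magnitude):
ddr5_server_bps = 400e9    # multi-channel DDR5 server memory
hbm_gpu_bps = 3.3e12       # HBM stacks on a high-end AI GPU

print(f"Required:         {required_bandwidth / 1e12:.1f} TB/s")
print(f"DDR5 system peak: {ddr5_server_bps / 1e12:.1f} TB/s")
print(f"HBM GPU peak:     {hbm_gpu_bps / 1e12:.1f} TB/s")
```

Real deployments batch many requests together and often compress the weights, but the order-of-magnitude gap this rough arithmetic shows between conventional DRAM and HBM is the point the WSJ is making about why AI hardware sits idle without fast memory.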
However, Micron's plan to expand production has run into a shortage of cleanrooms, the specialized production spaces with a near-sterile environment needed to manufacture memory chip components, the publication notes. Micron admits that it underestimated the scale of demand for memory chips during the shift to mass adoption of AI, and now has to build new capacity urgently to prevent a prolonged shortage that could slow the entire AI industry, the WSJ concludes.
What's up with Micron stock
The shortage of memory chips has turned into a "gold rush" for their manufacturers: Micron and its two main competitors, SK Hynix and Samsung. Since its April 2025 low, when Micron's stock traded at about $62 apiece, the share price has risen more than sixfold, to about $411 at the close of trading on Feb. 13, and the company's market value has approached half a trillion dollars.
In premarket trading on February 17, Micron shares were down more than 2%. Even so, most analysts expect the stock to keep rising in the near term. According to MarketWatch, of the analysts who track the company, 44 rate its shares a buy, another four a hold and two a sell.
This article was AI-translated and verified by a human editor
