Among the leaders in the global AI race, the United States and China are most often named. American language models - ChatGPT, Grok, Claude - are widely ranked as the strongest (according to Stanford's AI Index and LMArena, for example), and Nvidia, the key supplier of AI chips, is also American. The best-known projects from China are DeepSeek and Alibaba's Qwen models; France has Mistral; and in the Middle East, government investment has funded home-grown models such as Falcon LLM and Jais in the UAE.

But there is another contender in Asia - South Korea. "American and Chinese LLMs used to be way ahead of everyone else, but that's no longer the case," Sung Kim, CEO of Seoul-based Upstage, told the Financial Times. In July, the company released its Solar Pro 2 language model, which ranked among the 11 strongest in the world on benchmarks: according to Artificial Analysis' Intelligence Index, its combined score is higher than those of Anthropic's Claude 3.7 Sonnet Thinking, DeepSeek V3 and OpenAI's GPT-4.1. Kim says the model was built using only 10% of the AI chips that other companies deploy, which made it correspondingly cheaper.

Now Seoul is trying to mount its own AI revolution, with the goal of becoming one of the top three players in the global AI race. In the second half of 2025, the South Korean government will launch 30 priority projects in AI and innovation, backed by tax incentives and regulatory relief. The list includes technologies for robots, cars, ships, home appliances, drones, factories and chips, as well as advanced materials and K-beauty and K-food, the country's signature products. Korea will also set up a dedicated $71.6 billion public-private fund to invest in strategic sectors; not only giants like Samsung and LG but also dozens of startups will take part in it.

Ironically, one of the things that forced Seoul to accelerate the development of its own AI sector was the fallout from the US trade war. In late July, the US announced 15% duties on imports from the country, after which the South Korean Finance Ministry cut its 2025 GDP growth forecast from 1.8% to 0.9%. Another drag on the economy is demographic decline. A "greater transformation" towards AI is the only way to overcome the slowdown, Reuters quoted the South Korean Finance Ministry as saying.

Who the tech giants are betting on

The aforementioned startup Upstage raised money from Amazon and AMD, among others, in August 2025 in a $45 million bridge round (part of a broader $157 million funding round). The company has also partnered with Amazon's cloud division, which will provide cloud infrastructure for training and deploying its models.

Another Korean company, FuriosaAI, founded in 2017 by former Samsung and AMD engineer Jun Paik, received an $800 million buyout offer from Meta in March this year but turned it down.

FuriosaAI is one of the few Asian startups to have attracted Meta's interest. Having rejected the deal, the company, which makes AI chips and accelerators, continues to operate independently. Moreover, it plans to compete with Nvidia.

Today, most companies train models on Nvidia GPUs, which are versatile and well suited to parallel computing. FuriosaAI's RNGD chip is a specialized integrated circuit designed specifically for running large language models; its power efficiency is 2.25 times higher than that of GPUs.

In late July, FuriosaAI landed its first major deal: LG's AI division approved the use of RNGD to power its own Exaone models.

In total, FuriosaAI has raised $246 million since its founding in 2017, and its valuation has reached $735 million. The company is looking to raise further capital and eventually go public, Bloomberg wrote, citing sources.

What do you need to know about public companies in the AI sector?

The largest semiconductor manufacturers in South Korea are Samsung Electronics and SK Hynix.

Samsung Electronics posted its fourth consecutive quarterly revenue decline in its semiconductor business in Q2 2025, hurt by U.S. restrictions on chip exports to China and delays in supplying Nvidia.

The segment's operating profit fell to 400 billion won ($286.7 million) from 1.1 trillion won ($788.5 million) a quarter earlier. The company's overall net profit plunged 48% year on year to 5.1 trillion won ($3.67 billion), below the FactSet consensus forecast of 5.74 trillion won.

According to TrendForce, Samsung's share of global contract chipmaking (foundry) revenue fell from 8.1% to 7.7% in the first quarter of 2025, while Taiwan's TSMC increased its share to 67.6%.

"Samsung's semiconductor business is at a critical juncture between survival and profitability. The company must attract more customers to justify massive investments in advanced nodes and compete with TSMC, which consistently performs better in terms of profitability and efficiency," notes Neil Shah, vice president of Counterpoint Research.

Those are the downsides.

But there is a silver lining. The company's shares have grown by more than 30% since the beginning of the year.

The company expects to restore earnings growth in the second half of the year due to demand for AI technology.

In July, Samsung signed the largest contract in its history - a $16.5 billion deal with Tesla. The Korean company will produce Tesla's new AI6 chips for self-driving cars at a plant in Texas in 2028-2029. According to Tesla CEO Elon Musk, Samsung also produces the current AI4 chips used in Tesla's Autopilot system.

Another strategic direction is HBM (high-bandwidth memory). For data centers, Nvidia makes the powerful GPUs that do the calculations, while Samsung makes HBM, the "RAM" for those chips. Without it, even the fastest processor cannot run large models efficiently.

Samsung is stepping up its efforts in this area: it is developing HBM3E memory, which offers a bandwidth of 1.28 terabytes per second, a 50 percent increase over the previous generation.
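To see why memory bandwidth matters so much here, a back-of-envelope estimate helps: when a dense language model generates text, each new token requires streaming essentially all of the model's weights from memory, so bandwidth, not raw compute, often caps generation speed. The sketch below is purely illustrative - the 70-billion-parameter model and the precisions are assumptions, not figures from Samsung or Nvidia - and it reuses the 1.28 TB/s number above.

```python
# Illustrative estimate only: single request, dense model, weights streamed
# once per generated token. Real systems use batching, caching and multiple
# memory stacks, so actual throughput differs.

def max_decode_tokens_per_sec(params_billion: float,
                              bytes_per_param: float,
                              bandwidth_tb_s: float) -> float:
    """Rough upper bound on decoding speed set by memory bandwidth alone."""
    weight_bytes = params_billion * 1e9 * bytes_per_param
    bandwidth_bytes = bandwidth_tb_s * 1e12
    return bandwidth_bytes / weight_bytes

# Hypothetical 70B-parameter model in 16-bit precision, fed at 1.28 TB/s:
print(max_decode_tokens_per_sec(70, 2, 1.28))  # ~9 tokens per second
# Quantizing the same model to 8 bits roughly doubles that ceiling:
print(max_decode_tokens_per_sec(70, 1, 1.28))  # ~18 tokens per second
```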

Investor interest in Samsung's return to the high-speed memory market has grown over the past month (through the end of July), Bloomberg quoted JPMorgan Chase & Co. analyst Jay Kwon as saying.

According to the agency, net inflows into Samsung stock totaled $1.2 billion in July.

SK Hynix

Samsung's main competitor in the memory chip market is another Korean company, SK Hynix. Since the beginning of the year, its shares have risen by more than 50%. But, Bloomberg writes, outflows from the stock exceeded $200 million in July.

According to Counterpoint Research, SK Hynix controls 62% of global HBM shipments, while Samsung's share of this market is only 17%. In addition, SK Hynix has surpassed Samsung for the first time this year in terms of memory business revenue. SK Hynix's operating profit in the second quarter rose 68% year on year to 9.2 trillion won ($6.6 billion), mainly due to HBM sales.

SK Hynix's strength lies in the fact that Nvidia is heavily dependent on its products. AMD, Intel, Microsoft, Amazon and Google are also among its customers.

The growth trajectories of the two South Korean giants began to diverge in the first half of 2024, helped by a surge in demand for Nvidia's AI products and related HBM, MS Hwang, research director at Counterpoint, told Bloomberg.

SK Hynix is unlikely to cede the lead to Samsung in the coming years: the company has announced plans to ramp up investment in additional capacity. It is building new plants in Yongin (South Korea) and Indiana (USA). The bulk of this year's investment is going to HBM equipment. SK Hynix predicts that the AI memory chip market will grow at 30% per year through 2030.

But the market seems to think it's time to bet on Samsung. "For a few years, foreign investors have mostly taken a 'long Hynix - short Samsung' stance, but interest is now shifting toward Samsung," Cho Junki of SK Securities told Bloomberg in mid-July.

In July, Goldman Sachs downgraded SK Hynix from "buy" to "neutral" for the first time in more than three years. The bank's analysts suggest that HBM prices may decline in 2026 amid increased competition and Nvidia's growing influence on pricing.

Before that, the company was downgraded by Mirae Asset Securities Co., even as the broker raised its target price from 244,000 to 300,000 won. Mirae said SK Hynix's HBM market share may start to decline in the second half of next year, when Samsung and the U.S.'s Micron start shipping their next-generation memory. According to the analysts, the company's shares have been rising this year because of Samsung's delays in supplying HBM3E chips; that factor, however, is now priced in, and the stock has reached fair value.

LG Electronics

Another Korean company, LG Electronics, is also preparing equipment for HBM production, the Seoul Economic Daily wrote in July, citing sources. The equipment concerns modules based on hybrid bonding, a technology that bonds memory layers directly and allows thinner, faster chips. According to the publication, LG expects to begin mass production by 2028. The company itself said it is conducting technical studies of hybrid bonding modules for HBM chips but has not yet confirmed a timeline for mass production.

That said, competition in this market is stiff. And companies that have spent years honing their skills in manufacturing AI chips have a significant advantage, Hyundai Motor Securities analyst Greg Roe told Bloomberg.

Other Korean makers of equipment for hybrid-bonding memory production include Hanmi Semiconductor, Semes (a Samsung subsidiary) and Hanwha Vision's Hanwha Semitech unit.

In addition to memory chips, LG is also developing its own AI-based products. In 2020, the corporation established LG AI Research, which develops the Exaone family of large language models. In July 2025, it unveiled Exaone 4.0, a hybrid "reasoning" model. It handles everyday tasks well and outperforms Alibaba's Qwen, Microsoft's Phi-4 and Mistral's Magistral on math, science and programming benchmarks, though it still trails the top DeepSeek models, according to a study published by LG AI Research.

The 32-billion-parameter version of the model is designed for heavy corporate workloads, while the lightweight 1.2-billion-parameter version can run directly on smartphones, laptops or IoT sensors.

And perhaps herein lies the next opportunity for a breakthrough. The more parameters a model has, the greater its potential capabilities, but the more energy-intensive it is to train and run. By comparison, leading U.S. models typically use 100-200 billion parameters, while Elon Musk's Grok 4 from xAI has 1.7 trillion. Such systems require giant data centers and cannot run locally on a phone.
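A rough weights-only calculation shows how directly parameter count translates into hardware requirements. The precisions and device classes below are illustrative assumptions, not figures published by LG or xAI.

```python
# Weights-only memory footprint; real deployments also need room for
# activations and the KV cache, so these are lower bounds.

def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

examples = [
    (1.2,  4,  "1.2B parameters, 4-bit quantized (phone-class)"),
    (32,   16, "32B parameters, 16-bit (server-class)"),
    (1700, 16, "1.7T parameters, 16-bit (data-center-class)"),
]
for params, bits, label in examples:
    print(f"{label}: ~{weight_memory_gb(params, bits):,.1f} GB")
# ~0.6 GB, ~64 GB and ~3,400 GB respectively - only the first fits
# comfortably in a smartphone's memory alongside everything else.
```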

One of the advantages of Exaone 4.0, however, is its high efficiency with a relatively small number of parameters. This means that in the future, LG's lightweight models could run not only in the cloud but also directly on devices. In theory, an LG smartphone could provide instant translation or personalized AI suggestions without calling a server, and home appliances could anticipate maintenance needs using "reasoning" AI. A refrigerator, for example, could explain why its cooling temperature is dropping or remind you that you are out of milk, and a washing machine could suggest the optimal wash cycle.
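To make "running directly on a device" concrete, here is a minimal sketch of local inference using the open-source Hugging Face transformers library. The model identifier is a placeholder, not an actual LG release, and a production on-device deployment would more likely use a quantized runtime than this plain Python path.

```python
# Minimal local-inference sketch; "some-org/compact-1b-instruct" is a
# hypothetical placeholder - substitute any small open-weights model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/compact-1b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # small enough for a laptop CPU

prompt = "Why might a refrigerator's cooling temperature be dropping?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```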

This article was AI-translated and verified by a human editor
