Several music bands have removed their albums from the streaming service Spotify over the investment policy of its founder and CEO, Daniel Ek: he has put money into the military AI technology business. The musicians explain their departure by their unwillingness to have their art associated with war. For Oninvest, Roman Mighty examined how the development of the global market for AI-powered weapons and military robots is producing new companies with billion-dollar valuations - and, at the same time, fierce ethical disputes.

"Lack of money is no longer an excuse."

In 2021, McKinsey partner Gundbert Scherf, game developer Torsten Reil, and ML engineer Niklas Köhler had a hard time hiring at their military startup Helsing: interviews with engineers kept turning into debates about the defense industry. By 2025, Helsing employs 600 people in its offices in Munich, Berlin, London and Paris, and the company raised €600 million in its latest funding round, after which investors upped its valuation to €12 billion. In both the first and the latest rounds, a key investor was Prima Materia, the fund of Spotify founder Daniel Ek. For it, as for other investors, the impetus to invest in defense companies was the Russian invasion of Ukraine.

The war has affected the entire defense market. In 2022 alone, military spending in Europe saw its largest increase in 30 years, according to a study by the Stockholm International Peace Research Institute. In 2025, the EU agreed to create a €150 billion defense fund. Much of the spending is going toward new weapons, for which the Ukrainian front has already become a testing ground.

Militaries on both sides are testing and actively using drones with artificial intelligence elements, autonomous platforms, and robotic systems. That is pushing other countries to follow. By 2034, the autonomous weapons market could grow from $44 billion to $73 billion, a compound annual growth rate of 5.86%, predicts Precedence Research. Autonomous systems have taken a third of the funding raised by European defense startups over the past six years, McKinsey calculated.

"Money is no longer an excuse, it is now," German defense minister Mark Wietfeld, founder of startup ARX Robotics, was quoted as saying in a conversation with Reuters. The EU plans to allow European manufacturers to be admitted to military tenders inthe first place. Dozens of new companies are applying for these budgets. In May, two of them - German and Portuguese drone makers Quantum Systems and Tekever - attracted investments  in excess of a billion dollars.

Three other manufacturers of autonomous military systems with valuations of more than a billion dollars are operating in the US market - Anduril Industries, Skydio and Shield AI. There will be more: in July, U.S. President Donald Trump signed a 28-page plan to promote AI innovation, including in the defense industry. "The United States must aggressively deploy AI in the military if it is to maintain global military superiority," the document reads. But these plans have notable opposition.

In 2013, a number of prominent non-governmental organizations, including Amnesty International, Human Rights Watch and Handicap International, launched the "Stop Killer Robots" campaign, calling for a ban on autonomous weapons.

Campaigners warn that robotic weapons erode human control over the use of force and dehumanize people by handing life-and-death decisions over to algorithms.

Who will be responsible for the killer drone race 

In 2025, the UN held international consultations on the legal, technical, ethical and humanitarian aspects of autonomous weapons: by next year, the organization plans to settle the legal questions needed to ban such weapons altogether. "While the proliferation of such technologies in the military and law enforcement sectors is inevitable, it is not too late to establish legally binding rules restricting the production and use of autonomous weapons systems," said Patrick Wilken, a researcher at Amnesty International.

A decision to use force against or near people means processing a large amount of constantly changing contextual information, which demands human understanding and sensitivity, the expert explains. Having machine algorithms categorize people can create unacceptable risks of mis-targeting.

Stop Killer Robots executive director Mary Wareham believes that AI mistakes on the battlefield will make it much harder to establish accountability for war crimes. "Who will be held responsible for such acts? The manufacturer? The person who created the algorithm?" she asks.

"Investors and companies have a responsibility to respect human rights," Patrick Wilken responds in absentia. This responsibility includes drawing strict ethical boundaries around what types of technologies should be researched, developed, produced and sold, excluding systems that pose unacceptable risks to human rights." Although there is no legal prohibition on the development of autonomous systems, many developers find working for such companies morally unacceptable for themselves.

Defenders of a free society  

In 2018, Google employees staged a large-scale protest against Project Maven, a program that used artificial intelligence to analyze drone video. About 4,000 employees signed an internal petition demanding the contract be canceled; it was not renewed. Employees of Microsoft and Amazon have also protested against cooperation with the military, not always with apparent success.

In 2022, a coalition of robot manufacturers - Agility Robotics, ANYbotics, Boston Dynamics, Clearpath Robotics, Open Robotics and Unitree - pledged not to weaponize their robots and to make sure their customers don't either.

Even so, Boston Dynamics allows its robots to be used by the military for logistical tasks, including ammunition delivery; Clearpath Robotics works with military customers; and Unitree robots have been spotted in Chinese People's Liberation Army exercises with machine guns mounted on them.

Proponents of automated systems say their use could reduce military and civilian casualties by putting fewer people on the battlefield and removing operator emotion and fatigue from strike decisions. Helsing co-founder Torsten Reil said in an interview with Wired that he doesn't understand the logic of Valley engineers who oppose collaboration with the military: "If we want to live in an open and free society, be who we want to be and say what we want to say, we need to be able to protect the public."

Who gets to decide whether a society is free or not remains unclear, the publication notes; Reil ignored a question on the subject. Perhaps such a decision will someday be made by a machine, too.

This article was AI-translated and verified by a human editor
