Musk vs. OpenAI trial: AI researcher Stuart Russell warns of 'winner-takes-all' dynamic in AGI development
What it really says
On day five of the Musk v. OpenAI trial, on May 4, 2026, Stuart Russell testified as the only AI expert witness for Musk's side. Russell, a professor of computer science at UC Berkeley, has researched artificial intelligence for decades. His task was to give the jury background on AI and to establish why the technology is concerning enough to warrant safety measures. Russell warned of a 'winner-takes-all' dynamic in the development of AGI (Artificial General Intelligence) and emphasized the tension between the pursuit of AGI and safety. For years he has criticized the arms-race dynamic among frontier labs competing globally to reach AGI first, and he has called for stricter government regulation.

However, Judge Yvonne Gonzalez Rogers significantly limited Russell's testimony after objections from OpenAI's attorneys, stating clearly: 'This is not a trial about AI safety risks.' Russell's broader concerns about existential threats from unconstrained AI could therefore not be presented in court.

The trial began on April 28 in Oakland. Musk seeks $130 billion in damages and the removal of Sam Altman as CEO. At the core of the dispute, Musk accuses OpenAI of breaking its original nonprofit promise by converting into a for-profit company. Russell is being paid $235,000 for his work on the trial by Excession, Musk's family office.
Our assessment
The warning about an AGI arms race is scientifically well-founded: Russell is one of the world's leading AI researchers and co-author of the standard textbook 'Artificial Intelligence: A Modern Approach,' and his concerns are shared by numerous other researchers. However, context matters: Russell is testifying as a paid witness in a trial that is primarily about money and corporate control, not AI safety, as the judge herself made clear. Moreover, Musk's own company xAI engages in the same arms race Russell criticizes and holds Pentagon contracts similar to OpenAI's. The substantive warning should nonetheless be taken seriously: the dynamic in which multiple companies race to achieve AGI while potentially sacrificing safety for speed is a real structural problem, even if it is debatable whether a court trial is the right venue to address it. Russell's core argument, that AGI development needs international regulation rather than mere corporate promises, remains relevant regardless of the trial's outcome.
Relevance for Germany
The trial is relevant for Germany because it raises fundamental governance questions that also affect the EU. With the AI Act, the EU has chosen a regulatory approach that addresses exactly the safety questions Russell raises, although the AI Act targets specific applications rather than foundational AGI research. The debate demonstrates that corporate self-commitments (like OpenAI's original nonprofit promise) are fragile when economic interests prevail, which supports the European argument for binding regulation over voluntary pledges. For German AI researchers and companies, the central question is how to pursue ambitious AI research without falling into the same race dynamic. The European strategy relies on open-source models (such as Mistral and Aleph Alpha) and cooperative approaches, representing a counter-model to the American 'winner-takes-all' approach.
Fact check
The facts are well documented across multiple independent media outlets. TechCrunch reported in detail on May 4 on Russell's testimony, his fee ($235,000 from Excession), and the judge's limitations. MIT Technology Review published two reports on the first trial week, including Musk's admission that xAI distills OpenAI's models. The judge's quote ('This is not a trial about AI safety risks') is reported by TechCrunch. The $130 billion demand is confirmed by SpazioCrypto and other sources. Russell's signing of the March 2023 moratorium letter is historically documented. The caveat that Russell is testifying as a paid witness is relevant for contextualizing his statements and is transparently reported by TechCrunch.
Source
- TechCrunch 04.05.2026 (techcrunch.com/2026/05/04/elon-musks-only-expert-witness-at-the-openai-trial-fears-an-agi-arms-race/)
- MIT Technology Review 04.05.2026 (technologyreview.com/2026/05/04/1136826/week-one-of-the-musk-v-altman-trial-what-it-was-like-in-the-room/)
- MIT Technology Review 01.05.2026 (technologyreview.com/2026/05/01/1136800/musk-v-altman-week-1-musk-says-he-was-duped-warns-ai-could-kill-us-all-and-admits-that-xai-distills-openais-models/)
- CNBC 30.04.2026 (cnbc.com/2026/04/30/openai-trial-elon-musk-sam-altman-live-updates.html)
- KQED 04.05.2026 (kqed.org/news/12081916/are-elon-musk-and-openai-fighting-an-ai-arms-race-sam-altmans-lawyers-think-so)