Nonprofit Encode Joins Elon Musk in Challenging OpenAI's For-Profit Shift

Encode Supports Musk's Injunction Against OpenAI's For-Profit Transition
The nonprofit organization Encode, a co-sponsor of California’s SB 1047 AI safety legislation, has formally requested permission to submit an amicus brief. This brief would lend support to Elon Musk’s legal action seeking an injunction against OpenAI’s planned shift to a for-profit structure.
Concerns Over OpenAI's Mission
In a proposed brief filed with the U.S. District Court for the Northern District of California on Friday, Encode’s legal counsel argued that converting OpenAI to a for-profit entity could jeopardize its core mission: the safe and beneficial development and deployment of transformative technology.
The brief emphasizes that OpenAI and its CEO, Sam Altman, are actively developing technologies with the potential to reshape society. Therefore, the public possesses a significant stake in ensuring this technology remains under the control of a public charity. Such a charity would be legally obligated to prioritize safety and public benefit over financial gains for investors.
Accusations of Prioritizing Profit Over Safety
Sneha Revanur, founder and president of Encode, has publicly stated that OpenAI is effectively “internalizing the profits of AI” while simultaneously “externalizing the consequences to all of humanity.” She asserts that judicial intervention is necessary to guarantee that AI development aligns with the broader public interest.
Expert Support for Encode's Position
Encode’s brief has received endorsements from prominent figures in the AI field, including Geoffrey Hinton, a 2024 Nobel Laureate and pioneer in artificial intelligence, and Stuart Russell, a computer science professor at UC Berkeley and director of the Center for Human-Compatible AI.
Hinton highlighted that OpenAI was initially established as a safety-focused nonprofit and benefited from various tax advantages due to its status. He cautioned that allowing the company to abandon these commitments when they become inconvenient would set a detrimental precedent for the entire AI ecosystem.
OpenAI's Evolving Structure
Launched in 2015 as a nonprofit research lab, OpenAI adapted its structure as its projects became increasingly resource-intensive. This involved securing investments from venture capitalists and companies, notably Microsoft.
Currently, OpenAI operates with a hybrid model. A for-profit arm is overseen by a nonprofit entity, with a capped profit-sharing arrangement for investors and employees. However, the company recently announced plans to transition its for-profit side into a Delaware Public Benefit Corporation (PBC).
Musk's Legal Challenge
Elon Musk, an early contributor to the original nonprofit, initiated a lawsuit in November seeking to halt this proposed change. He alleges that OpenAI is abandoning its original philanthropic goals of making AI research accessible to all. He also claims anticompetitive practices that disadvantage rivals, including his own AI startup, xAI.
OpenAI has dismissed Musk’s claims as baseless, characterizing the lawsuit as a case of sour grapes.
Meta's Support for Blocking the Conversion
Meta, Facebook’s parent company and a competitor in the AI space, is also actively supporting efforts to prevent OpenAI’s conversion. In December, Meta sent a letter to California Attorney General Rob Bonta, warning that allowing the shift could have “seismic implications for Silicon Valley.”
Concerns Regarding PBC Control
Encode’s lawyers contend that transferring control to a PBC would fundamentally alter OpenAI’s obligations. The organization would shift from being legally bound to prioritize AI safety to being legally required to “balance” public benefit against the financial interests of its stockholders.
The brief points out that OpenAI’s nonprofit currently commits to avoiding competition with “value-aligned, safety-conscious projects” developing AGI. This commitment could be diminished under a for-profit structure. Furthermore, the nonprofit board’s ability to cancel investors’ equity for safety reasons would be lost upon completion of the restructuring.
Talent Outflow and Safety Concerns
OpenAI is currently experiencing a departure of key personnel, partly attributed to concerns that the company is prioritizing commercial products over safety considerations. Miles Brundage, a former policy researcher who left OpenAI in October, expressed worries that the nonprofit could become secondary to the PBC, allowing the latter to operate without adequately addressing potential risks.
Encode’s brief further argues that OpenAI’s fiduciary duty to humanity would be eliminated under Delaware law, which does not require PBC directors to prioritize the public interest. The brief concludes that allowing the safety-focused nonprofit to relinquish control to a for-profit enterprise lacking a binding safety commitment would be detrimental.
Encode's Background and Advocacy
Founded in July 2020 by Sneha Revanur, Encode is a volunteer network dedicated to amplifying the voices of younger generations in discussions about the impacts of AI. Encode has actively contributed to various AI-related legislative efforts at both the state and federal levels, including SB 1047, the White House’s AI Bill of Rights, and President Biden’s executive order on AI.
Updated December 30, 2024, with statements from Revanur and Hinton.