Well. That was a week. On Monday, Ed Newton-Rex, the widely respected head of Stability AI's audio team, abruptly resigned over a disagreement with the company's position on the use of copyrighted works to train generative AI models. On Friday, OpenAI even more abruptly sacked its high-profile co-founder and CEO Sam Altman and set a new standard for mangling internal communication and investor relations. And somewhere along the way, Meta reportedly dissolved its Responsible AI team and reassigned its staffers to other units.
The turmoil continued through the weekend. OpenAI president Greg Brockman quit in solidarity with Altman, employees threatened mass resignations, and the company's blindsided investors demanded Altman's reinstatement while lining up to finance any new AI venture he and Brockman might launch. By Saturday evening, talks reportedly were underway between Altman and the company to bring him and his team back into the fold.
(Update: As of Monday morning, Altman and Brockman had joined Microsoft, OpenAI’s largest shareholder, to lead a new advanced AI research team.)
Apart from the Stability AI case, the conflicts were not directly related to the ongoing controversy around generative AI and copyright. But they could have significant indirect effects on how that controversy unfolds.
For one thing, the sudden moves made a lot of noise. Newton-Rex went very public with his disagreement with the Stable Diffusion developer, publishing an op-ed in Music Business Worldwide and giving a series of media interviews. The event that precipitated his resignation, he confirmed to me in an interview, was Stability AI's comments to the U.S. Copyright Office in its inquiry into AI and copyright, in which the company baldly declared its view that the use of copyrighted works to train its models is permissible under the fair use doctrine and requires neither authorization from nor remuneration to rights owners. A composer in his own right, Newton-Rex takes strong exception to that view and proudly describes how the Stability AI music generator he helped develop was trained on 800,000 fully licensed tracks.
Altman's sudden defenestration set off near-thermonuclear explosions on Wall Street and Sand Hill Road, where he has significant ties from his time heading the tech startup accelerator Y Combinator. It was probably noted on Capitol Hill and in the Biden Administration as well, where Altman had nurtured ties through his scrupulously non-confrontational testimony at congressional hearings, private demos for members, and dutiful appearances at AI-related ceremonial events at the White House.
Whether he intended it or not, Newton-Rex could now well find himself being deposed by plaintiffs' counsel in the pending copyright infringement lawsuit against his former employer, and perhaps in other similar cases.
OpenAI's ChatGPT has been at the center of many of the debates around AI and copyright. While Altman had paid lip service to working with rights owners on a system for opting out of AI training, among other measures, the company is now likely headed into an uncertain period of internal turmoil, with further resignations possible and major changes to its governance and management structure in prospect. On Monday, the company named Twitch co-founder Emmett Shear as its new CEO, ending Mira Murati's 48-hour reign as interim chief. She will now return to her post as chief technology officer, at least for now.
For its part, Meta managed to keep the dismantling of its Responsible AI team mostly quiet. But as the news trickles out, it will no doubt be noted in Europe, where the European Union is grinding away at enacting major safety-focused AI regulation.
With OpenAI's future uncertain, leadership of the AI industry has now clearly passed to Microsoft and Google, with Meta and Amazon hurrying to close the gap. In other words, generative AI is rapidly being subsumed into an explicitly commercial contest among incumbent technology giants and is likely to become just another pillar of Big Tech's digital hegemony. And the debate around AI and copyright is in danger of becoming just another battle between rights owners and technology companies, with the usual players aligned in their usual formations.
Watch List
Touch and go With just two weeks to go before the next scheduled trilogue meeting on the EU AI Act, the effort to close a deal on the final text of the first major regulatory regime for AI suddenly looks to be on shaky ground. Over the weekend, France, Germany and Italy, the three largest countries and major powers within the EU, announced an agreement on an approach that goes against the tiered regulation that seemingly had been agreed among the member states. The three propose a uniform approach, but one that would for now exempt foundation models from the strictest rules in favor of a voluntary code of conduct; instead, they would immediately regulate only downstream applications of the technology. "The inherent risks lie in the application of AI systems rather than in the technology itself," they said in a paper circulated Sunday. "When it comes to foundation models, we oppose instoring un-tested norms and suggest to build in the meantime on mandatory self-regulation through codes of conduct." An EU Parliament source called the proposal "a declaration of war." Euractiv. Reuters.
Open, and shut The world of scientific and academic journal publishing has been turned upside down over the past decade by growing demands from governments and other major research funders that scholarly articles based on the research they support be made available on an open-access basis rather than exclusively in paid journals. That has raised questions about who should pay for the selection, peer review and processing of academic papers, costs previously supported by subscription fees. The most common solution has been to shift those costs onto the researchers submitting the articles for consideration. But that, too, has been controversial: in April the entire editorial boards of two leading neuroimaging journals published by Elsevier resigned over what they said were excessive so-called article processing charges (APCs). Change may be on the way, however, via a series of innovative models being tested by governments, funding agencies, libraries and scientific institutes. Information may want to be free, but someone still has to pay to bring it to light. Nature.
Gobble gobble We’ll be taking a break for Thanksgiving later this week, so no newsletter this Friday. Happy Thanksgiving to all.