Taxing the Slop
Could taxes help tame the AI slop beast?
Most economists would tell you that if you want less of something, tax it. In classical economic theory, taxes, however justified on other grounds, act as a disincentive to produce, so in the aggregate you get less production.
Some commentators have lately proposed applying that principle to AI “slop.” In a widely read op-ed in the Guardian last month, technologist Mike Pepi, author of Against Platforms: Surviving Digital Utopia, proposed “a minuscule tax levied on the largest AI companies,” and using the money to fund cultural institutions, artists and researchers. “As it stands, AI slop is a malicious manipulation of human cognitive labor and the institutions that support it – something akin to a cognitive pollutant,” he wrote. “A ‘slop tax’ would ensure robust institutional support structures for human creativity forced to compete in a sea of meaningless content.”
The idea, and Pepi’s essay, have lately gained considerable traction on Reddit, suggesting he struck a chord, at least among the type of folks who spend a lot of time debating things online.
There’s no doubt that AI slop has become the scourge of many in the creative industries. French music streamer Deezer recently reported that roughly 75,000 AI-generated tracks are uploaded to its platform per day, some 44% of all songs uploaded, while capturing less than 3% of total streams. Amazon is awash in AI-generated e-books, and the Podcast Index recently reported that fewer than half the new shows released on various platforms within the previous 24 hours were identifiably produced by human podcasters. Nearly all the rest appeared to be AI-generated (about 10% could not be firmly classified), for which the industry has coined the term “podslop.”
The proposal to tax slop comes amid a larger debate around taxing AI more broadly as a means to offset its expected impact on the labor market, among other economic disruptions. Proponents of the idea also note that experts expect AI to increase returns on capital over time while reducing returns from labor. In the U.S., which derives 85% of its income tax revenue from labor, that shift would greatly reduce overall revenue. No less a proponent of AI than OpenAI last month released a 13-page “Industrial Policy for the Intelligence Age” paper calling for “modernizing the tax base” to account for the shift the technology is driving in the composition of economic activity, from mostly labor-based to mostly capital-based.
Critics argue that taxing AI outputs is ultimately unworkable due to a basic definitional problem: what distinguishes “good” content from “bad” content, and who should do the distinguishing? It’s the same problem that bedevils content moderation on social media platforms: what gets filtered, what gets through, and who decides?
There is another approach that might be more workable, however. As I discussed in my previous post, AI is good at nothing so much as generating negative externalities. In economics, a negative externality arises when parties that did not consent to, or profit from, a transaction nonetheless end up shouldering some of its cost. Air pollution and carbon emissions from power generation and other industrial activity are classic examples. The market for power generation does not internalize a price on those downstream harms, so the costs fall on the humans, plants and animals that have to breathe the polluted air, in the form of damage to health and to the climate.
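The mechanics of an externality can be shown with a toy calculation. All numbers here are hypothetical, chosen only to illustrate how part of the true cost of production ends up off the producer’s books and on everyone else’s:

```python
# Toy illustration of a negative externality, with made-up numbers.
# A producer's private cost per unit omits a harm borne by third
# parties; the market price only has to cover the private cost.

PRIVATE_COST = 1.00   # hypothetical cost the producer pays per unit
EXTERNAL_COST = 0.30  # hypothetical harm per unit borne by bystanders

units = 1_000
producer_pays = units * PRIVATE_COST
society_bears = units * EXTERNAL_COST
true_cost = producer_pays + society_bears

print(f"Producer's books:  ${producer_pays:,.2f}")
print(f"Shifted to others: ${society_bears:,.2f}")
print(f"True social cost:  ${true_cost:,.2f}")
# A corrective tax of $0.30/unit would move the external cost back
# onto the producer, so the price reflects the true social cost.
```

Because the producer prices off the $1.00 line rather than the $1.30 line, the market delivers more of the product than it would if the full cost were priced in.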
Rather than trying to define the specific outputs to tax in the case of AI slop—rather than trying to distinguish between “good” outputs and “sloppy” outputs—tax policy could treat the sheer volume of outputs as the externality and impose a levy on producers according to the volume of what they produce. It’s a concept developed by the British economist Arthur Cecil Pigou in his 1920 book The Economics of Welfare, in which he proposed taxing producers whose activity generates negative externalities, on the reasoning that a producer who does not bear the full social cost of production will over-produce.
The idea in the case of AI slop is that if generating 75,000 low-value music tracks cost a few dollars more, and the extra cost increased in proportion to the volume, there would be less aggregate incentive to generate them, and you would end up with less low-value production.
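The arithmetic of such a levy can be sketched in a few lines. The per-output rate, bracket size, and escalator below are entirely hypothetical, chosen only to show how a charge that escalates with volume raises the average cost per output for the highest-volume producers:

```python
# Illustrative sketch of a volume-based levy on AI outputs.
# All rates, brackets, and volumes here are hypothetical.

def volume_levy(outputs_per_day: int,
                base_rate: float = 0.0001,   # hypothetical $ per output
                bracket: int = 10_000,       # hypothetical volume bracket
                escalator: float = 2.0) -> float:
    """Total daily levy: the per-output rate doubles with each full bracket."""
    total = 0.0
    remaining = outputs_per_day
    rate = base_rate
    while remaining > 0:
        chunk = min(remaining, bracket)
        total += chunk * rate
        remaining -= chunk
        rate *= escalator
    return total

# A producer generating 75,000 tracks/day (Deezer's reported figure)
# pays far more per track than one generating 5,000/day:
small = volume_levy(5_000)    # stays within the base-rate bracket
large = volume_levy(75_000)   # climbs through the escalating brackets
print(f" 5,000/day: ${small:>8,.2f} total")
print(f"75,000/day: ${large:>8,.2f} total")
```

The design choice doing the work is the escalator: a flat per-output rate would tax a hobbyist and an industrial-scale slop farm at the same marginal rate, while an escalating schedule concentrates the disincentive where the volume, and hence the externality, is largest.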
There would still be questions about who should pay how much of the tax. Should OpenAI be taxed on all the volume produced by all the applications and integrations that used ChatGPT, or should the tax burden be more evenly shared? But those are basically political questions that could be debated and decided just as income tax policy is now, rather than questions of taste or cultural preference around “good” content and “bad” content that are inherently fraught.
Moreover, in an attention-based competitive context such as the media and entertainment industries, volume equates to market power. Netflix spends tens of billions of dollars to churn out content because it wants to capture and hold the attention of as much of the video streaming audience as possible, and it generates revenue by monetizing that attention. To whatever extent AI slop captures the audience’s attention, it diverts attention from human producers competing for the same audience, and it does so at an artificially low marginal cost, suppressing the returns on human production.
Putting a price on the production of slop could help rebalance those scales.

