I’ve spent three articles telling you why the most common arguments used by AI opponents are stupid. Most are based on a refusal to understand how anything remotely related to AI—including the software itself, computers in general, intellectual property law, and the creative process—actually works. The rest are based on magical thinking. There is one, however, that has at least something adjacent to merit: the belief that the relative ease of making things with generative AI will lead to a glut of crap. I agree that this will happen; I’m just not convinced that it will affect the average person. Let’s look at who might use generative AI and whether they’ve got any incentives or pathologies that would nudge them toward using it to create garbage.
Actual Artists

As I mentioned previously, most of us who do creative work don’t do it to make a quick buck; we do it because that’s what we do and we don’t have much of a choice in the matter. That being the case, it’s unlikely that any creators you currently know are going to start shitting out AI knockoffs of their own work any time soon. That’s not to say that creative people are going to avoid AI, though.
What will artists use AI for? Some will use it in interesting ways to make completely unexpected types of art. I’m not going to go into details here because I just read an article that says it better and in more depth than I have room for. If you’re curious how AI might change what we consider art, check out “What J Dilla and Early Hip-Hop Teach Us About AI and the Future of Creativity” on the Forte Labs blog.
Some of us will also use AI to do some of the routine crap that steals time we could be using to do creative work. For example, I hate writing marketing copy and I’m not very good at it. I’m not sure if there’s a cause and effect relationship there (or which direction it points) or if I’ve just listened to enough Bill Hicks that anything even tangentially related to marketing makes me feel dirty. In any case, I hate it and I suck at it, so lately I’ve been feeding my shitty marketing text into ChatGPT for improvements. It can’t create good blurbs from one prompt yet, but if I generate several I can usually piece them together into something better than I started with. ChatGPT is good at spitting out the kind of insincere but excited marketing copy that I can never come up with.
“Makers”
Next up we have the wannabe artists who never actually make anything. You’d think they’d avoid AI because they’ve been so vocal about how it’s going to destroy the world, but there are two problems with that. One is the fact that memory-holing deeply-held beliefs in exchange for tribal acceptance has basically become expected in the Trump era. The other is that these people desperately want to be recognized as artists, and some of them will be willing to “cheat” to do it.
But. I don’t think there’s gonna be a lot of “Creatives” using AI to make stuff, at least not stuff we’re going to have to deal with. Even if they can farm all the creativity out to AI, getting a product to market still involves doing things that are not fun, which is where a lot of these folks drop the ball without AI. The other problem is that actually releasing creative work exposes the creator to criticism. Some “makers” actually go a long way toward finishing things; they just never show them to anyone. Even if these people manage to stumble through finishing something, they won’t release it into the world out of fear of being told that they’re not a precious little genius.
Dirtbags
This is a group some people might not be familiar with. I first encountered them during my brief foray into the world of writing for content farms, but at least at the time (15-20 years ago), you could find them on every website and forum where desperate writers looked for work. The gig was always to write a particular kind of book using particular (SEO) words with a certain frequency. There were no quality expectations set, because the people doing the hiring didn’t care about quality. Their scheme was to upload a book to Amazon with the right keywords to make sure it showed up high in as many searches as possible, which presumably led to enough sales to earn some kind of profit. They didn’t have to worry about losing repeat customers since most of the books were published under different author names, often with no publisher listed.
The people who are perfectly willing to release garbage in order to make a buck have been around since long before AI, and they’re already releasing fully AI-generated content. When you don’t care whether it’s trash as long as the search engines find it, that’s pretty simple math. The good news is that since these people have been around forever, most platforms already have policies and procedures in place to flag and remove or segregate them. I hope they’ll apply a similar policy to AI. My only worry is that some platforms will follow the lead of the big players among the (always regressive) RPG industry’s sales platforms, which have basically taken an “if AI is involved in any way, we remove it or label it as garbage” stance. That’s why, outside of my Blasphemous Temple of Yargolith experiment, I only use AI content for websites—using it in an RPG relegates the game to the AI ghetto within the RPG ghetto.

Corporations
I say “corporations” here and not “companies” because AI adoption for creative work is going to be much easier for bigger companies than small ones. In addition to having more money to sink into customized models and experimentation, larger companies have the advantage that the writing and art in their products is mostly anonymous. Nobody knows who drew that picture of Mickey Mouse, so nobody knows when you farm it out to AI. Those of us who credit our artists can’t sneak AI content in as easily.
Large corporations also have the advantage of usually operating with some kind of monopoly or regulatory capture that protects them from accountability and market forces, so if you don’t like that they’re using AI, too bad. They know you’re not going to stop buying their shit, so they’re going to make it in the cheapest way possible. That’s not to say that Wal-Mart is necessarily going to start an AI book division. They’re going to use it to generate ad copy and social media posts and other marketing crap, and it’s going to be a tiny percentage of how they use AI.
What We Should Really Be Talking About
That brings us to the (only) real threat AI poses: it’s going to take jobs away from normal people who don’t do anything remotely creative. That’s where AI is really going to change things, and it’s going to happen very soon. Several companies have already fired their customer support departments in the belief that AI could do the job better and cheaper. Most have since realized that they need at least some humans and rehired part of their workforce, but AI’s going to keep improving until that’s no longer the case. The AI-related discussion we should be having is about what we’re going to do with all the unemployment AI will cause among completely non-creative people. Unfortunately, between our culture’s reverence for the Protestant Work Ethic (at least for those without generational wealth) and the fact that the whole idea of the commons is buried somewhere under generations of neo-liberal propaganda, I don’t have high hopes.

