Let’s Talk About AI


Generative AI has become a go-to scare topic for slow news days in recent years, and it’s a frequent topic of internet yelling, usually by people with a very limited grasp of how the technology actually works. I’m not going to claim that I’m an expert on the topic, but since I’ve actually used AI, I do claim to be more informed than a lot of the people screaming about it.

Blasphemous Temple of Yargolith banner
Check out The Blasphemous Temple of Yargolith, which was created with AI and then turned into something publishable by humankind.

Before we get into that though, let’s acknowledge the fact that AI is not going away. Silicon Valley has put too much money into it to let it die on the vine. They’ve also convinced a good chunk of corporate America that it’s the secret to getting rid of all those pesky employees who do nothing but sponge off the company. The people who actually run things have made a decision, so no matter what you think about AI, you’re going to have to get used to it.

Let’s start with the common AI complaint that’s easiest to dismiss: the claim that it will cost people jobs. While this is certainly true, as a person who comes from the first generation of his family to make it to the office job side of the working class, I have to sincerely say, “Fuck you.” The same people who whine and moan about AI taking jobs have been silent while the working class has suffered nearly a century of union-busting, downsizing, and outsourcing, so forgive me if I’m not sympathetic. The jobs are going away. Learn to live with it just like us poors did.

Brother Damo, Priest of Theon

Commercial artists have a right to be worried. Since their work is by nature formulaic and impersonal, it’s going to be the easiest for AI to replace. Since most of them have enough experience with capitalism to know that their jobs are only safe if there’s no cheaper option, many of them pass up “They took our jerbs!” in favor of “They took our interllectual propertry!”

At this point they generally reveal that they’ve studied IP law just as thoroughly as they’ve studied AI. The most moronic among them will find an AI render of, say, a woman in a red jacket who vaguely looks like Trinity from The Matrix. Then they’ll put it next to a drawing they’ve done of a woman in a red jacket who looks vaguely like Trinity from The Matrix and scream “Plagiarism.” If AI programs were the copypasta machines that people claim they are, you’d think by now one of them would have produced something that meets the legal definition of plagiarism.

It’s a testament to the power of human stupidity that they don’t see the obvious flaw in their argument: if the definition of plagiarism were as broad as they claim, practically everything would be plagiarism. Even the dude who drew the stunningly original image of Trinity in a red jacket would have to pay royalties to whoever owns the copyright on pictures of a woman with short dark hair and sunglasses wearing a red jacket (probably Disney).

King Yak, Kobold Chieftain

A few try to claim that it’s not about AI “copying” particular images, it’s about AI copying an artist’s “style.” Again, that’s not how any of this works. Style is not something you can copyright. You might be able to trademark it as trade dress if it’s consistent and distinctive enough, but most illustration work isn’t going to qualify.

Those who are smart enough to acknowledge that “passing similarity” does not violate copyright often try to claim that the crime happens when content is used to train AIs. Early on, a lot of people thought that AI was just collaging together pieces of the work in its training set. These people have clearly never tried to make a collage that didn’t look like a collage, but they have a lot to say about AI images that include “signatures” in languages not known to humankind. None of it reveals much intellectual capacity on the part of the speaker.

The “collage hypothesis” inspired many to argue that AI violates IP rights by downloading images. I mean, we already know these people haven’t bothered to learn how things that are actually relevant to their jobs work. Did you really think they’d understand web browsers? If this claim were true, we’d have to go back to the days of text-only internet.

The ones who are smart enough to clear the porn from their caches have another argument, and this is one that some legal experts seem to think might work. It basically claims that the act of including the art in training data is what violates IP. They usually cite a case that Google lost over its book archive.

Katie, 8-year-old

I have my doubts though. Google was posting sections of books that were way larger than is generally considered fair use, and it was making them available to the public. AI doesn’t even keep the images it scans; it just collects data from them. The thing that goes into the training database is just a bunch of code with no resemblance to the original content. If scanning a file and collecting data from it is an IP violation, say goodbye to your virus scanner, your spam filter, and several other tools you’re probably going to miss.

Even if the training argument holds up in court, it might not be relevant for long. DeepMind recently announced that it’s experimenting with developing AI that learns through experience without a training set. For any applications that still require training sets, Wikimedia and Google are working on building public domain datasets for AI researchers to use.

The last argument that you’ll hear is that AI is going to destroy human creativity itself. That’s kind of a big one, so I’m going to save it for a follow-up article so this doesn’t get long enough to qualify as a “think piece.” I’m not a member of the Chattering Imbeciles Union and I don’t do scab work.
Kreos Banner
Banner created from some of the above images.