the brave, new, exploitative world of generative a.i.
“Your scientists were so preoccupied with whether they could, they didn’t stop to think if they should.” In this day and age, these timeless words of Jurassic Park’s Dr. Ian Malcolm, played by iconic fan favorite Jeff Goldblum, may as well summarize just about every problem we face with technology and the environment. Decades of acting before thinking, certain that any advancement or new idea was bound to turn out well, have created some very nasty fallout we don’t know how to contain or fix, so much so that some are literally giving up on trying.
The newest technology under fire? Creative artificial intelligence meant to help generate both prose and art, as well as help programmers with boilerplate code. While the systems may be really impressive in some cases — although my personal experience using GitHub’s Copilot for complex logic and database scripts was underwhelming — and produce interesting results, the problem lies in how they’re trained. You see, what they ultimately produce isn’t new art, code, or paragraphs, but a remix of the material they were fed by their designers.
Anything that exists outside of the training sets may as well be in a different universe for the neural networks being trained, and even some things within those training sets are way too complex or abstract for their internal architectural models to break down and are rejected as utterly incomprehensible. With no other frame of reference or data to draw on, even the most advanced AI you can build today will just mix and match until what it composes resembles a piece of art, or writing, or code fed into it by its creators.
how we got virtual cryptids and mechanical copycats
If you saw the story of Loab, the creepy woman seemingly haunting generative AI networks like some sort of virtual cryptid, and wondered why neural networks would create the same face if you told them you wanted an inversion of the human form, you’ve encountered the problem at hand. The AIs in question fixated on the same set of images in their training because of a rogue parameter, images scraped by the tens of millions from public art libraries without permission or attribution, and now keep remixing the same face since that’s all they know.
In a far less haunting version of the same error, you have generative AI creating fantasy art that largely emulates the style of Polish illustrator Greg Rutkowski and very few others. Even worse, these networks now also allow for live, streaming art theft by completing images artists are in the middle of creating. All of this has artists fuming about their work being used to feed AIs that spit out rapid fire imitations and permutations of their styles and favorite subjects, making it far more difficult to stand out in the already crowded art spaces of social media.
Likewise, coders who have spent enough time with Copilot are finding that the coding AI is recycling algorithms they’ve written, scraped from open source software after its creators gave GitHub blanket permission to gobble up anything in a public repository via an update to the platform’s terms of service. While it would be bad enough if that’s all it did, there are reports of Copilot’s tentacles potentially sneaking into private code on the side, so its output may be regurgitating proprietary code of companies paying GitHub to host their mission-critical assets.
the thankless task of feeding the algorithm
In other words, the awesome capabilities of creative artificial intelligence are built on billions of hours’ worth of humans’ creative output, lifted without compensation, given virtually no attribution, and remixed ad nauseam until the original works are lost in the terabytes of output trained to mimic them. It’s not so much that this AI will replace human artists, coders, or writers; it’s that the technology is exploitative and charges customers for the sum total of countless hours of work, study, and creativity effectively stolen from millions of experts.
At this point, almost every creative using the modern web to pursue passion projects or stand out from the herd in a hypercompetitive field will throw their hands up in exasperation. Not only do they have to produce endless reams of consistent content tailored to whatever the almighty social media algorithm wants that day, that content will then be swept up as fuel for a machine built by people who look at the web and say “ooh, free stuff, don’t mind if we do!” before charging for the end result.
We’ve ended up with an absurd opposite of the scenario once envisioned by techno-utopians. Instead of letting the machines handle the grunt work while we reap the financial rewards of their swift industrial output, we’re doing creative grunt work for free, if not at a net loss, then paying for remixes of it to include in yet more free content. And as a cherry on top, this article may well become part of a training set for a generative AI for writers, completing this ouroboros of futility.