NYMag, Sam Altman, and the Misconception of Creative Essence
I finally got around to reading NYMag's article on OpenAI CEO Sam Altman and couldn't help but find it puzzling. Tucked about 4/5 of the way in, the writer subtly interjects:
He has dedicated himself to building software to replicate -- and surpass -- human intelligence through stolen data and daisy chains of GPUs.
Stolen data? Really?
As someone who transitioned from being a traditional 2D animator at Disney to a 3D CG animator, I see a pattern of misconceptions that once again need correcting.
What we're talking about is laying claim to the creative output of millions, billions of people and then using that to create systems that are directly undermining their livelihoods.
Let's set the record straight. GPT, or Generative Pre-trained Transformer, models are trained to predict the next word in a sequence. They're not magically absorbing the intent or the essence of human creativity; they're making educated guesses based on statistical probabilities. It's like a high-speed version of playing "connect-the-dots" with words.
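To make that concrete, here's a deliberately tiny sketch of the idea. This is a toy bigram model, nothing like an actual transformer, but it shows the core mechanic: count which words tend to follow which, then guess the statistically most likely continuation.

```python
from collections import Counter, defaultdict

# Toy corpus: count which word follows which.
corpus = "the cat sat on the mat and the cat slept".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" only once -> "cat"
```

No intent, no meaning, no essence: just frequencies. A real GPT swaps word-pair counts for a neural network over billions of parameters, but the objective is the same educated guess at the next token.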
When I worked at Disney, I used to pin up sketches and animation drawings from the early films. I would use them as inspiration, and sometimes even--GASP--trace over them directly in order to capture a particular expression or hand-pose. 😱
Do I somehow owe those artists royalties? No. Because the end result was made manifest by my actions, and my particular take on the final product.
Generative AI is inspiration at scale.
To truly understand where GPT falls short, we need to delve into story theory, something I've dedicated years to analyzing. A well-crafted story isn't just a sequence of events (text); it's a complex interplay of underlying motivations, themes, and emotional journeys (subtext).
That's one of the reasons why my app is named Subtxt.
GPT can mimic the text but fails miserably at capturing the subtext. Its stories read like empty shells because they are not constructed upon the foundational elements that make stories resonate with us. There's no meaning, no subtext—just superficial mimicry.
Even Sam Altman stumbles into this pitfall:
Everything 'creative' is a remix of things that happened in the past, plus epsilon and times the quality of the feedback loop and the number of iterations.
This cynical "Everything is a Remix" sentiment simplifies the complex process of inspiration and creation into a mathematical formula. Creativity isn't about replicating or remixing the past; it's about interpreting and communicating a unique vision or message.
We are all part past, present, and future.
But we are also equal parts progress.
Many have scrutinized Altman's words without understanding his intent. What he's really trying to get across is not a devaluation of creativity but a description of the algorithmic process behind GPT.
A mathematician interviewed for the article asked:
Using C for things creative, R for remix of past things, Q for quality of the feedback loop, N for number of iterations, is he saying C = R + epsilonQN or C = (R + epsilon)QN?
This misses the point. Altman wasn't framing creativity as a cold, hard equation but as a series of intricate, often ineffable, relationships.
And the relationships between things is just as important as the cause-and-effect of one thing leading to the next.
You know. Like a story.
The NYMag piece aims to paint Altman as a threat. But the real fear comes from our insecurities about our own abilities and the future of creative professions in the face of advancing technology. Are we afraid that AI will reveal us as frauds?
Here's the deal: When a super-intelligence eventually arrives, I'll still be here, writing and creating stories. Why? Because I'm not scared of not being the best. I'm already not the best, and there are far superior intelligences (of the organic type) already writing better than me.
I'm okay with just being myself, doing the work, and expressing my unique and deeply personal point-of-view.
AI like GPT is a tool, much like a pencil or a printing press. It doesn't replace human creativity; it augments it. What we should be focusing on is not the fear of AI but the power of human intent, the driving force behind true creativity.
Intent cannot be captured by a data farm. Stephen King gets this, and I can't help but wonder why so many other writers fail to share his seasoned and experienced point-of-view.
AI is just a projection of our own humanity. We get out what we put into it--the same way it has worked since the first storyteller sat around a burning fire with a captive audience.
So let's stop getting bogged down by the words and start appreciating the intent behind them, because that's where the real story lies.
Don't miss out on the latest in narrative theory and storytelling with artificial intelligence. Subscribe to the Narrative First newsletter below and receive a link to download the 20-page e-book, Never Trust a Hero.