Bad news: I’ve been thinking about AI again.
Anywhere we go online, we find ourselves squinting. Is that AI generated? We freeze, eyes narrowing at strange images, supposed gameplay clips, even songs. Halfway through reading a politician’s statement, we spot the infamous “it’s not X — it’s Y” construction, before swiping away in disgust.
All this squinting is starting to feel like a cognitive burden—like some ever-increasing percentage of our brains’ RAM is now constantly being allocated to the task of identifying human or robot provenance. Did that em-dash in the previous sentence trip you up? It slowed me down. I spent a few seconds debating whether or not to replace it because of the association that particular punctuation now has with ChatGPT-assisted writing. That’s despite the fact I’ve been using the trusty “—” for decades.1
I find myself squinting for hints of AI reliance constantly, even in my own writing, which is certified AI Free™. I’m not terribly offended when others get assistance from AI to write, because writing is hard and most people find it terribly painful. But I—maybe arrogantly—believe my own writing will always be “better” than AI-generated writing, if only because the words I type are an authentic expression of my irreplicable, God-given human spirit.2
At this point I don’t trust anyone who doesn’t feel at least some cognitive dissonance when thinking about AI. Even if you lean optimistic (as I do), you’ve got to admit things are getting real weird, real fast. And the more bothered you are by AI’s omnipresence in media, the worse the cognitive burden of AI squinting becomes.
A quick example:
Last week’s newsletter about the overwhelming deluge of new game releases featured an interview with Wanderbot, a YouTuber with 500k subscribers who says he typically gets over 30 emails per day from indie game devs.
I’ve been thinking a lot about something Wanderbot told me about the effect his own AI squinting is having on his motivation to do his job.
I’ll just quote directly from what he wrote to me:
The main thing I'm worried about is generative AI. Frankly, I hate everything about it, and I'll fully admit it's making me lose interest in a lot of media, games included. Seeing the sheer amount of game developers (indie and AAA alike) throwing artists, voice actors, writers, musicians, etc... under the bus to save a buck is infuriating, especially when those models were built upon those people's work. I've received emails for and rejected countless games over the past three years for using generative AI, and that number is rising alarmingly.
I'd be less bothered if developers properly labeled their usage of AI on Steam, but for every developer that does it feels like two more actively refuse to do so and it's making my efforts to avoid covering their games far more time consuming. It's not very fun having to scrutinize every game to this degree, and rather than wearing down my resistance to seeing AI in games, the increase of its usage is just pushing me away from games in general.
—Wanderbot, in an email to Push to Talk
This is a person feeling discouraged. He’s becoming less motivated to do the thing he’s dedicated the last decade of his life to.
I don’t really know what the solution is here. But I’ll make two predictions:
1. Many people (maybe even a majority?) are going to feel basically unbothered spending time on an internet where a huge percentage of content is synthetic. Arguably, we’ve already been living in that world for a long time.
2. But another significant slice of the audience is ultimately going to decide to withdraw to venues less exposed to synthetic content. This is already happening as people shift their attention from platforms with For You feeds to private group chats and Discord servers. But new venues are going to emerge.
This latter group will obviously include the self-identified anti-AI crowd, but I think the demand for these new spaces will be bigger than that. There’s only so much cognitive burden anyone can take. Soon, it’ll seem like a basic cognitive security practice to limit time spent on the open web.
What you really need is another farm-fresh, pesticide-free newsletter. So…
Here’s me in 2014 going full ChatGPT (eight years before it launched) while describing “Twitch Plays Pokémon” in a piece for WIRED:
It started just over a week ago when an anonymous programmer used Javascript and Python code to connect Twitch's chat functionality to a copy of the 1996 Game Boy game Pokémon Red, the first in Nintendo's mega-popular series. Anyone watching the stream could type commands—up, left, down, A, B—and the game reacts as if you'd hit the corresponding button on a Game Boy.
Simple, right? Now imagine 75,000 people trying to elbow you out of the way and press the buttons on your Game Boy. It's not actually you playing the game—it's a vast hivemind of which you are but one small part.
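(For the technically curious: here's a rough sketch of what that chat-to-button plumbing could look like in Python. This isn't the original programmer's code; Twitch chat runs over IRC, so a plain socket connection is enough to read messages, and the bot name, token, and press_button function below are hypothetical stand-ins for whatever actually fed input to the emulator.)

```python
# Illustrative sketch only, not the original Twitch Plays Pokémon code.
# Twitch chat is IRC under the hood, so a raw socket is enough to read it.
# NICK, TOKEN, and press_button() are hypothetical placeholders.
import socket

SERVER = "irc.chat.twitch.tv"
PORT = 6667
NICK = "examplebot"            # hypothetical bot account
TOKEN = "oauth:xxxxxxxxxxxx"   # hypothetical OAuth token
CHANNEL = "#twitchplayspokemon"

VALID_COMMANDS = {"up", "down", "left", "right", "a", "b", "start", "select"}

def press_button(button: str) -> None:
    # Hypothetical stand-in: the real setup would forward this to the
    # Game Boy emulator (e.g. as a simulated keypress).
    print(f"pressing {button}")

sock = socket.socket()
sock.connect((SERVER, PORT))
sock.send(f"PASS {TOKEN}\r\n".encode())
sock.send(f"NICK {NICK}\r\n".encode())
sock.send(f"JOIN {CHANNEL}\r\n".encode())

while True:
    data = sock.recv(2048).decode("utf-8", errors="ignore")
    for line in data.split("\r\n"):
        if line.startswith("PING"):
            # The IRC server expects a PONG to keep the connection alive.
            sock.send("PONG :tmi.twitch.tv\r\n".encode())
        elif "PRIVMSG" in line:
            # Chat lines look like ":nick!user@host PRIVMSG #channel :message"
            message = line.split("PRIVMSG", 1)[1].split(":", 1)[-1].strip().lower()
            if message in VALID_COMMANDS:
                press_button(message)
```

Every viewer typing "up" or "a" into chat lands in that loop, which is how 75,000 people end up fighting over one Game Boy.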
The single greatest thing ever written about the conflict between humanity and technology was a banquet speech delivered by a Mississippi man in Stockholm, Sweden on December 10th, 1950.
Speaks to my own feelings as well. Especially the em-dash bit... I was getting feedback from my professors telling me to chill with the em-dashes years before OpenAI was founded, dammit! But it's such a useful tell that I find myself noticing it in like, generated Amazon reviews and such too. Hoping they overcorrect and remove it altogether so that people don't view me suspiciously. I've already caught myself removing them in reddit comments--something I never did, even when my writing instructors were telling me to cut down on em, lol.
Anyways, I can feel myself getting exhausted and the discouragement that Wanderbot describes is hitting me too. I'm not quite sure I'm withdrawing in the way you're describing, but I think it's probably going to happen soon. Especially because the reality is that I don't actually need to retreat from "content" writ large... just new content.
But I'm also realizing generative AI is not the only factor. For years I thought of shorts content as a simple evolution of the same upsides and downsides that online content has always dealt with. But in recent months I've actually spent time watching shorts, and the thing that really strikes me is how unimportant "truth" is. And I don't mean agenda-driven misinfo or whatever, which of course has always existed. Even if you're doing something as simple as clipping a TV show, it seems like shorts rewards you most for chopping it up in a way that misrepresents the original material. Even people who are recording original work say openly that it's better to present it in ways that are misleading (e.g. to spark controversy or to frame it like they're on either end of some injustice). I was talking to a guy who edits for a big YouTube channel, and he basically said that was the breakthrough for him: you literally just have to lie, in a way that has nothing to do with the normal YouTube editing process.
It's always been present (e.g. egregious clickbait titles), but shorts seems to really be taking it to another level, both in terms of how constant the untruths are and because they're reaching way more people. For years people have predicted that "knowing what's true" would become exhausting... but I guess I always assumed that it'd only matter when it came to, like, important political topics. But nah, turns out it really is just everything.
Damn I still love a good em dash.
I think your predictions are pretty bang on. I’m seeing it already.
And agreed, one of the saddest things I saw in the last thing I wrote on AI: people in creative industries being forced to use it, and to accept obvious mediocrity, in the name of productivity and cost efficiency.
It’s so grim.