6 Comments
Omar

Speaks to my own feelings as well. Especially the em-dash bit... I was getting feedback from my professors telling me to chill with the em-dashes years before OpenAI was founded, dammit! But it's such a useful tell that I find myself noticing it in like, generated Amazon reviews and such too. Hoping they overcorrect and remove it altogether so that people don't view me suspiciously. I've already caught myself removing them in reddit comments--something I never did, even when my writing instructors were telling me to cut down on em, lol.

Anyways, I can feel myself getting exhausted and the discouragement that Wanderbot describes is hitting me too. I'm not quite sure I'm withdrawing in the way you're describing, but I think it's probably going to happen soon. Especially because the reality is that I don't actually need to retreat from "content" writ large... just new content.

But I'm also realizing generative AI is not the only factor. For years I thought of shorts content as a simple evolution of the same upsides and downsides that online content has always dealt with. But in recent months I've actually spent time watching shorts, and the thing that really strikes me is how unimportant "truth" is. And I don't mean agenda-driven misinfo or whatever, which of course has always existed. Even if you're doing something as simple as clipping a TV show, it seems like shorts rewards you most for chopping it up in a way that misrepresents the original material. Even people who are recording original work say openly that it's better to present it in ways that are misleading (e.g. to spark controversy, or to frame it like they're on either end of some injustice). I was talking to a guy who edits for a big YouTube channel, and he basically said that was the breakthrough for him: you literally just have to lie, in a way that has nothing to do with the normal YouTube editing process.

It's always been present (e.g. egregious clickbait titles), but shorts seems to be taking it to another level, both in terms of how constant the untruths are and how many more people they're reaching. For years people have predicted that "knowing what's true" would become exhausting... but I guess I always assumed that it'd only matter when it came to, like, important political topics. But nah, turns out it really is just everything.

Ryan K. Rigney

Could not agree more, and well said. Such a massive percentage of the internet was already misrepresentative, dishonest, or fake in other ways.

We've probably had much too high of a tolerance for it for too long.

Harrison Polites

Damn I still love a good em dash.

I think your predictions are pretty bang on. I’m seeing it already.

And agreed. One of the saddest things I saw while writing my last piece on AI: people in creative industries being forced to use it, and to accept obvious mediocrity in the name of productivity and cost efficiency.

It’s so grim.

InGameScientist

So instead of squinting at the sheer amount of information like we did in the Information Age, now it's detecting AI slop? I find myself doing double-takes sometimes, so it tracks! (And I have to stop using em-dashes too, even though they're very useful :(

Ryan K. Rigney

I feel like it's >95% likely the next version of ChatGPT avoids em-dashes at all costs just because of its current status as a tell-tale sign of AI use. IDC man I'm gonna just use it either way lol

Gareth

Thanks for writing this, Ryan (not GPT, or Claude, or Grammarly)

I think your prediction that humans will remove themselves to "safer," more authentic spaces is a very likely scenario.

LinkedIn already strikes me as a particularly insidious example of the AI squinting you're referring to. I worry that all the experts on there are simply sharing their AI-written opinions, along with similarly AI-written responses.

You would think that on a platform where everyone is trying to monetize their opinion and build clout, they'd be more willing to share THEIR opinion.
