I know it’s cheeky of me to ask for a like before you had the chance to read or listen to this post, but… you’re dangerously close to that 💖 button right now. Do it! The algorithm really likes it when you hit that like button…
Ever finished a book or binged a series so immersive that re-entering the real world feels like you've wandered into someone else's story?
I just wrapped a five-book series and I don't know what's real anymore.
I catch myself scanning the streets for these fictional characters I spent hours reading about. I feel lost, stuck between chapters of reality and fiction, grieving for a world that never existed. It's a very odd feeling, and I'm used to feeling odd.
Apparently, this phenomenon is called having a 'book hangover.'
It's amusing—and slightly disconcerting—how ink on paper can blur the lines between what's real and what's not, especially considering that I knew it was fiction...
But what if I hadn't known?
What if I had gone into this series thinking it was a memoir? Would it have mattered? Would my "hangover" have been any worse? Better?
What happens when the line between what's real and what's artificial isn't just smudged by novels and the movies they're based on, but by the very content we consume daily?
I'm obviously hinting, not too subtly, at social media. The ultimate content machine.
Social media was a new technology that promised to connect us, to give us a voice, and for a while, it felt like it did. We reconnected with old friends, shared photos of our lunches (whether anyone asked for them or not), and signed petitions online.
But then the cracks started to show.
We began to see how social media could erode privacy, turning our personal lives into data points for targeted ads. We began to see mental health issues surging amongst our young. Anxiety, depression, and feelings of isolation grew, ironically, in a medium designed to connect us and give us a voice.
We were so enthralled by the novelty and convenience that we overlooked the long-term implications. By the time we recognized the downsides, social media had already woven itself firmly into the fabric of modern society, and unraveling it has proved complicated, to say the least.
Supposedly we now know better. We have studies we ignore and documentaries we don't watch because the warnings have become background noise. We tell ourselves we've learned not to trust ads and influencers with their meticulously curated lives. And even though we double-tap their photos and sometimes take their advice, we know—at least intellectually—that it's all a façade.
But every so often, reality cracks the veneer. A beloved influencer caught faking a perfect lifestyle, a celebrity embroiled in scandal, a public figure exposed for dishonesty.
Dave Grohl and his secret baby, P. Diddy and his sex trafficking scandal, Lance Armstrong and his performance-enhancing drugs. But these examples are neither here nor there. Who cares about celebrities and other very public figures? They are too many degrees removed to impact most people's lives in any meaningful way.
But what happens when these figures aren't so removed?
What about your favorite podcast host who feels like a close friend during your morning commute?
Your go-to writer whose articles resonate deeply with your own experiences?
The musician whose lyrics seem to narrate your life?
The Instagram followers you've never met in person but occasionally interact with in the comments section?
What about that Tinder match you've been texting—the one who seems to tick every box?
What about that stranger on Reddit you keep having banter with?
What about your child's Roblox friend?
What about your BetterHelp therapist?
What would happen if they did something so egregious it shook you to your core?
Or worse, that they were never real at all?
How would you feel? Confused? Betrayed?
What would it do to your sense of reality when connections you thought were genuine turned out to be fake?
I read more than five (but fewer than twenty) articles on Google's new NotebookLM Audio Overview feature. It's good, scary good.
If you've been living under a rock, or worse, out there in the real world: NotebookLM can now create a podcast based on your writing. The Audio Overview feature generates a surprisingly lively discussion between two AI hosts based on your text and documents. It sounds very blah on paper, until you actually hear it.
It almost passes the "Can you tell it's AI" test.
So much so that, had I lacked the context, I probably wouldn't have been able to tell I wasn't actually listening to real humans talking. The dialogue is engaging and full of the nuances you'd expect from natural speech, like laughter and slight mispronunciations.
Scary.
It's not just a cool gimmick (like most of the products built on AI have been); it's actually, dare I say... useful.
Apparently the motivation behind its creation was to "allow users to absorb information in a conversational manner, especially auditory learners." That's a whole lot of words to describe content creation - because everything is content now.
"I see your goosebumps and I raise my own..." I wrote on a post on the topic. "Not sure if it's from a state of awe, or fear. Probably the latter."

"I mostly find this replacement of humanity to be ever more terrible," someone replied to my comment.
"I no longer had confidence that I could tell what was real and what wasn't," a reporter for TechRadar wrote.
Welcome to the looming reality of the "AI Hangover."
Just as a book hangover leaves you disoriented after immersing yourself in a fictional world, an AI hangover is the disillusionment that follows the realization that your digital interactions may not be with real people.
It's the emotional whiplash from investing time, trust, and even affection into relationships that, upon revelation, feel hollow and deceitful.
This isn't some dystopian future... it's happening right now.
It starts slowly.

Under the guise of efficiency, "real" influencers use LLMs to punch up their words and arguments. They churn out more "content" at a pace no human could sustain alone, just to stay afloat in a saturated market filled with other influencers doing exactly the same.
And then it gets more aggressive.
They clone their faces, voices... minds. For efficiency, of course. An AI avatar that doesn't need lunch breaks or dental plans. Something that smiles, nods, and delivers hot takes without a hair out of place, or a pulse. And if you think I'm exaggerating, please remember: we already have all the tools to do it; they just haven't passed the "can you tell it's AI" test yet.
But what will happen when they do pass the test?
Here we are again, 20ish years post social media, standing on the precipice of another technological "revolution," squinting into the bright lights of AI, wondering if this time we'll remember to wear sunglasses.
An average American adult spends around 8 hours a day actively engaging with some form of screen. Gen Z averages around 9 hours a day. And newborns are presented with a screen as early as 4 months old.
Technology is more common than... flowers.
Not enough (but also too many?) studies have been done on screen time, internet time, and social media time. And no matter how you roll the dice, the same advice stands: less is more. Yet contrary to every piece of advice ever given, the trend is moving in the opposite direction. Every year, we use more technology, for longer, starting younger.
The internet has made it dangerously easy to form parasocial relationships. I mean, how can you not feel close to a podcast host when you're listening to them for 2, 3, even 4 hours a week? That’s probably more than you talk to your mom.
These artificial connections can spark real emotions.
You wouldn't be able to tell just based on the speed with which we're charging ahead, but AI is still a very young technology. And yet, we’ve already handed it some heavy responsibilities—ones that we don’t even fully understand.
And sure, the rusty gears of academia are ever so slowly turning out papers to discuss its societal impacts, but we’re in such a rush to see what AI can do for us, that we’re not stopping to think about what it could do to us.
The "AI hangover" is the experience of waking up to a world where the lines between the real and the artificial have been blurred beyond recognition.
What happens when the curtain is pulled back, and the wizard is revealed to be an intricate web of code?
We might start to question everything:
Can we trust our own perceptions? If an AI can mimic empathy, humor, and authenticity, how do we discern the real from the fabricated?
What are the ethical boundaries? Should there be regulations requiring disclosure when we're interacting with AI? Much like we have on cigarette packs?
Who holds responsibility when artificial interactions cause real emotional harm?
How do we protect ourselves and our loved ones? In a world where AI can imitate human connection, digital literacy becomes as essential as knowing how to read.
But perhaps the more profound question is:
What does this mean for the future of human connection?
Will we become so accustomed to artificial interactions that we start to prefer them over the messy, unpredictable nature of real relationships? Or will the inevitable AI Hangover push us to seek out and value genuine human connections even more?
While technology can imitate the form of human connection, it cannot replicate the depth and nuance of a relationship built on mutual understanding and shared experiences.
In the end, it's up to us to ensure that in our pursuit of innovation, we don't lose sight of the irreplaceable value of real human connections.
What about you? What was the last book you read that gave you a hangover?