Fork in the Road: Real People or Synthetic Comfort
As AI fills the internet with synthetic comfort, the question is whether we choose convenience or real human connection.
We are at a fork in the road for the internet.
One path is smooth, fast, and endlessly accommodating. It is optimized for engagement, generated on demand, and padded with synthetic voices that sound reassuring enough to keep us scrolling. The other path is slower, messier, and far more human. It requires participation, vulnerability, and something we’ve slowly stripped out of digital life: skin in the game.
Over the past several months building intori, I’ve become increasingly convinced that this fork is not theoretical. It’s already here.
The Rise of Faux Communities
We talk about “community” more than ever, yet most online spaces today feel hollow. They are populated, active, and algorithmically well-tuned, but they lack consequence. You can join, leave, perform, disappear, and reappear without anything really being at stake.
These are faux communities. They look alive but don’t ask anything of us. No commitment. No accountability. No cost to being wrong, dishonest, or extractive.
The problem isn’t that algorithms help us find people. The problem is that made-to-order social graphs, with perfectly tailored feeds and endlessly reshuffled connections, don’t reflect real life. Real relationships are shaped by friction, time, shared interests, and mutual investment.
When everything is reversible and nothing is earned, connection becomes content.
AI Slop and the Value of Authenticity
Recently, Adam Mosseri, the head of Instagram, shared a 20-image deep dive about the coming flood of AI-generated content and what many are now calling AI slop. His point was not that AI content is inherently bad, but that authenticity is becoming the scarce asset.
When everything can be generated, what matters is what can’t be faked.
This resonates deeply with what I’ve been feeling while building intori. We are entering an era where synthetic output will vastly outnumber human signal. The danger isn’t misinformation alone; it’s meaning dilution. When real human expression is drowned out by automated noise, we lose the ability to tell what actually matters.
No Skin in the Game, No Signal
A core issue with today’s digital social systems is the absence of skin in the game. Likes are free. Follows are free. Opinions are free. Even identities are increasingly fluid or disposable.
But in the real world, relationships have cost:
- Time
- Reputation
- Emotional risk
- Opportunity cost
When those costs disappear, so does trust.
This idea has been echoed for decades by thinkers like Jaron Lanier, who has repeatedly warned that systems extracting value from humans without compensating them degrade both individuals and society. He’s argued that we need an honest human digital economy – one where real people are recognized, rewarded, and meaningfully represented, rather than flattened into data exhaust.
“If we cannot find a way to make digital dignity economically viable, we will continue to hollow out what it means to be human online.”
— Jaron Lanier (paraphrased)
The goal isn’t nostalgia for a pre-internet past. It’s progress toward systems that respect human input as something scarce and valuable.
The Upside Down Moment
Watching the internet evolve right now feels eerily similar to watching Stranger Things.
In the show, the “Upside Down” isn’t a different world; it’s a distorted mirror of the real one. Familiar shapes, inverted. Life drained out. Something hostile quietly spreading underneath everything we recognize.
AI-generated personas. Synthetic influencers. Auto-reply friendships. Algorithmic communities spun up and torn down without consequence.
This is our Upside Down.
The fracture isn’t just technological; it’s social. A split between real human connection and synthetic comfort. One path leads to a world padded with generated empathy and frictionless belonging, even as the underlying social fabric weakens. The other path is harder, but alive.
Why I’m Optimistic
Here’s the hopeful part.
Despite everything, we’re seeing early signs of resistance and rebuilding.
Networks like Farcaster are experimenting with identity, portability, and composability in ways that make it harder to fake participation. The Base App ecosystem is pushing toward real ownership, not just of assets, but of presence and reputation.
These systems aren’t perfect. They’re early. Sometimes clunky. But they point toward something important: social spaces where being human actually matters again.
That’s the future intori is being built for.
Not a network that manufactures community, but one that helps surface real people through shared context, stated preferences, and earned signals. A layer that treats human data as something to be respected, not strip-mined. A place where connection is discovered, not fabricated.
The Positive Side of the Fork
We don’t need a societal stopgap made of synthetic personalities and disposable belonging. We don’t need to retreat into a world where everything feels comforting while slowly falling apart underneath.
There is another option.
A future where:
- Real people stand behind their words
- Relationships have context and continuity
- Digital identity carries weight
- Human signal is valued over infinite noise
This fork in the road is uncomfortable because it asks something of us. Participation instead of consumption. Presence instead of performance.
But that’s the side worth choosing!
Because what waits on the other side of this moment isn’t a quieter internet. It’s a more honest one. And that’s where real connection, creativity, and progress still live.