The Social Network is Broken: Why Facebook, X, Instagram and Bluesky Are Failing Us
Social media was once hailed as the great connector — a tool to bring the world closer, amplify unheard voices, and democratise communication. But today, many of the platforms that dominate our digital lives are doing the exact opposite.
Facebook, X (formerly Twitter), Instagram, and even newer players like Bluesky are increasingly failing to serve the communities they once claimed to empower. They’ve become engines of division, distortion, and discontent — and in my opinion, it’s time we take a hard look at what went wrong, and how to build something better.
Echo Chambers and Algorithmic Polarisation
One of the most pernicious effects of modern social platforms is the creation of echo chambers — self-reinforcing digital bubbles where people are only exposed to views that match their own. These echo chambers are not an accidental byproduct; they are largely the product of algorithms designed to maximise engagement by serving users the content they are most likely to interact with.
On X and Bluesky, this has created spaces overwhelmed by political extremism — both on the left and right. What begins as healthy debate often devolves into ideological warfare, harassment, and performative outrage. The platforms reward the loudest, most controversial voices, not the most thoughtful ones.
We’ve gone from “joining the conversation” to watching people shout past each other in algorithmic silos.
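To make the engagement incentive concrete, here is a deliberately simplified sketch in TypeScript. The signal names and weights are hypothetical, not any platform's actual formula, but they capture the core problem: whatever is predicted to provoke interaction rises to the top, whether or not that interaction is healthy.

```typescript
// Illustrative only: hypothetical signals and weights, not a real platform's model.
interface Post {
  id: string;
  predictedLikes: number;
  predictedReplies: number;   // outrage reliably inflates replies and quote-posts
  predictedReshares: number;
  predictedDwellSeconds: number;
}

// A pure engagement score has no notion of whether the engagement is positive,
// so controversial posts that provoke replies and reshares win by default.
function engagementScore(post: Post): number {
  return (
    1.0 * post.predictedLikes +
    2.0 * post.predictedReplies +
    3.0 * post.predictedReshares +
    0.1 * post.predictedDwellSeconds
  );
}

function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => engagementScore(b) - engagementScore(a));
}
```

Nothing in that objective asks whether a post informs, connects, or enrages — it only asks whether you will react.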
Instagram: Curated Perfection and Mental Health Harm
While X and Bluesky struggle with political toxicity, Instagram presents a different kind of damage — one masked by filters, influencers, and aspirational aesthetics.
Instagram’s algorithm prioritises content that performs well visually: beauty, wealth, luxury, status. Over time, this skews the feed into a highlight reel of unattainable perfection, especially for younger users.
From what I’ve seen, this is deeply corrosive to mental health, particularly for teenagers. Study after study links Instagram use to:
- Body image issues and eating disorders
- Anxiety and depression
- Low self-esteem
- Social comparison and FOMO
Meta (Instagram’s parent company) has known about these effects for years, as internal documents released by whistleblowers revealed. And yet the business model depends on them, because discontent drives clicks, and clicks drive revenue.
Facebook: Ad Farm, Data Broker, Digital Decay
Facebook’s fall from grace has been slow but steady. Once the world’s town square, it is now largely seen as a boomer wasteland, rife with misinformation, political spam, and relentless advertising.
Its problems are numerous:
- News feed pollution: Ads, clickbait, and recycled viral content drown out meaningful interactions.
- Aggressive data collection: Facebook's business model is built on surveillance capitalism — profiling users across the web to serve hyper-targeted ads.
- Platform decay: Pages are ghost towns. Events are cluttered. Groups are politicised. The original social graph — built on real-world connections — has been replaced by algorithmic manipulation and engagement hacks.
For many, Facebook is no longer where people connect — it's where they’re harvested.
The Common Thread: Engagement > Humanity
Despite their different aesthetics and audiences, these platforms share a fundamental flaw: they optimise for engagement, not well-being. For ad-driven models, your time and attention are the product. The longer you stay, the more you scroll, the more money they make.
This has led to design choices that prioritise:
- Outrage over nuance
- Addictiveness over well-being
- Exposure over consent
- Clicks over connection
And the result is a digital ecosystem that’s increasingly toxic, performative, and mentally exhausting.
What Might an Ethical Social Network Look Like?
The future doesn’t have to be like this. We can design something better — something human, ethical, and intentional. Here’s what that might include:
1. Algorithmic Transparency and Control
Users should know why they’re seeing content — and have the ability to customise or disable algorithmic feeds. Content should be sorted by relationships, interests, or chronology — not just what keeps people scrolling.
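As a rough sketch of what that control could look like (the setting names below are hypothetical, not any existing platform's API), feed ordering becomes an explicit choice the user owns rather than a hidden default:

```typescript
// Hypothetical sketch: feed ordering as a user-owned setting.
type FeedMode = "chronological" | "friends-first" | "interests";

interface FeedSettings {
  mode: FeedMode;
  explainRanking: boolean; // surface a "why am I seeing this?" note on every post
}

interface Post {
  id: string;
  authorIsFollowed: boolean;
  postedAt: Date;
  topicMatchScore: number; // overlap with topics the user explicitly follows
}

function orderFeed(posts: Post[], settings: FeedSettings): Post[] {
  const byTime = (a: Post, b: Post) => b.postedAt.getTime() - a.postedAt.getTime();
  switch (settings.mode) {
    case "friends-first":
      return [...posts].sort(
        (a, b) => Number(b.authorIsFollowed) - Number(a.authorIsFollowed) || byTime(a, b)
      );
    case "interests":
      return [...posts].sort((a, b) => b.topicMatchScore - a.topicMatchScore);
    case "chronological":
    default:
      return [...posts].sort(byTime);
  }
}
```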
2. Opt-In Discovery
Instead of forcing viral trends or recommended posts, users should be able to opt in to new content discovery. This reduces polarisation and gives people more control over their digital environment.
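A minimal sketch of how that opt-in could work, again with hypothetical setting names: discovery features default to off, and recommendations only reach the timeline once the user has deliberately turned them on.

```typescript
// Hypothetical sketch: discovery is consent-based, not imposed.
interface DiscoverySettings {
  recommendedPosts: boolean;
  trendingTopics: boolean;
  suggestedAccounts: boolean;
}

// Everything defaults to off; the user switches features on deliberately.
const defaultDiscovery: DiscoverySettings = {
  recommendedPosts: false,
  trendingTopics: false,
  suggestedAccounts: false,
};

function buildHomeTimeline(
  followedPosts: string[],
  recommendedPosts: string[],
  settings: DiscoverySettings
): string[] {
  // Content from accounts the user follows always appears;
  // recommendations are appended only with explicit consent.
  return settings.recommendedPosts
    ? [...followedPosts, ...recommendedPosts]
    : followedPosts;
}
```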
3. No Data Exploitation
A privacy-first model where data isn’t sold, tracked across the web, or used for invasive profiling. Monetisation can come from subscriptions, tipping, or ethical advertising with strong consent.
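Sketched as a settings object (the names are hypothetical), the privacy-first stance simply means every invasive data use starts disabled, and cross-site tracking is not an option at all:

```typescript
// Hypothetical sketch: invasive data uses are opt-in at best, absent at worst.
interface PrivacyConsent {
  personalisedAds: boolean;       // off unless explicitly granted
  shareDataWithPartners: boolean; // off unless explicitly granted
}

const privacyFirstDefaults: PrivacyConsent = {
  personalisedAds: false,
  shareDataWithPartners: false,
};
// Cross-site tracking has no switch at all: the capability simply isn't built.
```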
4. Real Moderation and Community Standards
AI can help detect abuse at scale, but human moderation is essential. Clear community guidelines, transparent appeals, and context-aware enforcement should be standard.
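One way to picture that division of labour, with hypothetical names and values: an automated classifier triages reports so human reviewers see the worst first, but only humans action content, and every decision can be appealed.

```typescript
// Hypothetical sketch: AI triages, humans decide, appeals reopen the case.
type ReportStatus = "queued" | "actioned" | "dismissed" | "under-appeal";

interface Report {
  postId: string;
  reason: string;
  aiAbuseScore: number; // 0..1 from an automated classifier
  status: ReportStatus;
}

// The classifier only orders the queue; it never removes content on its own.
function triage(reports: Report[]): Report[] {
  return [...reports].sort((a, b) => b.aiAbuseScore - a.aiAbuseScore);
}

// Enforcement requires a human decision with a recorded rationale.
function humanDecision(report: Report, violates: boolean, rationale: string): Report {
  console.log(`Decision on ${report.postId}: ${rationale}`);
  return { ...report, status: violates ? "actioned" : "dismissed" };
}

// An appeal sends the case to a second, independent reviewer.
function appeal(report: Report): Report {
  return { ...report, status: "under-appeal" };
}
```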
5. Healthy Defaults
Limit endless scrolling. Encourage breaks. Show real people and real conversations — not just high-performing, hyper-curated content.
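In practice, healthy defaults are mostly configuration, as in this hypothetical sketch: a visible end to the feed, no autoplay, and a nudge to take a break after sustained use.

```typescript
// Hypothetical sketch: well-being defaults, changeable but deliberately set.
interface WellbeingDefaults {
  breakReminderMinutes: number; // suggest a pause after continuous use
  showFeedEnd: boolean;         // "you're all caught up" instead of infinite scroll
  autoplayVideo: boolean;
}

const healthyDefaults: WellbeingDefaults = {
  breakReminderMinutes: 20,
  showFeedEnd: true,
  autoplayVideo: false,
};

function shouldSuggestBreak(sessionStart: Date, now: Date, d: WellbeingDefaults): boolean {
  const minutesActive = (now.getTime() - sessionStart.getTime()) / 60_000;
  return minutesActive >= d.breakReminderMinutes;
}
```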
6. Speech with Accountability
Freedom of speech is vital, but it doesn’t oblige platforms to amplify harmful content. Ethical networks should separate freedom of speech from freedom of reach: misinformation, hate speech, and incitement should be deprioritised, not protected under the banner of “engagement.”
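A compressed sketch of that distinction, with illustrative categories and multipliers rather than a real policy: flagged content keeps its place on the author’s profile, but loses algorithmic amplification.

```typescript
// Illustrative only: categories and multipliers are hypothetical, not a real policy.
type Flag = "none" | "misinformation" | "hate-speech" | "incitement";

// Flagged posts are not amplified; severity determines how far reach is reduced.
const reachMultiplier: Record<Flag, number> = {
  "none": 1.0,
  "misinformation": 0.2,
  "hate-speech": 0.05,
  "incitement": 0.0, // no distribution at all, and escalated for human review
};

function distributionScore(baseScore: number, flag: Flag): number {
  return baseScore * reachMultiplier[flag];
}
```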
The Path Forward: From Digital Extraction to Digital Stewardship
In my opinion, we don’t need another “next big platform.” We need a shift in values.
We need platforms that prioritise mental health over monetisation, meaningful connection over metrics, and accountability over virality.
Social networks once promised to bring us together. Now, they too often pull us apart — by algorithm, by ideology, by insecurity. But if we’re brave enough to rethink the design from first principles, we can still reclaim what made them powerful in the first place.
Connection. Understanding. And a shared digital space worth being part of.