This is the first time in human history we’re seriously asking whether a product that billions of people use every single day was designed - on purpose - to make us weaker, to make us stupider, to make us more unhappy.
Big Tech companies - Meta, X, ByteDance, and Google, among others - have built systems that quietly dictate what we see, what we feel, and what we pay attention to. Let’s not pretend that’s normal or desirable or good for humanity. Because it’s not.
The biggest, most powerful companies on earth know exactly where you are, what you want, what you fear, what turns you on, what keeps you up at night. They know these things about you to a degree you probably couldn’t articulate yourself.
And now, they are standing in courtrooms, being asked a very simple question:
What did you do to us?
Not “what did users choose,” not “what did parents allow,” not “what did society become.”
What did you do to us? What did you design?
Because the way Facebook weaponizes your nostalgia isn’t an accident. The way X weaponizes your outrage isn’t an accident. The way Instagram weaponizes your insecurity isn’t an accident. The way YouTube weaponizes your curiosity isn’t an accident. The way TikTok erodes your attention span isn’t an accident.
The picture is increasingly clear: these platforms were designed with these outcomes in mind.
The changes these apps have made to how we live, and how we feel about ourselves and one another, were engineered the same way your car, or your refrigerator, or your wristwatch was engineered.
The infinite scroll. The push notification that hits your brain like a slot machine jackpot, that jab of dopamine right into your cerebral cortex hundreds of times every day. The algorithm that figures out, faster than you can ever hope to, what will make you angry, what will make you jealous, what will keep you staring at a screen at 2:15 in the morning when you know you should be asleep.
We were told these products were the future, a new frontier, a new economy with trillions of dollars just waiting to be made. But now juries of our peers are starting to look at these products and conclude that these are not just products. They are problems. They are bad actors - a malevolent force in our society.
If the dam really is breaking and we cross the line from “your misery is a personal problem” into “your misery is a corporate product,” we aren’t just talking about a handful of lawsuits. We’re talking about a sea change: the entire business model of the modern internet getting yanked into the light and stripped for parts.
Because if these platforms are responsible for addiction, if they knew how sick this was going to make us and they did it anyway - you can’t call that innovation. Call it what it really is - exploitation.
We are being studied. We are being shaped. Worse than that - we are being worn down. We are being ruined.
And here’s the part that should make you uncomfortable - it makes me uncomfortable.
We like it.
We say we hate it. We complain about it. But we keep consuming something we know is bad for us, day in and day out, and the best we can do is admit that it feels good.
We pick up the phone and tap that app again. And again. And again.
So the question now isn’t just “are these companies guilty?” It’s bigger than that. It’s about shifting the narrative: this isn’t about us. It’s about them.
This isn’t just how the world works. It’s how it was made to work by cruel people.
And anything that was made this way can be unmade.
As juries weigh claims of addictive, destructive design, a bigger question is starting to emerge: what if this wasn’t an accident?