Earlier this year I traveled to China (and ended up detained by Guangzhou police for 9 hours while 7 weeks pregnant, but that’s a story for a different day 😅).

I got to experience what full-on surveillance looks like. Your every financial move is tracked through Alipay. CCTV cameras are everywhere. Train journeys are recorded, your face scanned at the luggage check, gates opened via passport instead of tickets. Airport-style scanners at metro stations. All under the label of “convenience” and “safety”.
I asked some locals – naive of me, I know – if they felt free. They all said almost the same thing, as if reciting by heart:
“Yes, you’re free in China, as long as you obey the law.”
But who defines what the law is? In China, the law can be whatever the government decides tomorrow. Your social credit score drops for jaywalking or criticising policy online. That’s not freedom with guardrails. That’s a panopticon with extra steps.
But at least in China, you know you’re being watched.
Here in the West, we’ve figured out something more insidious. We don’t force surveillance on people. We make them buy it and be happy about it. We make it fashionable. We make it convenient and entertaining. And then we act surprised when Meta’s stock jumps after launching Ray-Ban smart glasses that turn every user into a walking data collection unit, even against the will of many who prefer to stay anonymous.
The media already claims “The future is here!”. Influencers are queuing up, horny for how much easier it will be for them to record more ‘live’, rough content and never miss a beat. And I’m watching this, thinking, have we learned nothing?
True, it’s genius and evil in its simplicity. Phones have friction. You pull them out, use them, put them away. There’s a ritual that reminds you you’re recording. You still feel at least slightly self-conscious being that person filming in public.
Glasses don’t work like that. You put them on in the morning and (in theory) forget they exist. Which is exactly what Meta’s betting on. They don’t want you to “use” these glasses occasionally. They want you wearing them 12-16 hours a day. They want you to forget you’re wearing a recording device, because the moment you forget, everyone around you becomes commoditisable content.
And we shouldn’t call something “a camera” when it’s really five microphones picking up conversations outside your field of vision. Let’s call things as they are – Meta turns you into a mobile surveillance unit, except instead of Big Brother forcing this on you, you paid $299 and brag about it on Instagram. The Devil himself couldn’t have come up with anything better. So it had to work. There was no other option.
I remember when Facebook asking for your real name felt invasive. Now we voluntarily tag our location, friends, meals, thoughts (yes, even I am guilty of it!). We’ve been conditioned to comply, brainwashed little by little – or should I say, slowly boiled like the proverbial frog. First the innocent shares: who you’re in a relationship with, your hobbies, your accomplishments. Then came Instagram and live stories – still padded with your polished thoughts and the delete / edit option. But you’re ready for the next step of intimacy – now you might as well share the rough ones straight from your eyes (and soon – the brain itself). Because why the hell not? It’s far easier to do…
So Ray-Ban Meta isn’t really a leap. It’s just the next logical step we’ve been training for. For years, Meta convinced us “sharing is caring.” Now they’re telling us “constant recording is convenience.” Except this time we’re not recording ourselves – we’re recording everyone around us. Without their consent. Without their awareness.
In the meantime, the mission – “divide society and polarise it” – has been accomplished.
Social media has been influencing how and what you think. It bubble-wrapped your opinions, reinforcing your confirmation bias and making you believe that your truth is the only truth. That there is no longer a common denominator with ‘the others’. That there is no dialogue or willingness to unite anymore. It all feels like a lost cause. It feels like the world is an ever more unsafe, chaotic, unpredictable and illogical place to live. Each to his own.
It’s harder to fight a united society. Much easier to control one that is at odds with itself.
Sure, there is still some mini rebellion: hospitals are already firing nurses for wearing those “X-Ray” Bans. Hair salons are banning them. Gyms are putting up signs. But that’s cute. As if workplace policies could stop this. Wait until these things are everywhere and you have to assume every person wearing sunglasses is potentially recording you. Because I’m sure Samsung, Xiaomi and the others are already plotting to release their own versions.
At this point, everyone’s debating whether you can disable the LED indicator light. As if that’s the issue. As if the problem is just that we need a better recording indicator.
Nobody’s asking why we’re okay with Meta having real-time access to everything we see (and soon – think). It’s ironic how Zuck got away with the Cambridge Analytica scandal with no serious repercussions, and has now released the Kraken at full power – with far more destructive consequences for society. But then again, the irony is on us because nobody seems to give a damn.
This is bigger than glasses
I’ve been meaning to write this since I interviewed Nita Farahany, author of The Battle for Your Brain, on my Are You Human podcast.

She warns that we’re losing our last space free from surveillance and commercial influence – our mental privacy. The funny thing is that we’ve been warned. We’ve sci-fi-fantasised about it in books and movies far and wide: Minority Report (2002) – personalised ads following you based on retinal scans. It looked absurd, yet now Meta’s glasses track your gaze to serve you “better” content.
Then we had The Circle (2017) – “Secrets are lies, sharing is caring, privacy is theft.” The company made transparency mandatory. Well, Meta’s been pushing this narrative for years, just with better PR. And of course we had Black Mirror: The Entire History of You—everyone has an implant that records everything they see. We’re getting there, except it’s voluntary and we paid for it.
And how could I forget 1984. What Orwell didn’t predict is that we’d willingly buy the telescreens because they play music and show us cat videos.
So we know what’s coming.
Cognitive liberty – the right you didn’t know you were losing
Until recently, you at least assumed you could think a private thought. That you had a right to mental privacy. Maybe not freedom of expression, but freedom of thought.
That’s under threat now. Farahany introduces the concept of “cognitive liberty”—the right to self-determination over your own consciousness. Sounds obvious, except it’s not legally protected anywhere. You have (theoretically) freedom of speech, but not freedom of thought. Because until now, thoughts were private by default.
But what happens when:
- Your employer requires a “focus headband” for remote work to ensure productivity?
- Insurance companies offer discounts if you share mental health data from your meditation app? (bah, they already do when you let them track your Strava / gym workouts)
- Courts demand brain scan evidence in criminal cases?
- Dating apps use neurofeedback to determine “authentic” attraction?
You can refuse, sure. But can you really? The same way you can “refuse” to have a smartphone. Technically possible, practically career suicide.
This isn’t science fiction anymore
Companies are already using EEG headbands to monitor workers’ attention levels. Chinese schools tested brain-tracking headbands on students to measure focus during class. The data went to teachers, parents, administrators. Imagine being 12 years old and having your teacher know you were daydreaming before you even realised it yourself.
Tesla and other companies have experimented with fatigue-detection systems for drivers and factory workers. Sounds reasonable because safety first, right? Except the same technology that detects fatigue also detects frustration, boredom, disagreement. Your employer doesn’t just know you’re tired. They know you’re thinking about quitting before you’ve decided to quit.
Today, fMRI studies can reconstruct images people are looking at just from their brain activity. Researchers at UC Berkeley created a decoder that can translate brain activity into text – they can literally read what you’re trying to say before you say it. The technology that helps paralysed people communicate is the same technology that could be used to interrogate suspects, monitor employees, or profile citizens.
Dual-use problem on steroids. Every neurotechnology developed for good can be weaponised for control. (As it happens with any powerful technology / tool).
Are you really free-willed?
If AI can predict your choices before you’re consciously aware of them (and studies show it can, seconds in advance), then who’s really making the decision? If a recommendation algorithm knows you’ll click something before you know you want to click it, and serves it to you at exactly that moment – did you choose, or were you nudged?
Now scale that to neural data. Your brain signals indicate you’re about to get angry in a meeting. Your “productivity assistant” AI intervenes with a calming notification. Helpful? Or are you being pacified, like a dose straight out of Equilibrium? Are you still you if your emotional responses are managed by an algorithm?
Farahany asks: who owns your emotional data? If an app detects you’re depressed, can it sell that information? Can your employer access it? Your insurance company?
The answer right now is a horrifying…
“…it depends.”
But when big money comes into play, those companies will quickly find a way to make you an ‘irresistible’ offer and get you to sign the devil’s pact with no second thoughts.
The surveillance business model
Meta already tracks your eye movements, micro-expressions, how long you linger on an ad. They know what you’re interested in better than you do. A Meta app installed on your phone listens to your conversations and then shows you relevant ads. Now imagine they have access to your neural data. The most intimate thoughts. They don’t need to guess what you want – they can measure your emotional response in real-time and adjust accordingly.

Farahany calls this brain transparency. Tech companies frame it as personalisation. “We just want to give you better experiences”… but better for whom? A brain transparent to corporations is a brain that can be manipulated with surgical precision.
What happens when these glasses and wristbands become popular? When you have to assume every person wearing glasses is potentially recording you? How does that change our behaviour in public spaces? The spontaneity of conversations? Our sense of safety?
Google Glass was killed socially. People called users “glassholes” and openly shamed them. The “Glasshole” phenomenon worked because we still had social immunity to this surveillance.
Some years later, we’ve been conditioned. Instagram Stories normalised constant recording. I used to be shocked seeing videos from funerals, delivery rooms and other intensely intimate moments – now the privacy line is blurred. TikTok made filming strangers acceptable. Meta’s just productising what we’re already doing.
The Ray-Ban branding is genius. It’s fashion, not tech. Nobody calls you a Glasshole when you’re wearing “designer sunglasses.” Meta claims these glasses help the visually impaired navigate. Ok… they also said Facebook would “connect the world.” Also technically true. They didn’t mention connecting the world meant commodifying every interaction, every thought, every glance for ad revenue.
Let me guess… nah, let me read your mind instead.
Ray-Ban Meta is training wheels for neural surveillance. They’ve done an amazing job normalising constant recording, constant data collection, constant monitoring. Not of strangers – of yourself.
Because that’s the next step. Meta isn’t just interested in what you see. They want to know what you feel about what you see. How long did your gaze linger? Did your pupil dilate? Did your heart rate increase? They want the internal data, not just the external.
And Meta’s wristbands that detect neural signals are perfect for that. We’ll accept it too. Because they’ll call it “immersive experiences” or “personalised wellness” or “mental health support.” They’ll point to the person whose depression was caught early by their neural monitoring app. They’ll show us the student whose focus improved with real-time neurofeedback. They’ll make it sound insane to refuse.
AI would not care if you lived or died
Just look at what’s happening with Friend AI. They spent over $1 million blanketing New York with ads for their $129 AI wearable – a necklace that listens to you constantly and texts back through an app. It builds a personality over time using Claude, sending encouragement, commentary, unprompted messages designed to feel like a “friend.” Because who needs a real human friend when you have a ‘thing’ that never criticises, always cheers for you and is there for you 24/7.

Some New Yorkers started vandalising them with “surveillance capitalism” and “get real friends.” It’s comforting to see that at least some people are still fighting back. But for how long?
Don’t look up (unless there is something worth recording)

Don’t Look Up is a brilliant movie, by the way. It follows two low-level astronomers who must go on a giant media tour to warn mankind of an approaching comet that will destroy planet Earth.
Farahany proposes “neurorights” – legal protections for mental privacy, similar to human rights. Chile actually passed a constitutional amendment protecting neurorights in 2021. Spain followed. It’s something. But we couldn’t even protect basic data privacy. GDPR is toothless when the alternative is digital exile. How exactly are we going to protect neural data when the technology is advancing faster than regulation can keep up?
Nita spends significant time on China’s “Sharp Eyes” program – 400 million surveillance cameras, many with emotion detection. But she points out that democratic governments aren’t much better. The NSA’s mass surveillance, predictive policing algorithms, the UK’s facial recognition trials – we’re just slower and more PR-conscious about it.
The difference is that authoritarian regimes are honest about control. They don’t pretend it’s for your benefit. In the West, we get surveillance packaged as wellness, productivity, convenience. Your Brain-Computer Interface isn’t control. No no, it’s self-improvement. Your emotion-tracking app isn’t surveillance – it’s mental health support.
The panopticon doesn’t need to watch everyone all the time. It just needs everyone to believe they might be watched. That changes behaviour just as effectively.
It’s just pie in the sky
Maybe instead of asking “how do we protect ourselves?”, we should ask “why are we so willing to give it away?”. My grandparents’ and my mom’s generation in Poland knew what surveillance looked like because it was obvious – secret police, wiretaps, kind neighbours (informants). They could resist because the threat was visible.
Our surveillance is invisible, voluntary, and damn addictive. We pay for it. We review it five stars. We recommend it to friends.
Maybe privacy isn’t dying because of evil corporations or authoritarian governments. Maybe it’s dying because we don’t actually value it enough to make sacrifices for it.
People in China said they’re free “as long as you obey.” But at what point does modifying your behaviour to avoid surveillance become the same as not being free at all? If you can’t have a private thought without wondering if it’ll be decoded, commodified, or used against you – are you free to think?
That’s the real battle for your brain. And most people don’t even know they’re in it.

