Our perceived selves are not our own. Nor, perhaps, our own perceptions.

Global workspace theory was introduced in the 1980s by cognitive psychologist Bernard Baars to explain how the brain's states of attention and awareness "involve a broadcasting of unconscious information into conscious information"¹.

Soon after, the French neuroscientist Stanislas Dehaene expanded on the theory with an experiment that flashed words and shapes on a screen, uncovering the mechanics of subliminal messaging: the message was received by the brain but never broadcast into conscious information¹.

Dehaene achieved this by flashing a shape on the screen immediately before and after the target word, a masking technique. Participants' brains processed the masked word without the participants ever becoming consciously aware of it¹.

With this understanding, and a bit of imagination, it becomes possible to see how an advertisement, a business, an individual, or a government could influence your perceptions of yourself and of the world around you.

If an advertising company has the technological know-how to implement ultrasonic tracking (one company already does, Intel holds a patent, and even Tor users aren't safe), why couldn't it implement a similar subliminal audio system, or something more advanced? How would we know? There's a patent for one.

…was just one part of a decade-long campaign of global expansion for Facebook. In countries such as the Philippines, the efforts have been so successful that the company is able to tout to its advertisers that its network is, for many people, the only version of the internet they know.
— Lauren Etter, Bloomberg Businessweek

What does Facebook look like without the internet?

The Facebook ecosystem is designed to manipulate at every level, to such a degree that the internet becomes unrecognizable to its users. What a Facebook user actually spends their time on is not the internet.

The distinction is important.

What a user watches, reads, writes, thinks, or even feels is filtered through attention and awareness layers dictated by the Facebook ecosystem.

The conscious user experience, at the very topmost layer, looks like the dessert section of the food pyramid, and may be best understood by that analogy: your consciousness is the treat with which the Facebook ecosystem rewards itself after a long day of portioning out the unconscious levels of your mind.

Viewed another way: the conscious user experience is but a small, extraneous serving of the time and experience designed for and allocated within their triangular platform; the platform would really be better off without you having it.

[Image: real vs. ideal]

It's interesting: the Facebook user pays their ISP for broadband access to the internet, then logs into Facebook, which surreptitiously convinces them to relinquish that internet access to Facebook.

Moreover, the relinquishing-internet-access conundrum spreads outward from Facebook's portal network even to the minority of people not registered as users, thanks to the company's ingenuity and scope (old but relevant news).

And in the case of the Philippines, this void that Facebook sells its users as the internet starts to become the internet, leaving citizens unable to defend and protect the freedom, value, and knowledge the internet would otherwise gift them. They cannot defend what they cannot see is there.

Facebook has successfully changed the Philippines' perception of the internet.

This degree of omnipresence, manipulation, and control falls in line with the words of one of the web's most powerful figures, and the same goes, if not more so, for Google.

“We don’t need you to type at all,” Schmidt told The Atlantic in 2010. “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.”
— Eric Schmidt, former Google CEO, former Alphabet Executive Chairman, The Atlantic

It is the worst case scenario.

China's citizens have lost control over their perceptions. In an experiment straight out of a Black Mirror episode, China has built a control system that combines the attention-and-awareness design exploits of social media companies with socioeconomic imprisonment: a systematized, three-digit quantification of the worthiness of your life.

China plans to make the social credit system mandatory in 2020.

[Zhima Credit] will ensure that the bad people in society don’t have a place to go, while good people can move freely and without obstruction.
— Lucy Peng, Zhima Credit CEO, Wired

And the consequences of perception don't end with which house you can buy or which friends you can have.

The Chinese firm SenseTime has created a surveillance system that incorporates neural networks and deep-learning algorithms to help China build a total surveillance state in Xinjiang.

It's a method of policing that relies both on the collection and surveillance of data and on how others perceive you.

If there's a photo of you on the internet, then the system can identify you whenever you walk past a networked camera and take whatever action corresponds to the tag entered next to your image, e.g. criminal, activist, or queer.

And if there isn't a photo of you already online, then all it takes is for someone to upload one into the system's database, complete with perceived tags, and the system will find and identify you and alert the corresponding authority as you walk past.
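Mechanically, this kind of watchlist matching reduces to nearest-neighbor search over face "embeddings": fixed-length vectors produced by a neural network, where similar faces map to nearby vectors. Here is a minimal sketch under that assumption; the tags, vectors, and threshold are illustrative inventions, not a description of SenseTime's actual system:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, database, threshold=0.8):
    """Return the tag of the closest enrolled face, or None if no
    database entry is similar enough to the probe embedding."""
    best_tag, best_score = None, threshold
    for tag, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score >= best_score:
            best_tag, best_score = tag, score
    return best_tag

# Toy database: one uploaded photo per person, reduced to an
# (invented) embedding and stored under whatever tag the uploader chose.
db = {
    "activist": np.array([0.9, 0.1, 0.2]),
    "criminal": np.array([0.1, 0.8, 0.3]),
}

# A camera frame whose embedding happens to sit near the "activist" entry.
print(identify(np.array([0.88, 0.12, 0.25]), db))  # matches "activist"
```

The point of the sketch is how little the system needs: one photo, one tag, and a similarity threshold; everything after enrollment is automatic.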

And U.S. technology companies are lining up to jump in.

One always hopes technology of this scope will be used benevolently—to track and find missing persons and children. However, it is already being used malevolently. And soon it will even do pre-crime—perception mutated into fate.

And much like how our brains create natural perceptions based on information (data) we collect from people, places, and things, China's surveillance and social credit technologies require a tremendous amount of data.

Perhaps strategies for increasing their data load, via encroachment into the data mines and cultures of foreign companies, would begin something like this:

We are busy ignoring the framework being built around us, a framework that will facilitate a dystopian future underestimated by even our most revered science fiction authors.

And the alarm echoes that of the climate: there will come a point at which we arrive at this forewarned future, when the capacity for change will be little more than a digital index in the Internet Archive of a once-perceived fate.

So what can you do?

Read George Soros' speech and then demand better from companies, governments, and elected representatives with your data, your money, and your words.

For more on what machine-vision algorithms are making possible, read this. For another reason to protect your children from social networks, don't just take Apple CEO Tim Cook's advice; read about how machine learning and deep learning allow individuals to make videos with anyone's face.

Help protect your image and video data from being co-opted by others by trying the Data Detox.

Visit Panopticlick to test how well your web browser protects against tracking. Also, check out Data Selfie to catch a glimpse of how your data is being tracked across Facebook.
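What Panopticlick measures can be sketched in a few lines: a tracker that hashes together a browser's ordinary, freely reported attributes gets a stable identifier without ever setting a cookie. The attributes and hashing scheme below are assumptions for illustration, not Panopticlick's actual method:

```python
import hashlib

def fingerprint(attributes):
    """Hash a browser's reported attributes into a stable ID."""
    # Sort keys so the same browser always yields the same hash.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Invented example attributes, of the kind any site can read.
browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/58.0",
    "screen": "1920x1080x24",
    "timezone": "UTC+8",
    "fonts": "Arial,Helvetica,Noto Sans",
}

# Same browser, same ID on every visit -- no cookie needed.
print(fingerprint(browser))
```

If the combination of attributes is rare enough among visitors, the hash is effectively unique, which is why "protecting against tracking" means making your browser look common, not distinctive.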

And if you're looking for an alternative to the convenient Google ecosystem... you won't find it yet—but here's a decent place to start.

The worth of a State, in the long run, is the worth of the individuals composing it; and a State which postpones the interests of their mental expansion and elevation to a little more of administrative skill, or of that semblance of it which practice gives, in the details of business; a State which dwarfs its men, in order that they may be more docile instruments in its hands even for beneficial purposes—will find that with small men no great thing can really be accomplished; and that the perfection of machinery to which it has sacrificed everything will in the end avail it nothing, for want of the vital power which, in order that the machine might work more smoothly, it has preferred to banish.
— John Stuart Mill




¹ Kandel, Eric. "Thinking Animal." In The Earth and I, Taschen.