Our perceived selves are not our own. Nor, perhaps, our own perceptions.
Global workspace theory was introduced in the 1980s by cognitive psychologist Bernard Baars to explain how the brain's states of attention and awareness "involve a broadcasting of unconscious information into conscious information"¹.
Later, French neuroscientist Stanislas Dehaene expanded upon the theory with experiments that flashed words and shapes on a screen, uncovering the mechanics of subliminal messaging: the message was received by the brain but never broadcast to become conscious information¹.
Dehaene achieved this by flashing masking shapes on the screen on either side of the target word in time, immediately before and after it. Participants' brains processed the information from the subliminal word without the participants ever becoming consciously aware of it¹.
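The masking sequence described above can be sketched as a toy trial generator. The durations and the mask pattern here are illustrative placeholders, not the actual values from Dehaene's studies:

```python
# Toy sketch of a visual-masking trial in the style Dehaene used.
# Durations and the mask pattern are invented for illustration only.

def masking_trial(word, mask="#####", word_ms=30, mask_ms=70):
    """Return the stimulus sequence for one trial: the target word is
    sandwiched in time between two masks, so the brain processes it
    without it ever being broadcast into conscious awareness."""
    return [
        (mask, mask_ms),   # forward mask
        (word, word_ms),   # target word, flashed too briefly to report
        (mask, mask_ms),   # backward mask
    ]

for stimulus, duration in masking_trial("radio"):
    print(f"show {stimulus!r} for {duration} ms")
```

The point of the structure is the temporal sandwich: the masks interrupt the broadcast, not the initial processing.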
Understanding this, and imagining a bit further, it becomes possible to see how an advertisement, business, individual, or government could influence your own perceptions of yourself and the world around you.
If an advertising company has the technological know-how to implement ultrasonic tracking (this one; Intel has a patent; Tor users aren't safe either), why couldn't it implement a similar subliminal audio system, or something more advanced? How would we know? Here's a patent for one.
What does Facebook look like without the internet?
The Facebook ecosystem is designed to manipulate at every level, to such a degree that the internet becomes unrecognizable to its users. The internet is not what a Facebook user actually spends their time on.
The distinction is important.
What a user watches, reads, writes, thinks, or even feels is filtered through attention and awareness layers dictated by the Facebook ecosystem.
The conscious user experience, at the topmost layer, looks like the dessert section of the food pyramid, and may be more easily understood by this analogy: your consciousness is the treat with which the Facebook ecosystem rewards itself after a long day of portioning the unconscious levels of your mind.
Viewed another way: the conscious user experience is but a small, extraneous serving of the time and experience designed and allocated for within their triangular platform; Facebook would really be better off without you having it.
It's interesting. The Facebook user pays their ISP for broadband access to the internet, then logs in, and Facebook surreptitiously convinces them to relinquish that internet access to Facebook.
Moreover, this relinquishing of internet access spreads outward from Facebook's portal network even to the minority of people not registered as users, thanks to the ingenuity and scope of the company (old but relevant news).
And in the case of the Philippines, the void that Facebook sells its users as the internet starts to become the internet, leaving citizens unable to defend and protect the freedom, value, and knowledge the internet would otherwise give them. They cannot defend what they cannot see is there.
Facebook has successfully changed the Philippines' perception of the internet.
It is the worst case scenario.
China's citizens have lost control over their perceptions. In an experiment straight out of a Black Mirror episode, China has created a control system that combines the attention-hijacking design exploits of social media companies with socioeconomic imprisonment via a systematized three-digit score that quantifies the worthiness of your life.
China plans to make the social credit system mandatory in 2020.
And the consequences of perception don't end with what house or friend you can have.
It's a method of policing that relies both on the collection and surveillance of data and on how others perceive you.
If there's a photo of you on the internet, then the system will be capable of identifying you whenever you walk past a networked camera and performing the necessary actions based on whatever tag sits next to your image, e.g., criminal, activist, or queer.
And if there isn't a photo of you already online, then all it takes is for someone to upload one into the system's database, complete with perceived tags, and the system will find and identify you and alert the corresponding authority as you walk past.
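The tag-and-alert pipeline just described can be sketched in a few lines. Everything here is hypothetical: the embeddings, person IDs, tags, and threshold are invented for illustration, and a real system would compute face embeddings with a vision model rather than use hand-written vectors:

```python
# Hypothetical sketch of a tag-and-alert pipeline: match a face seen by a
# networked camera against a database of uploaded photos and their
# "perceived tags". All data below is invented for illustration.
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# One uploaded photo (stored as an embedding) plus whatever tag was input.
database = {
    "person_042": {"embedding": [0.9, 0.1, 0.3], "tag": "activist"},
}

def identify(camera_embedding, threshold=0.95):
    """Compare a camera sighting against the database; return the alert
    the system would raise, or None if no entry matches."""
    for person_id, record in database.items():
        if cosine_similarity(camera_embedding, record["embedding"]) >= threshold:
            return f"alert: {person_id} tagged '{record['tag']}'"
    return None

print(identify([0.88, 0.12, 0.31]))  # close match: raises the alert
print(identify([0.10, 0.90, 0.20]))  # no match: returns None
```

The unsettling part is how little machinery this takes: one uploaded photo, one free-text tag, and a similarity threshold are enough to turn a camera network into an automated informant.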
And U.S. technology companies are lining up to jump in.
One always hopes technology of this scope will be used benevolently—to track and find missing persons and children. However, it is already being used malevolently. And soon it will even do pre-crime—perception mutated into fate.
And much like how our brains create natural perceptions based on information (data) we collect from people, places, and things, China's surveillance and social credit technologies require a tremendous amount of data.
Perhaps strategies for increasing their data supply, via encroachment into the data mines and cultures of foreign companies, would begin something like this:
- Apple sells out to China and wants to keep our medical info on their servers.
- Mark Zuckerberg stoops to great depths to acquire business for Facebook in China (and the Philippines).
- Foxconn comes to Wisconsin.
And the alarm echoes that of the climate: there will come a point at which we arrive at this forewarned future, when the ability to change will be little more than a digital index in the Internet Archive of a once-perceived fate.
So what can you do?
Read George Soros' speech and then demand better from companies, governments, and elected representatives with your data, your money, and your words.
For more on what machine-vision algorithms are making possible, read this. And for another reason to protect your children from social networks, don't just take Apple CEO Tim Cook's advice; read about how machine learning and deep learning let individuals make videos with anyone's face.
Help protect your image and video data from being co-opted by others by trying the Data Detox.
And if you're looking for an alternative to the convenient Google ecosystem... you won't find it yet—but here's a decent place to start.