Why Are You So Bothered About How Facebook Treated Your Data?
Data belonging to Facebook users was misused by a third-party company, and the world is in an uproar, with many of the platform's 2.2 billion users threatening to leave. Whoa! Hold your horses! What exactly are we upset about here? What didn't we already know?
So, what happened? Well, years ago, a Cambridge University researcher created a personality test for Facebook users. Due to the way the platform was built at the time, the data collected wasn't limited to the 270,000 people who took the test - it extended to their friends too, millions and millions of them. That data later ended up with Cambridge Analytica, which used it to help Donald Trump's presidential campaign. In short, the British company used the information to build user profiles, then fed these tens of millions of people manipulative ads to push them toward voting for Trump. Information that has since appeared in the media shows that the same thing happened in other countries, too, not just the United States.
Ok, so what are we upset about? The fact that we were manipulated to vote one way or another? The fact that Facebook didn't properly secure our data?
If it's the first one, well, we get it. But the thing is - and it's been all over the news in recent years - the Internet is full of trolls that seek to do just that. The US elections were something of a full-blown trial run by Russian trolls to see whether they could actually influence such a big event - and it worked. Tens of thousands of fake accounts, dozens of websites full of fake news, millions upon millions of shares of posts containing inflammatory falsehoods designed to manipulate people. The same thing happened in numerous other countries, including France, and the UK during its Brexit vote. So it's not really something new.
If it's the latter, well, we have news for you! Regardless of whether Facebook's platform allowed third-party companies to pull information on users and their non-consenting friends, your data is still being used - by the social network itself. And it's not just Facebook. It's Google. It's Amazon. It's Twitter. It's Alibaba. It's pretty much any company behind a tool you use online that you're not paying for. In short, if you're not paying for something with money, you're paying with your data - and that data is far more valuable than you think.
Facebook collects your name, gender, date of birth, email, and mobile number before you even start using the platform. It then gathers up everything it can on you: physical location, IP address, what you do on Facebook, what you look at, which ads you click, which posts and pages you like, which friends you interact with. The tracking then carries on to any site and app you use while logged into Facebook, including through cookies. The latest revelation pouring gasoline on the fire on which Facebook currently burns is that, if you sync your phone's contacts, the company also knows who's on your contact list, who you call, how many times, and so on. The list just goes on and on. On top of that, there's the secondary information made up of deductions its systems draw: your likely political affiliation, religious beliefs, financial situation, and so on. It never ends.
Facebook knows your cyber secrets
These companies know all your cyber secrets, even if you don't consciously share them publicly. As defined by cyber-secret futurist Arthur Keleti in his book The Imperfect Secret, cyber secrets are things you wouldn't normally share with others, and they depend on context - the social rules we live with, the religious creeds we believe in, the laws of the state we live in, and so on. They can be white, gray, or black: the first are merely mildly embarrassing, while the last can have very palpable consequences. You really don't want these things shared publicly.
The thing is, we know that Facebook has a lot of data on us, and we know that Google and all the others do too; but somehow, because these records are not physical in the way a 100-page folder is, we don't believe they are as important. There's a glitch in our brains that makes it difficult to properly grasp the magnitude of our data, and we tend to become enraged only when we discover that some company - in this case, Facebook - has done something to put that data in danger.
In a study published by Stanford University under the name "Cognitive Dissonance Reduction as Constraint Satisfaction," the authors explain that reducing the dissonance between two things - in our case, the presence of the data and its importance - involves decreasing the importance of the dissonant relations. The notion isn't new by any means; it was introduced by social psychologist Leon Festinger back in the 1950s, but it still very much applies today.
So, the more data we know Facebook has on us, the less importance we give to it. Then, when something big happens, like the current scandal, the importance of that massive stockpile of data suddenly surges and we become aware of it again, which makes us angry. We trusted you! But why did we?
Do you know how Facebook makes its money? By monetizing that very data you trusted them with. Do you know how Google makes its money? The same way. Facebook's total revenue for 2017 went up 47% to $40.6 billion, $39.9 billion of which came from ads. Google's revenue for the fourth quarter of 2017 alone was up 24% to $32.3 billion. This is money made mostly by selling advertisers access to audiences built from our data. Your data!
We know this! We hear these numbers. We sign on the dotted line of Terms of Service and Privacy Policies that no one really reads, just wanting to use whatever service it is. For free. Well, in exchange for our data, really.
Controlled, not in control
We get the misguided impression that we have some semblance of control over what happens with our data through the privacy and security features these companies offer us. On Facebook, for instance, those privacy tools protect your information from other users, maybe even from some advertisers if you know where to go to block them - but they don't keep the data away from Facebook itself. The same goes for Google, Twitter, and pretty much any other company out there.
There's also the sad fact that no one is on our side. Sure, Mark Zuckerberg might apologize and admit Facebook failed its users, but every little system that works behind the scenes - AIs and other software - is there to help the company process more data, faster.
This is the world we have built for ourselves. For the past few decades, we have accepted a situation that has been getting worse and worse. And really, out of the 2.2 billion people on Facebook, how many are going to follow through and quit the platform after this scandal? Not many. The reality is that while we like to shake our fists in the air and curse at whoever broke our trust, we then simply forget that we were ever in danger. Until the next time.
It's pretty much what has been happening in the United States with the gun control debate. The issue comes up every few weeks (or more frequently nowadays) when there's a mass shooting. Everyone sends thoughts and prayers as if that will fix anything, blames mental illness and the Feds for not catching it sooner, blames schoolmates for not being nicer to the attacker, and so on. And then, when someone brings up gun control laws, another asks that they not politicize the tragedy, while others start screaming that more rigorous background checks somehow equal taking away all guns. Back to square one. Bring it all up again after the next school shooting.
We only seem to remember the importance of the things around us when a crisis hits. We have to be aware of the value of our data at all times and make a decision: either we consciously accept that this is how our data is used and stop complaining every time something happens, or we stop using these tools that rely on selling our data for profit. We can't have it both ways.