Unless you’ve been hiding under a rock or adopted the hermit lifestyle (go you!) this past week, you’ll know about the recent scandal involving Facebook and Cambridge Analytica.
Many of you will, like us, not be all that surprised. There has been little doubt (if little publicly available proof) about how such in-depth data insights were obtained and used by the analytics firm following the surprise success of the Trump campaign in 2016.
Things haven’t been looking great for Facebook for some time now: allegations of promoting fake news, studies of its addictiveness and its part in Tristan Harris’ “race to the bottom of the brain stem”, its effects on anxiety and depression in users, and now, in what must be a nightmare for Zuckerberg, the data privacy issues at the heart of the Cambridge Analytica scandal. Just this week, the company lost 8% of its market value following the revelations.
Facebook’s most valuable asset is your data. It’s an advertising platform (wolf) that doubles as a social platform (sheep’s clothing). Advertisers, marketers, and companies of all sizes pay big money to use the big data collected through the platform to target segments of people. Essentially, this is what Cambridge Analytica has done, in a far more detrimental way than pushing out the latest offer on local pizza.
And just think: what have you shared on Facebook over the years? Photos of your face – from all angles (Facebook has the largest data set for facial recognition) – pictures of your family, friends, and pets. Probably where you live, check-in data on nights out and restaurants, holiday destinations, and, to the dismay of your virtual friends, what you’re having for tea. All of this data you have handed over for free gives away a blueprint of your very being – I’m especially looking at you, oversharers; you know who you are!
But the truth is, you don’t even have to post much to give away insights about yourself. It doesn’t take much for a clever machine learning algorithm to work out your political leanings from the posts you like and share, the groups you join, or the things you comment on.
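To see how little it takes, here is a deliberately minimal sketch of the idea: given a handful of labelled examples of pages people like, a crude nearest-behaviour classifier can already guess a leaning. All page names, labels, and training pairs here are invented for illustration; real systems use far richer features and proper statistical models.

```python
from collections import Counter

# Hypothetical training data: (set of pages liked, known political leaning).
# Every page name and label below is made up for this sketch.
training = [
    ({"GunRightsNow", "RuralLiving"}, "right"),
    ({"GunRightsNow", "TruckTalk"}, "right"),
    ({"ClimateAction", "UrbanCycling"}, "left"),
    ({"ClimateAction", "ArtHouseFilms"}, "left"),
]

def leaning_scores(likes):
    """Score each leaning by counting how many liked pages are shared
    with training users of that leaning."""
    scores = Counter()
    for pages, label in training:
        scores[label] += len(pages & likes)
    return scores

def predict(likes):
    """Return the leaning with the highest overlap score."""
    return leaning_scores(likes).most_common(1)[0][0]

# A user who never posted anything political, only liked two pages:
print(predict({"TruckTalk", "RuralLiving"}))  # → right
```

Even this toy version needs only page likes – no posts, no comments – which is exactly why “I don’t post anything political” offers so little protection.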
With 2.2 billion monthly active users and 14 years in operation, Facebook is an Aladdin’s cave of data. But with great power comes great responsibility, and tech giants like Facebook have a social responsibility to protect the human beings behind the data points. They need to serve users and clients in equal measure to ensure a fair service for everyone, but recent events show an imbalanced scale, with users not protected to the degree you’d expect.
This is where Cambridge Analytica and researcher Aleksandr Kogan come in. Kogan had permission from Facebook to gather data for research via an app. Reportedly, users were paid to download the app and fill out a personality survey, which let Kogan capture their underlying data (their profiles, history, actions taken on the platform and so on). The ID numbers behind their profiles were fed into an algorithm that predicted how these people were likely to vote. But the app didn’t just take data from the respondents who had agreed to hand it over – it took their friends’ data too, crawling entire social networks. Whistleblower Christopher Wylie estimates that this captured, on average, 300 people per respondent. He told Channel 4 that around 50 million Facebook records were gathered using this method over the span of a couple of months.
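Those two figures imply something striking about the multiplier effect of friend-crawling. A back-of-the-envelope calculation (using only the numbers quoted above, and assuming each record belongs to a distinct person) shows how few paying survey-takers were needed:

```python
# Figures quoted in the piece (Wylie's estimates):
total_records = 50_000_000        # Facebook records gathered overall
friends_per_respondent = 300      # additional people captured per survey-taker

# Each respondent yields their own record plus ~300 friends' records,
# so one consenting user accounts for roughly 301 records.
records_per_respondent = 1 + friends_per_respondent

implied_respondents = total_records / records_per_respondent
print(round(implied_respondents))  # on the order of 166,000 survey-takers
```

In other words, consent from well under 200,000 people was enough – under these assumptions – to reach tens of millions, which is the crux of why the friends-data permission mattered so much.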
We’ve moved on from the rather obvious propaganda of the World Wars (your country needs YOU); instead, social media gives companies like Cambridge Analytica a stage from which to insidiously influence the political landscape around the world – most notoriously the 2016 US Presidential Election, where data harvested from millions of profiles fed bespoke adverts and the micro-targeting of swing voters.
Cambridge Analytica has reportedly influenced political campaigns in Nigeria, Kenya, the Czech Republic, and India, to name a few – and potentially Brexit, which, given the firm’s political leanings and the surprise result, wouldn’t be much of a revelation at this point.
An algorithm that learns from the curated behavioural data we leave on social media can easily tell us more about an individual than they might expect. Persuading voters with facts is fair, but using something as personal as psychological traits, and exploiting individuals’ cognitive processing through their ‘hopes and fears’, raises many ethical questions. Regardless of whether you volunteered this information online yourself, should it really be used to have such widespread impact on something like an election?
Channel 4’s secret filming of Cambridge Analytica below reveals some very uncomfortable truths about this analytics firm. In all honesty, you should probably just watch this for yourself because anything less than a full transcript wouldn’t do it justice.
Watched it? Great!
And whilst Cambridge Analytica are the ones who manipulated the data sets to pursue political goals, Facebook essentially empowered the company to do so, albeit unwittingly, through lax internal data controls.
Why did Facebook allow an app to have such permissions? Why did the harvesting of millions of files go unquestioned? Could an internal investigation not have revealed it sooner? Whilst this isn’t a straightforward data breach as such – users did give a degree of consent – Kogan, according to Facebook, violated its policies by passing the data to Cambridge Analytica. It has also been revealed that some of the data may still have existed at the time of discovery (although probably not now), even though Facebook had been told it had been deleted.
What we post on social media is being used in ways many would never imagine, and unfortunately, many probably don’t even care. In light of the recent scandal, we need to rethink how we use social media and how it integrates into our lives, and businesses need to rethink how they handle data. It will be interesting to see how this story unfolds: the repercussions for Cambridge Analytica, and moreover the changes Facebook will have to implement to regain public trust.