Following an announcement that the U.K.’s Information Commissioner has decided to fine Facebook $664,000 for its role in the Cambridge Analytica scandal, much of the chatter has focused on the small size of the fine and how laughably easy it will be for the tech giant to shrug off.
In fact, the fine was the maximum the ICO had the power to levy. But beyond that, it’s worth reading the two reports the agency issued, because the details are quite shocking even after months of controversy, and they show just how profound and wide-ranging the problems with politics and data-sharing still are.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']“We are at a crossroads,” Information Commissioner Elizabeth Denham said in a statement. “Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes. New technologies that use data analytics to micro-target people give campaign groups the ability to connect with individual voters. But this cannot be at the expense of transparency, fairness, and compliance with the law.”
The agency issued two reports yesterday:
- A progress report that offers details about the specific case at hand involving Cambridge Analytica.
- A second, companion report called “Democracy disrupted? Personal information and political influence,” which takes a much wider view of the subject across the British political spectrum.
Let’s start with the first report.
To begin with, the fine comes as part of an ongoing investigation, not one that is finished. The ICO made the unusual decision to disclose the fine before the investigation was concluded because the case was of such national and international importance.
Meanwhile, the ICO is expanding its work and has sent letters to Britain’s major political parties warning them that they will be audited over their use of personal data.
The report says: “We have concluded that there are risks in relation to the processing of personal data by many political parties. Particular concerns include: the purchasing of marketing lists and lifestyle information from data brokers without sufficient due diligence, a lack of fair processing, and use of third-party data analytics companies with insufficient checks around consent.”
The ICO is also investigating whether a British insurance company illegally shared its customer data with the Brexit Leave.EU campaign to help the latter target voters online. In addition, the agency is looking into what it believes may have been data abuses by the Vote Leave campaign. However, at least one pro-Remain organization is also under scrutiny.
As part of this process, the ICO is scrutinizing Google, Snapchat, and Twitter, in addition to Facebook. Worth noting is that when asked about ads from Cambridge Analytica, Twitter told the ICO it had banned access to its data products and removed Cambridge Analytica’s ads because the company “determined that Cambridge Analytica operated a business model that inherently conflicted with acceptable Twitter Ads business practices.”
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']
The amount of information accessed by the controversial personality quiz developed by Cambridge University researcher Aleksandr Kogan was staggering. From people who logged in using their Facebook profile, it was able to grab their name, gender, birth date, current city, photographs in which they were tagged, Pages they had liked, posts on their timelines, news feed posts, friends lists, email addresses, and Facebook messages.
It also managed to collect the following from those users’ friends: public profile data, including name and gender; birth date; current city, if the friends had chosen to add this information to their profile; photographs in which the friends were tagged; and Pages that the friends had liked.
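For context on how a single login prompt could expose that much, the categories above roughly map onto the permission scopes an app could request under Facebook’s pre-2015 Graph API. The sketch below is illustrative only; the scope names are an assumption based on that era’s permission model, not something stated in the ICO report or taken from Kogan’s actual app.

```python
# Illustrative only: the kinds of OAuth permission scopes a pre-2015 Facebook
# Login app could request to reach the data categories the ICO describes.
# Scope names follow the old Graph API v1.0 permission model and are an
# assumption for illustration, not drawn from the ICO report or Kogan's code.
USER_SCOPES = [
    "email",          # email address
    "user_birthday",  # birth date
    "user_location",  # current city
    "user_photos",    # photos the user is tagged in
    "user_likes",     # Pages the user has liked
    "user_friends",   # friends list
    "read_stream",    # timeline and news feed posts
    "read_mailbox",   # private messages
]
FRIEND_SCOPES = [
    "friends_birthday",  # friends' birth dates, if shared on their profiles
    "friends_location",  # friends' current city, if shared
    "friends_photos",    # photos friends are tagged in
    "friends_likes",     # Pages friends have liked
]

# A single consent screen shown to one quiz-taker could therefore expose data
# about that person and, at the time, much of their friends' data as well.
print(",".join(USER_SCOPES + FRIEND_SCOPES))
```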
“For some of this Facebook data, estimated to involve around 30 million users, the personality test results were paired with Facebook data to seek out psychological patterns and build models,” the report says. “GSR (the company Kogan started) shared data with SCL Elections Ltd (Cambridge Analytica’s parent company) in at least four discrete disclosures. It is believed it then combined this with other sources of data, such as voter records held by SCL, to help inform targeting of individuals in key marginal states with personalised advertising during the presidential election process.”
How did Facebook react? Following an investigation by Irish officials in 2014, Facebook changed its data-sharing policies. But it also gave developers of existing apps, including Kogan’s Global Science Research company, a one-year grace period to wind things down.
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']
“This change included a one-year grace period for many pre-existing apps, which gave them until May 2015 to comply with the new policy,” the report says. “It was during this grace period that the GSR app accessed the majority of its information.”
The ICO is continuing to investigate who Kogan shared that data with, but the report indicates it was shared widely with other parties. The agency is also expanding its scrutiny of Cambridge University and its Psychometric Centre, where Kogan and other researchers worked.
The Psychometric Centre was set up in 2005, just as social media was coming into its own. And it proved to be perfect timing, as the institute sought to develop psychological assessment tools and services for the digital era. The ICO believes this work may have run amok, far beyond just Kogan’s contribution, and is therefore auditing the center’s data use.
“As our investigation has broadened with examination of Dr Kogan’s actions and his use of Cambridge University credentials to lend support to his actions, we have engaged with the university at senior level,” the report says. “Our engagement with the university (and others in the U.K. and abroad) has identified that there are some common issues to tackle.”
[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']
In addition, the ICO is now investigating a number of third-party data brokers and credit agencies that appear to have also unlawfully sold user data, information that was then mixed and matched with other digital data, such as the Facebook data, to further refine and target messages.
It should be said that the ICO’s work has been hampered by cross-jurisdictional issues. Facebook, though it has cooperated, insists that it is governed by Irish data rules, and not the ICO, because its European headquarters is in Dublin. Meanwhile, the ICO has been trying to investigate the web of affiliated corporations that passed this data around, including AggregateIQ in Canada, which also says it is not within the ICO’s jurisdiction, and the University of Mississippi, which may have also received some of the data in question.
Turning to the second report, the authors attempt to map out more broadly and in greater detail just how political parties gather and use personal data.
[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']
This data is blended to generate sophisticated profiles of individuals, which allows for unprecedented micro-targeting of messages. While the absolute numbers can appear small, the report says, in an era of close, hard-fought elections such campaigns can be enough to make the difference.
This brings us back to Facebook. The company offers what it calls its Custom Audience tool for political parties (and, really, any marketer). U.K. political parties love this tool; in 2017, they spent three times as much on it as they did on Google ads. Custom Audience allows a marketer to create a specific advertising target by using “existing data about an individual possessed by that organisation,” which is then matched with Facebook data.
[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":2372125,"post_type":"story","post_chan":"none","tags":"category-news-politics","ai":false,"category":"none","all_categories":"big-data,bots,business,cloud,commerce,marketing,media,security,social,","session":"A"}']
The report says: “The Custom Audience service allows an advertiser to target adverts to individuals via multiple methods, the most common being to upload a list of email addresses, phone numbers, or user IDs that they and the advertiser already possess to Facebook. If Facebook is able to match information in its database with that uploaded by the advertiser, then those individuals may see an advert from that advertiser the next time they log into their account.”
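To make that matching step concrete, here is a minimal, hypothetical sketch of how an uploaded contact list could be matched against a platform’s own records to build such an audience. It is not Facebook’s actual implementation; the function and variable names are invented for illustration, and the hashing step is an assumption about how list matching is typically done.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase and trim the address, then hash it so raw contact data is never exchanged."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical data: an advertiser's contact list and the platform's own user records.
advertiser_list = ["Alice@example.com", "bob@example.com ", "carol@example.org"]
platform_users = {"alice@example.com": "user_101", "dave@example.net": "user_102"}

# The advertiser uploads only hashes of its list, not the raw addresses.
uploaded_hashes = {normalize_and_hash(e) for e in advertiser_list}

# The platform hashes its own records the same way and keeps the intersection
# as the audience eligible to see the advertiser's messages.
custom_audience = [
    user_id
    for email, user_id in platform_users.items()
    if normalize_and_hash(email) in uploaded_hashes
]

print(custom_audience)  # ['user_101']
```

The point of the hashing in this sketch is that the two sides compare fingerprints of contact data rather than raw records, which is the basis of the claim that the platform never “sees” the uploaded list, even though the resulting audience, and the targeting it enables, is very real.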
Facebook claims it is innocent in all of this because it doesn’t see the actual data. But users who are targeted never know they are in this custom audience, or that information gathered from outside Facebook is now being used on Facebook to target them.
From there, Facebook also offers a “Partner Categories” service. This lets advertisers pull in other sources of third-party information from companies such as Acxiom, Experian, and Oracle Data Cloud on top of all the data they’ve already shoved into Facebook to further refine and target their messaging.
“Whilst users were informed that their data would be used for commercial advertising, it was not clear that political advertising would take place on the platform,” the report says. “The ICO also found that despite a significant amount of privacy information and controls being made available, overall they did not effectively inform the users about the likely uses of their personal information … The ICO has concluded that Facebook has not been sufficiently transparent to enable users to understand how and why they might be targeted by a political party or campaign.”
The ICO is proposing a number of reforms to deal with the slippery nature of all this data sharing in politics. But as the reports make clear, we have moved into an era full of gray areas and morally dubious arrangements between politicians and marketers and digital platforms.
Each time we go online, we are unwittingly consenting to share data that is being used to manipulate us. And even after the past two years of controversy and revelations, most of us have only the slightest grasp of the size and power of the machine that has been built behind the scenes to take advantage of the vast treasure chest of personal information we continue to place online without the least hesitation.