Show of hands, who was surprised to learn that Facebook and those associated with the social media company – particularly app developers – collected data, and lots of it, from user accounts? Anyone? Anyone?
Social media is a rich source of information, and information is valuable currency to the likes of Facebook, Twitter, Google and really any company doing business in the digital age. From the data users provide to open and build their accounts, to the quizzes they take – which figure in history do you most closely resemble, and if you were a dog, what kind would you be – to the clicks, the “likes” and the “lols,” there's a lot of data to be had online. And, for the most part, it's freely given by those caught up in the “social media whirl,” who willingly turn a blind eye to, or ignore, the niggling feeling that the platforms they use know a little too much about them and share that knowledge a little too freely.
In a 2016 internal memo aptly titled “The Ugly,” Facebook Vice President Andrew Bosworth justified the company's data collection practices as germane to its overarching goal of connecting people and as fuel for its skyrocketing growth.
“The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good,” the “Boz,” wrote.
But when it surfaced in March that Cambridge Analytica, a data analytics firm used by both the Trump and Brexit Leave campaigns, violated Facebook policies by collecting personal data from the accounts of 50 million Americans (since revised to 87 million) without their permission, the tone of that memo seemed darker still and clarified the company's willingness to sacrifice user privacy in the pursuit of “connectedness” and, yes, profit.
Soon after, Facebook seemed to add credence to claims that it had a data collection and transparency problem when it admitted to recording the call logs of Android users of Messenger, though it defended “call and text history logging” as “part of an opt-in feature for people using Messenger or Facebook Lite on Android.”
Recording logs helps users “find and stay connected with the people [they] care about, and provides [them] with a better experience across Facebook,” the company says.
The tale of Cambridge Analytica in all its sordid intrigue – the company seemed to be a darling of political campaigns and its CEO bragged about its dark actions – brought a mea culpa from Facebook CEO Mark Zuckerberg. “We have a responsibility to protect your data, and if we can't then we don't deserve to serve you,” he writes in a post.
Zuckerberg also pledged to regain the public trust and the company produced a flurry of upgrades and moves meant to prove or at least show that Facebook was serious about improving privacy and self-policing. And, it just might have done what politicians have failed to do and what technology companies have resisted – compel privacy regulation, legislation and oversight of social media.
“I am not sure we shouldn't be regulated,” Zuckerberg now says.
That's a sentiment echoed by Apple CEO Tim Cook, a proponent of self-regulation, who recently said on MSNBC that the Facebook debacle is “so big” that it may well be time for “well-crafted regulation.” Sen. John Kennedy, R-La., expressed concern on Face the Nation over the “privacy issue” exposed by Facebook, saying that it and “the propagandist issue” might be “too big for Facebook to fix” by itself.
“I don't want to regulate them half to death,” he says. “But we have a problem.”
A day of reckoning
As the story goes, an app developed by Cambridge University professor Aleksandr Kogan called thisisyourdigitallife harvested data for the firm, owned in part by hedge fund operator Robert Mercer and once led by former White House adviser Steve Bannon. About 270,000 Facebook users signed up to take a paid personality test through the app. Their data, and that of their friends, numbering in the millions, was passed along to Cambridge Analytica.
“We exploited Facebook to harvest millions of people's profiles. And built models to exploit what we knew about them and target their inner demons,” whistleblower Christopher Wylie, who worked closely with Kogan, told the Observer. “That was the basis the entire company was built on.”
By passing along information from users who had not given permission to a third party and then also not properly deleting that data, Facebook says Kogan and Cambridge Analytica broke its rules.
“Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules,” Facebook Vice President and Deputy General Counsel Paul Grewal says in a post announcing the suspension of Cambridge Analytica, its parent Strategic Communication Laboratories (SCL), Kogan and Wylie. “By passing information on to a third party, including SCL/Cambridge Analytica and Christopher Wylie of Eunoia Technologies, he violated our platform policies.”
When Facebook first learned of the violation back in 2015, it removed Kogan's app “and demanded certifications from Kogan and all parties he had given data to that the information had been destroyed,” Grewal writes. “Cambridge Analytica, Kogan and Wylie all certified to us that they destroyed the data.”
But apparently, that was not the case. “It doesn't matter what this data leakage would have proven or not proven,” says Evgeny Chereshnev, CEO at Biolink.Tech. “The point is that there was always the opportunity, and possibility, that certain data would be extracted from Facebook by hackers or third-party providers that we, the users, were not aware of.”
While “it has been said that it's data taken from Facebook without the users' consent,” Chereshnev calls the claim “both true and not true.”
If you read “the license agreement, when you sign up to Facebook, you would understand that you have absolutely no rights when it comes to your data; your information, what you post and how information is gathered about you. Facebook can analyze and use this data any way it wants,” he says.
Lawmakers like Sen. Kirsten Gillibrand, D-N.Y., immediately took Facebook to task and called for greater transparency. Famed whistleblower Edward Snowden, currently in exile in Russia, fired off a tweet saying that “businesses that make money by collecting and selling detailed records of private lives were once plainly described as ‘surveillance companies.'” He calls “their rebranding as ‘social media'” entities “the most successful deception since the Department of War.”
The privacy debacle was all the more intriguing given Cambridge Analytica's prominent and influential clientele – and hints at shady practices, succinctly detailed in a series of undercover videos in which the company's CEO Alexander Nix boasts about using sex workers and setting up stings to snare politicians.
Cambridge Analytica has been widely credited with helping the Trump campaign pull off an election victory over rival Hillary Clinton. And indeed, the president's son-in-law and adviser Jared Kushner praised the operation last year in an interview with Forbes.
“We found that Facebook and digital targeting were the most effective ways to reach the audiences. After the primary, we started ramping up because we knew that doing a national campaign is different than doing a primary campaign. That was when we formalized the system because we had to ramp up for digital fundraising,” Kushner said. “We brought in Cambridge Analytica. I called some of my friends from Silicon Valley who were some of the best digital marketers in the world. And I asked them how to scale this stuff,” he said. “Doing it state by state is not that hard. But scaling is a very, very hard thing. They gave me a lot of their subcontractors and I built in Austin a data hub that would complement the RNC's data hub. We had about 100 people in that office, which nobody knew about, until towards the end. We used that as the nerve center that drove a lot of the deployment of our ground game resources.”
Rep. Ted Lieu, D-Calif., points to Kushner's role and calls the story “disturbing.”
The consequences to Facebook for its role in the messy affair – and the business practices that tilled fertile ground for data privacy abuse – have been wide-ranging, causing its stock to dip, tarnishing its reputation and even costing it its CISO, Alex Stamos, who leaves his post in August, reportedly because of differences with other managers over the issue of transparency.
After taking a pounding in the wake of the Cambridge Analytica fiasco for lax data collection policies, the company vowed to win back trust.
The social media giant's first step was banishing the major players associated with the data analytics firm and raising the visibility of its privacy tools.
“We've redesigned our entire settings menu on mobile devices from top to bottom to make things easier to find. Instead of having settings spread across nearly 20 different screens, they're now accessible from a single place,” Facebook Vice President and Chief Privacy Officer (CPO) Erin Egan and Vice President and Deputy General Counsel Ashlie Beringer write. “We've also cleaned up outdated settings so it's clear what information can and can't be shared with apps.”
Noting that many of the changes had been in the works for a while, the two say the company would make privacy and security policies easier to find with a new Privacy Shortcuts menu that would take users to relevant information in just a few clicks. From the menu, the post said, users can:
Make their accounts more secure by adding layers of protection, including two-factor authentication. Users will be notified if someone logs in to an account from an unrecognized device.
Control personal information by reviewing information and items that have been shared and deleted.
Control the ads they see by managing the information Facebook uses to show users ads using Ad preferences.
Manage who sees posts and profile information. The social media firm stresses that data belongs to the user.
Facebook also says it will offer a set of tools under “Access Your Information” that will let users access, manage and delete comments, posts and other data.
“We're also making it easier to download the data you've shared with Facebook,” Egan and Beringer write.
In the future, the company will change its terms of service so that users can understand how data is collected and used.
“We'll also update our data policy to better spell out what data we collect and how we use it,” the post says, noting that the company has worked with privacy experts, regulators and legislators to come up with tools and updates to improve privacy and avoid a scenario like the one that unfolded with Cambridge Analytica. “These updates are about transparency – not about gaining new rights to collect, use, or share data.”
The company also expanded its bug bounty program to include data misuse by app developers, which security pros say could set a trend in the industry.
“Facebook's bug bounty program will expand so that people can also report to us if they find misuses of data by app developers,” the company wrote in a blog.
Craig Young, computer security researcher for Tripwire's Vulnerability and Exposure Research Team (VERT), says the “move by Facebook really makes a lot of sense” to him. “By expanding their bounty program to include data misuse by app developers, Facebook may have found a way to mobilize their community to self-police,” he says. “It will be interesting to see if this spurs new bug bounty participation, including people less technical than the typical bug hunter.”
Young says the social media company's move, “could be the start of a trend toward more policy-oriented bug bounties from social media platforms.”
Facebook also says that if it finds “developers that misused personally identifiable information (PII), we will ban them from our platform,” and it takes that a step farther: if the company removes “an app for misusing data, we will notify everyone who used it.”
The weight of the accusations and innuendo reportedly has prompted the beleaguered Zuckerberg to agree to testify before the Senate Judiciary Committee.
But even a public apology and a pledge to regain public trust by implementing greater privacy protections aren't enough to fend off probes by the Federal Trade Commission (FTC), the Massachusetts Attorney General, and the U.K. Office of the Information Commissioner as well as a lawsuit by the state of Illinois.
The suit claims that Facebook “not only allowed” but also “encouraged” sketchy data collection practices.
“Cambridge Analytica deliberately misled Facebook users so it could build psychological profiles of the user and their friends, and Facebook did not stop it,” Cook County State's Attorney Kimberly M. Foxx says. “This blatant deception violated Illinois law and more importantly violated the privacy of Illinois residents. Cambridge Analytica and Facebook must be held accountable for their actions.”
The suit targets the two companies, and SCL Group Limited, for a “fraudulent scheme” intended “to harvest the data of millions of American voters.”
The data analytics firm “could access [a] nearly unlimited trove of data,” allegedly private and protected by Facebook's stated policies for users and developers, by “using Facebook's existing developer tools, an open secret that was well known to developers.”
Recasting Facebook from social media company to “the largest data mining operation in existence,” the suit says the company “sought to keep developers building on its platform and provide companies with all the tools they need to influence and manipulate user behavior.”
The FTC is investigating whether Facebook violated a 2011 consent decree, which required it to obtain user consent when privacy settings were changed or risk paying $40,000 per day for each violation.
“The FTC is firmly and fully committed to using all of its tools to protect the privacy of consumers,” which includes “enforcement action against companies that fail to honor their privacy promises, including to comply with Privacy Shield, or that engage in unfair acts that cause substantial injury to consumers in violation of the FTC Act,” Tom Pahl, acting director of the FTC's Bureau of Consumer Protection, said in a March statement. “Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements.”
Bart Lazar, a privacy attorney with Seyfarth Shaw, says “not that much has changed” in the two decades since he was the lead attorney defending GeoCities against the FTC's privacy action. “The issues are really the same. The last decade or so, the focus has been on security as opposed to privacy,” he says. “The Facebook situation brings to bear some very basic privacy issues, such as the clarity of privacy notices, and the importance of serious due diligence with respect to any third party or service provider to whom personal information is disclosed.”
The U.S. hasn't established baseline privacy protections but rather relies on a “hodgepodge” of state and federal laws. “The FTC has made itself out to be the sheriff in the privacy space,” Lazar says, questioning whether “the FTC enforcing its broad Section 5 authority, without real legal guidance” actually helps consumers and businesses.
Revelations about Facebook's data collection practices and its relationship with Cambridge Analytica have renewed calls for legislative action to provide that type of guidance. Tech companies – and other organizations – have long resisted additional regulations, concerned that they could kill tech innovation or be manipulated by political forces to control content. But now, there is growing support for tougher privacy laws, a la Europe's GDPR, set to go into effect this month.
The American Civil Liberties Union (ACLU) recently pushed lawmakers to press Zuckerberg on privacy when he testified before the House Energy and Commerce Committee, saying it “believes this is an opportunity to ensure Facebook is addressing key issues around privacy and discriminatory practices.”