A U.K. parliament report condemning a wide range of Facebook's actions called for closer regulation of the social media giant, stating that the company often ignored its own privacy policy and that its executives were less than forthcoming when testifying before a parliamentary committee.
The House of Commons Digital, Culture, Media and Sport Committee’s Disinformation and ‘fake news’: Final Report called out several companies, but specifically targeted Facebook for its actions during recent election cycles and its handling of users’ privacy in order to benefit certain clients. An interim version of the report was issued last summer but was updated and reissued because the Committee said its recommendations had not been acted upon.
The Committee claimed that internal Facebook emails generated between 2011 and 2015 showed the company intentionally ignored its own privacy policies in order to benefit preferred app developers while damaging others, such as Six4Three.
The report found “evidence to indicate that the company was willing to: override its users’ privacy settings in order to transfer data to some app developers; to charge high prices in advertising to some developers, for the exchange of data, and starve some developers (such as Six4Three) of that data, contributing to them losing their business.”
From this evidence, the Members of Parliament concluded Facebook “intentionally and knowingly violated both data privacy and anti-competition laws.”
Based on this information, the Committee recommended that the Information Commissioner's Office investigate how Facebook uses its users’ data, conduct a comprehensive audit of the advertising market on social media, and examine whether Facebook has been involved in anti-competitive practices.
The Committee also expressed its dismay that it had to repeatedly request information from Facebook about Russian activity on the site during the 2016 U.S. election, and that senior Facebook executives who testified, specifically Simon Milner, Facebook’s policy director for the U.K., Middle East and Africa, were not truthful.
“Given the information contained in the New York Times article and the information we have received from Six4Three, we believe that Facebook knew that there was evidence of overseas interference and that Mr Milner misled us when he gave evidence in February 2018,” the report said.
The report recommended the establishment of clear legal liabilities for tech companies “to act against harmful or illegal content on their sites” and called for a compulsory Code of Ethics defining what constitutes harmful content.
An independent regulator, it said, “should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies in breach of the code.”
Companies that fail to meet these obligations would face severe fines.