Data Security

Are Tech Companies Responsible for All User Information?


By Katherine Teitler

The families of five terrorist attack victims filed a lawsuit in U.S. District Court on Monday. The defendant: Facebook. The families, claiming that the social media giant enabled Palestinian militants to carry out deadly attacks in Israel, are suing for more than $1 billion, calling into question the responsibility of technology companies when it comes to security.

This is a different quandary than that of IoT; internet-connected things monitoring heart rates and blood pressure, or even door locks and spinning turbines, have the ability to cause physical harm if tampered with. Facebook, though, can’t affect software, hardware, or devices and cause things to implode or explode; it is, for all intents and purposes, a bulletin board for people all across the world to post thoughts/ideas/intentions/pictures of food/updates/requests/successes on Pokémon Go. Of course, as a social networking site, Facebook does have a social responsibility to reject, refuse, and report questionable content. The company posts very clear Community Standards, which cover everything from direct threats and dangerous organizations to hate speech, criminal activity, and violence. The company, which has not commented directly on the lawsuit, has said it “doesn’t want violent messages on its website.”

Weeding out the badness

Back in 2013, Facebook said it would “do more” to disallow objectionable content, but it also admitted that much work needed to be done when it came to flagging and/or removing content contrary to its standards. At present, most content is removed through user policing rather than through a dedicated effort by Facebook employees. Freedom of speech is a right afforded to U.S. citizens (including the victims named in the aforementioned lawsuit), but where is the line? What role do tech companies play when it comes to free speech, and how responsible are social media companies for users’ actions?

There’s more than one side to this debate. On the one hand, technology companies are (or should be) responsible for the security and privacy of customer data when that data is given to them as part of a business dealing or transaction. When you get down to it, though, technology companies aren’t the only ones responsible for the security and privacy of data. In this regard, all companies are tech companies; it’s incumbent upon data holders (businesses) to retain skilled staff who can ensure data doesn’t go flying out the window due to negligence or ineptitude.

Software and hardware companies have an even greater responsibility; non-tech companies rely on software/hardware providers to develop secure products and services upon which they, in turn, can rely to provide a safe, secure environment for their customers’, partners’, and employees’ data. If the hardware/software itself isn’t capable of reasonable security, the foundation is shaky and everything built on top—whether it’s an Amazon Prime purchase or health records—is significantly more difficult (in some cases, impossible) to fully secure. Companies of every ilk, regardless of industry sector, understand this responsibility, and one would be hard-pressed to find a legitimate business that does not display its terms of use and privacy policy right on the homepage of its website. Plenty of class action lawsuits have occurred because businesses have failed (either through negligence or because it’s just so damn hard) to secure consumers’ data. There were repercussions, but human life was not lost.

How far is too far?

But now we’re talking about death and human suffering. This gets tricky because it’s not only lives on the line; it’s emotion and sentiment and morals and tightly held beliefs. What reasonable human being would not want to stop a terrorist act? Who, but the terrorists themselves, would say it’s OK to allow them to openly plan an attack? The goofy saying “If you see something, say something” should always apply when violence, hatred, and threats are spewed. It’s everyone’s responsibility, but it’s not an information security issue. Facebook, Apple, and others have stood up before the Department of Justice vowing to give users the tools to protect themselves. This doesn’t mean these tech companies are shirking responsibility. In fact, it’s the opposite: most individuals and plenty of non-tech companies can’t provide the encryption or properly implement the firewalls, IDS/IPS, or content filtering necessary to keep digital attacks at bay, so tech companies are stepping up to the plate and saying, “We’ll help.”

By not responding to the lawsuit, Facebook is in no way saying that the terrorists had any right to do what they did, nor is it condoning the use of Facebook as a platform for planning unspeakable acts such as the ones carried out. The Community Standards are very clear. When Facebook is eventually brought to court, it will fight the claims and use all resources necessary to be cleared of any wrongdoing. As a tech company providing an open forum for people all over the world to interact, its responsibility is to present a digitally secure environment, one where users can expect a certain level of privacy and from which information kept private cannot be accessed by prying eyes. Even Facebook’s. Facebook understands this, and even though its information security practices have been called into question over the years, the company is seemingly taking greater steps to allow users to connect privately and securely.

Censorship vs. security

Should tech companies play a role in free speech? Are social media companies responsible for users’ actions? This is a censorship issue and not a technological one. Blaming a social media bulletin board for providing a bulletin board, then requiring it to keep users’ interactions not only secure and private but also on “the up and up,” is going too far. Information security is a massive battle, one which most legitimate businesses are losing. U.S. citizens’ right to privacy is difficult enough to protect as it is, much less with a layer of censorship added on top. The families of the victims have a right to be angry, distraught, and frustrated; these feelings, though, are misplaced. Let’s not start asking tech companies, and in particular social media companies, to also police the digital communications we fight so hard to ask them to keep private and secure.

 
