Don't Let a Data Debacle Like Facebook's Happen at Your Company
In this case, it's not a good idea to "follow" Facebook.
By David Gorbet
Edited by Dan Bova
Opinions expressed by Entrepreneur contributors are their own.
Everyone knows that losing control of customer data is bad. So, why does it keep happening? It seems every week we're hearing about a new data debacle affecting millions of people. This week, Facebook and Cambridge Analytica are in the news, but even as their drama plays out, we're hearing about a new breach of 880,000 payment cards from online travel site Orbitz. Last month alone, we saw 39 health data breaches impacting over 350,000 patient records. That same month, we heard about a breach of 112,000 FedEx customer records. The list goes on.
Facebook would like us to think of their recent breach as less significant because it wasn't, strictly speaking, a security breach. That's cold comfort to the 50 million Facebook users whose data was inappropriately exploited. As I emphasized last May in my governance-focused keynote at our user conference, it's time to take a step back and take a hard look at how we govern the data entrusted to us. Here are five lessons from the Facebook and Cambridge Analytica debacle that every organization should heed.
Related: The Co-Founder of Whatsapp, Which Is Owned by Facebook, Tweets '#deleteFacebook'
Governance is about more than security.
We're constantly hearing about security breaches, and indeed the threat landscape for security breaches is getting worse. The average cost of a data breach across all industries in the United States now tops $7 million, and it's estimated that about one-third of companies globally will experience at least one material breach involving over 10,000 records in the next 24 months.
According to The New York Times, Facebook's chief security officer insists that this incident is not a breach: "The recent Cambridge Analytica stories by the NY Times and The Guardian are important and powerful, but it is incorrect to call this a 'breach' under any reasonable definition of the term." Sure, this may not have been a security breach, but it was certainly a breach of policy, and a serious breach of trust. From the Facebook user's perspective, it may actually be worse than a security breach, because Cambridge Analytica didn't even have to go to the effort of hacking Facebook to obtain this data.
Related: 20 Revelations From Facebook CEO Mark Zuckerberg's Apology Tour
Organizations that are entrusted with customer data have an obligation to govern it well, and that means more than just data security. It means creating thoughtful and comprehensive policies about how data is to be managed and ensuring that controls are in place to enforce and audit those policies.
Which brings me to lesson No. 2.
Fit for purpose is about more than data quality.
Data quality is a huge governance issue. There's a well-known computer science adage, GIGO: garbage in, garbage out. Feed poor-quality data into a system and you get poor-quality results, and those results wreak havoc. Poor data quality cost the State of California $6 million in excess vacation and sick leave payments. It caused the City of San Jose's failure to evacuate residents during last year's historic floods. In some tragic cases, it costs lives. But, data quality is not an all-or-nothing thing. Data that is fit for one purpose may be unfit for another. As an industry, we're just starting to reach a level of maturity where we can think about data quality in terms of its use, not just its overall accuracy.
But, we need to go much further than that. The Facebook data Cambridge Analytica used may have been of sufficient quality to achieve its ends, but was it legal to use it? Was it ethical? Organizations need to define "fit for purpose" more broadly than "is it possible to use this data for this purpose?" Fit for purpose must also include "is it legal, ethical, and appropriate to use this data for this purpose?" If your organization manages data entrusted to you by others, you have an obligation to think about this explicitly. The most mature organizations take this decision out of the line of business and give it to an ethics or governance body, to avoid the conflicts of interest that would otherwise arise.
The bottom line is: Just because you can doesn't mean you should.
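To make that concrete, here's a deliberately simplified sketch in Python of what a "fit for purpose" gate might look like when it tests authorization and consent, not just usability. Every name and policy in it is hypothetical; it illustrates the principle, not any real system.

```python
# An illustrative sketch (not from any real product): a "fit for purpose"
# gate that checks more than data quality before a dataset can be used.
# All names here (DataRequest, PURPOSE_POLICY, etc.) are hypothetical.

from dataclasses import dataclass

# Hypothetical policy table: which purposes each dataset may be used for.
PURPOSE_POLICY = {
    "user_profiles": {"product_analytics", "fraud_detection"},
    "payment_cards": {"payment_processing"},
}

@dataclass
class DataRequest:
    dataset: str          # which dataset is being requested
    purpose: str          # the declared business purpose
    user_consented: bool  # did the data subjects consent to this use?

def is_fit_for_purpose(req: DataRequest) -> bool:
    """'Fit for purpose' means more than 'usable': the use must be allowed."""
    allowed = PURPOSE_POLICY.get(req.dataset, set())
    if req.purpose not in allowed:
        return False  # legal/policy test: this use was never authorized
    if not req.user_consented:
        return False  # ethical test: no consent, no access
    return True

# "Just because you can doesn't mean you should": profiling may be
# technically possible, but it is not an authorized purpose here.
print(is_fit_for_purpose(DataRequest("user_profiles", "psychographic_ads", True)))  # False
print(is_fit_for_purpose(DataRequest("user_profiles", "fraud_detection", True)))    # True
```

In the Cambridge Analytica case, the quality test effectively passed while the authorization and consent tests should have failed. That gap is exactly what a purely quality-focused definition of "fit for purpose" misses.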
Related: Facebook's Brand Is Becoming the Uber of Social Media, and That's Not a Good Thing
Once data is out there, you can't get it back.
At our user conference last year, I said that data is the only commodity that can be stolen from you while it remains in your possession and, unlike something physical, even if you catch the thief you can never be sure you got it back. That applies whether your data is stolen or given away. And like any commodity, the value of data is in what you make out of it. Data gets combined with other data to produce new data, which then gets redistributed elsewhere. In this case, Facebook's raw profile data was used to create psychographic profiles. Even if the raw data is recovered or deleted, those profiles still exist. Who owns them? Further, those profiles were used to make decisions, generate and target content, and take actions. Can those be unwound?
Data moves fast, and once the genie is out of the bottle, you can't put it back in.
Just as organizations need to consider the sources of their data to determine fit for purpose, they must also consider very carefully the uses to which they put their data, especially if the product of that data is shared with others. Some things cannot be "unshared."
Related: Why These People and Brands Are Fed Up With Facebook
You're responsible for your data, even if someone else is responsible for the breach.
Whether you misuse someone else's data or share data that then gets misused, you will be held responsible by those who entrusted that data to you. That may sound unreasonable, but it's the reality. Facebook's users don't care whether someone violated Facebook's terms of use. They care that their data was inappropriately harvested -- and they blame the organization that collected, curated and managed that data.
Last month, we heard about 112,000 passports and other ID records exposed on the web. These records, dating from 2009 to 2012, belonged to Bongo International, which FedEx bought in 2014 and shut down in 2017. Who's responsible for this breach? Bongo International? Even though the records predate the FedEx acquisition, the headline still reads "FedEx Customer Records Exposed."
If your organization deals with customer data, you are responsible for it in your customers' eyes, whether you collected that data yourself or acquired it from someone else. Think carefully about this as you plan your data strategy, and choose your partners wisely.
Related: Here's How to Check If Facebook 3rd-Party Apps Have Access to Your Personal Information
Governance is important.
Finally, here's a lesson we seem to have to re-learn every time something like this happens: Data governance is important. Really important. There are a lot of fancy definitions of data governance out there, but my definition is simple: Data governance means applying policy to data. It's having clear policy about how you source, manage and use data. It's about communicating that policy. It's about enforcing and auditing that policy.
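To show what "applying policy to data" can look like in practice, here's one more deliberately simplified Python sketch, again with hypothetical roles, datasets and policies, in which policy is enforced at the point of access and every decision is audited.

```python
# An illustrative sketch, assuming nothing about any particular product:
# "applying policy to data" as an enforced, audited access check.

import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("data-governance-audit")

# Hypothetical policy: which datasets each role may read.
ACCESS_POLICY = {
    "analyst": {"aggregated_metrics"},
    "support": {"customer_contacts"},
}

def read_dataset(role: str, dataset: str) -> str:
    """Enforce the policy, and audit every decision -- allow or deny."""
    decision = "ALLOW" if dataset in ACCESS_POLICY.get(role, set()) else "DENY"
    audit_log.info("%s role=%s dataset=%s decision=%s",
                   datetime.now(timezone.utc).isoformat(), role, dataset, decision)
    if decision == "DENY":
        raise PermissionError(f"{role} may not read {dataset}")
    return f"<contents of {dataset}>"

# Every access attempt leaves an audit trail, granted or not:
read_dataset("analyst", "aggregated_metrics")    # allowed, logged
# read_dataset("analyst", "customer_contacts")   # raises PermissionError, logged
```

The details will differ in any real system, but the shape is the point: the policy lives in one place, it is enforced rather than merely documented, and the audit trail records denials as well as grants.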
Many organizations think of data governance as a tax. But, that's the wrong way to think about it. Think about it as an insurance policy. You pay a small price now to avoid a catastrophe later. That's just good business.