“Data is the new oil” really took off as an idea in 2016, capturing the modern value of a business built around data collection: the value of its users, the demographics of its customers. We have framed data as hidden value we didn’t know we had, waiting for us to unlock it.
However, in light of the revelation that Facebook allowed Cambridge Analytica deep access to its treasure trove of data, explicitly to achieve ends that are deeply concerning for modern democracy and the health of the Internet at large, “data is the new oil” requires considerable re-examination.
For Facebook, this was the business model. It was always the business model. Allowing access to their data was fully consistent with their stated goals, with what they set out to do.
This was their intent, and the consequences belong solely to them.
So when we pursue “data is the new oil”, when we seek to unlock our own hidden value, how will we be different from Facebook? How can we learn from their example?
Data is powerful, offering meaning and insight never before accessible, but it comes with new concerns and requirements. An oil spill is a catastrophic consequence of oil’s use, and a necessary consideration of that use.
If we call data the new oil, we must also accept that data is as dangerous as oil. We must talk about data leaks and data spills, and treat these catastrophic events as much a consequence and consideration of using data as a spill is of using oil.
When we then pursue the value of data, we must build organisations and policies with this in mind: organisations that assume adversarial usage and intent, that believe the behaviour of Cambridge Analytica is not an aberration or an outlier but the norm. We must understand that Cambridge Analytica is the organisation that got caught, not the only organisation behaving this way.
We must believe that, without a robust organisation, without that consideration of adversarial intent, data spills are inevitable.
We must ask, what results do we want? Do we value the privacy of our customers and users? We must ask, if we truly value them and their privacy, will an agreement not to misbehave ever be enough?
We must ask these questions because there is no way to unleak the data, no way to clean up a data spill. Consequences, regardless of our intent or stated values and goals, will remain consequences.
To use data as our new oil, we must realise the value of data without creating the conditions for privacy breaches. We must consider the data spill and work tirelessly to prevent it. We must analyse our data ourselves, and offer only our own interpretations. We must build our organisations such that our people care about safety and privacy, such that the organisation promotes safety and privacy, and such that it makes violating safety and privacy difficult.
We must do more than say we care, we must build a system that cares, that will care when we are not watching.
And we may think that we are too small to notice, that we will not be a target, but there is no such thing as a target too small to notice. In this new world, the value of data grows when it is combined with other data, our data included.
Cambridge Analytica targeted a giant, and succeeded. We will be targeted as well. Will our data adversaries succeed against us?
Cambridge Analytica has shown us the future of “data is the new oil,” and the public concern over that future has shown us the questions that we must ask, the culture that we must strive for.
This is our future.
Are we ready for it?