Why We Must Expect It To Happen Here
In every part of the world, cybersecurity, data privacy, and digital ethics are very much on the agenda. I've noticed that the discussion has three main positions, and a plethora of standpoints in between.
First, there’s what I call the privacy first approach. Proponents of this take the view that privacy must be protected at all costs, even if this creates inconveniences, slows down progress, or hinders commercial value accumulation.
Second, there’s the commercial approach. This says that privacy should of course be respected, but that the development of new digital and physical products, new medicines and forms of treatment, and increased commercial growth in society all require access to data. Privacy should be respected mainly because it would be bad for business not to.
Third, there’s the common good approach. This, in turn, doesn’t talk of commercial gain, but of keeping society safe, or of progressing societal development. Yes, we need to protect privacy, it says, but if we have a choice between protecting privacy and protecting our citizens, by using data to investigate crimes, or to prevent acts of crime or terrorism, we must give the proper authorities access to the data they need. And if data collected and shared with the right people can help us progress society by taking steps against hate crime, child abuse, or even something as simple as improving traffic patterns, then that’s a worthy cause, too. The problem is that a lot of bad things have been done in the name of the common good. If 51 percent stand to gain from something, is it OK to sacrifice the other 49?
The societal standpoint is most likely to be adopted by a nation with a very trusting culture, somewhere where trust in authorities, including government, police, and military, is very high. Countries where a sense of societal justice and equality is held at a premium, and where people generally feel safe primarily because they believe that their authorities will protect them. When you argue against the risks of the societal approach, you’re often met with counterarguments such as “as long as it’s only the authorities that have access, I’m fine with it”, “if it helps keep us all safe, then I’m OK with it”, or “sure, this data could be abused, but that doesn’t mean it will be; maybe in some places, but it won’t happen here”.
The commercial approach is typically prevalent in countries and cultures where private enterprise is held at a premium. Where the market is considered a better custodian of people’s interests than authorities, and where wealth accumulation is an end-goal for most. Counter-argue the commercial approach, and typically you’re met with “but it will slow down/stop economic growth”, “it will be bad for business”, “the rest of the world would overtake us in the market”, or the slightly more altruistic “think of all the products you wouldn’t be getting”.
Finally, the privacy approach is the rarest of the three, at least on a national level (every country has its privacy proponents). Two countries that come to mind as places where online privacy is held in high regard are Germany and France. I was part of trying to introduce Google Street View to Germany, so I experienced the German privacy concern first-hand. As my German boss said to me, ‘the German people have had two systems of government in the past century that both committed atrocities against their people by using surveillance, so we’re naturally wary.’ And, you could argue, the French were invaded by one of those systems of government.
And that, my friends, is why if I ever have to hire a data privacy officer, I’ll want a German. Because both the commercial and in particular the societal approach have tremendous fallacies. The idea that privacy is somehow less important than profit is just plain wrong, and most of us have come to accept that. That’s why we have privacy first products like Tor and DuckDuckGo, and why the EU implemented its data protection legislation. But the societal perspective is actually equally dangerous. I live in Denmark, a country where trust in public institutions is extremely high, so I’ve heard the ‘yes, it’s my private data, but I trust our government not to abuse it’ argument many times. And my reply is always the same. Maybe you trust your current government, but what about the next one? Or the one after that? Once your data is collected and logged, it’s extremely hard, if not impossible, to unshare it. You cannot un-ring a bell.
We need to learn from countries where there are still people who remember the sound of boots on the stairs, of pounding on doors in the middle of the night. We cannot expect or hope that it doesn’t happen here. We need to learn from countries where the government cannot be trusted, and where commercial interests are not expected to match the best interests of the people. Not necessarily because that is our reality, but because it may become our reality. And when it comes to my data, I’d rather have control than trust.
You don’t wear a seat belt because a car crash is the most likely scenario every time you go for a drive. You wear a seat belt because in a worst case scenario, it will keep you alive. So we cannot plan our data privacy around a currently benevolent government, or a company that won’t share our data with third parties. All it takes is a surprise election outcome (and we’ve had a few of those lately, haven’t we?) or a change of board of directors or management, and the world can be a very different place.
A prime example of this is WhatsApp. WhatsApp was adopted by many users for its strict data privacy policies and end-to-end encryption. A safer alternative to text messaging, and a more private alternative to Facebook’s Messenger. And then Facebook went and bought WhatsApp. At first, the talking point was that WhatsApp would remain a separate entity and keep its strict privacy policies of no data sharing. A few years later, founders Jan Koum and Brian Acton left the company, both over spats with management over data privacy (the latter launched the Twitter tag #deletefacebook and left behind $850 million in unvested stock options). And now the app will soon be known as “WhatsApp by Facebook”, just like Instagram will soon be “Instagram by Facebook”. So it’s a pretty safe bet that your WhatsApp data is now a part of Facebook’s data pool on you, and will be part of the next Cambridge Analytica scandal. And personally, I have no idea what has happened with my WhatsApp data from before they were bought by Facebook. But my guess is that it’s already integrated into the Zuckotopia data pool. In hindsight, it would have been much better if WhatsApp had a device-only data storage policy (granted, then there’d be no back-up and no syncing across platforms).
We need to plan for the worst. Rules and good intentions about keeping our data safe and not abusing it are all well and good, but I much prefer that neither companies nor governments have my data in the first place, beyond the necessary minimum. It’s much easier to not abuse what you do not have.