Why privacy matters (even if you have nothing to hide)
The largest fine ever issued under privacy laws was levied against Amazon by Luxembourg's regulator, the CNPD. The complaint, made on behalf of 10,000 customers, focused on the manipulation of customers through targeted marketing based on their data; the final fine was 746 million euros ($888 million).
Privacy has been in the news a lot in the last few years, but:
- Does privacy mean anything to the person on the street?
- Does it matter if an organization shares your data with another organization?
- Or uses it to tailor your ‘web experience’ to make online life easier to navigate?
- Is it worth sharing personal data to create more informed relationships with companies you want to do business with or for healthcare reasons?
When I talk to friends and family about data privacy, these sorts of questions come up, and they are usually followed by a variant of "well, I have nothing to hide, so why should I care about data privacy?"
Here, I will try to give you some reasons to care and why privacy matters.
The concept of privacy is not something that came about because of recent data privacy regulations like the GDPR.
Privacy is an inherent quality of human interactions and relationships.
Looking back at the history of human beings, the notion of a ‘private life’ has shown up in the architecture of our cities, in political manifestos and, more recently, in our digital lives.
Privacy is not about hiding things away; it is about controlling how those things are used. In the past, those ‘things’ may have meant private spaces, like a home. In the context of the modern world, this extends to the data that represents us.
Back in 1985, before the internet became ubiquitous, a student named Larry Hunter wrote a chilling prophecy in Whole Earth Review:
Without any conspiratorial snooping or Big Brother antics, we may find our actions, our lifestyles and even our beliefs under increasing public scrutiny as we move into the information age.
Privacy is not security, but the two are intrinsically linked.
Having control over what happens with personal data is at the forefront of privacy-respectful data transactions, but it is much more nuanced than that. In a technological context, privacy is about lifecycle control of digital data by the owner of those data. Ownership of data is where it gets messy.
Do you own your passport data, or is it owned by the government that issued it?
Technology has added a layer of complexity to the visibility and ownership of digital data that can make simple concepts, such as the 'consent' that underlies data regulations globally, unachievable.
If privacy is about control, then ensuring that technology is imbued with functions that control privacy is foundational. However, technologies such as AI and the expanding network of disparate endpoints mean that control of personal data is increasingly difficult.
There are three key areas where privacy is being leaked, misused, or ignored.
A 2022 report by pCloud into invasive apps found that 52% of apps share personal data with third parties, Instagram being the worst offender with 79% of the data collected by the app being shared.
The era of creepy tech has surpassed itself; these data are shared with companies such as Hootsuite and BuzzSumo, who then use them to target marketing at online users.
Image source – pcloud.com
If you provide data to a company or organization, you should expect that any use of that data is under your control.
Consent is a fundamental tenet of privacy.
Collecting consent to process data develops a respectful relationship and engages the data owner in the process. Many data regulations across the world use consent as a legal basis for privacy. But consent is not always delivered.
A 2022 study from Macquarie University published in the British Medical Journal (BMJ) found that 88% of the 20,991 mHealth apps allowed access to personal data.
In 87% of cases, the data were collected on behalf of third-party services such as advertisers and tracking providers; informed consent was lacking in many of the apps.
Image source – bmj.com
The mix of 'government and privacy' is highly contentious. The now infamous case of the National Security Agency (NSA) and the whistle-blower Edward Snowden raised the profile of digital privacy. Snowden released files that revealed the NSA was purposely collecting citizen data via a program called PRISM.
These data were collected without the knowledge of US citizens. But even this outing of government snooping has not stopped government entities from using technology to snoop on citizens.
In the UK, a controversial law known as the "Snoopers' Charter" has led to an investigation into two unnamed internet companies that allegedly carried out tests on a surveillance tool collecting metadata on an individual’s online life. The data under observation can include an individual's political views and health.
Tech giants, like Facebook, have been embroiled in government privacy debacles. The Facebook/Cambridge Analytica incident became the poster child for how a government can use everyday technology to turn “data into votes”: a government using personal data to target citizens with marketing in the same way that commercial organizations do.
Image source – theguardian.com
Facial recognition (FR) is a technology that is stoking privacy concerns across the world. Governments use FR for a variety of purposes, from border control to law enforcement.
A 2022 survey from the United States Government Accountability Office (GAO) found that almost half of the federal agencies that employ law enforcement officers use facial recognition systems, either bespoke or provided by vendors such as Clearview AI and Amazon Rekognition.
Worryingly, the report points out that several agencies don’t know how their facial recognition technology is used or the extent to which third parties are collecting and processing the data.
Once our data is in the hands of a technology firm, a government can also make requests for these data.
According to research from Surfshark, between 2013 and 2020 there were 3,067,228 government data requests made to Facebook, Apple, Google, and Microsoft. Governments also use ‘gag orders’ to prevent individuals from knowing who has requested their data. Even with the best efforts, technology providers may be required under law to provide these data.
Image source – surfshark.com
The Covid-19 pandemic brought privacy discussions along with healthcare concerns. The development of track-and-trace apps brought to light the problem of centralized data collection leading to potential government surveillance.
In the UK, the original centralized app was quickly replaced by a decentralized version when privacy issues surfaced. Finding the balance between managing a pandemic and maintaining privacy is one of the key issues in the privacy debate.
Poor privacy deployment and design led to the exposure of the personal data of around 540 million Facebook users in 2022.
The problem lay in a poorly thought-out feature called “Contact Importer”. Profiles set to “public” or “share with friends” that also allowed lookup by phone number could be screen-scraped to steal data.
Technology cannot add privacy as an afterthought. Privacy is a whole-system consideration. This has been translated into the seven principles of “Privacy by Design” by Ann Cavoukian, the former privacy commissioner for Ontario. As apps and systems are scoped, designed, and developed, privacy must be part of that process; otherwise, it will not be functional in the final product.
Image source – iapp.org
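One of Cavoukian's principles is "privacy as the default setting": the user should not have to act to protect their data. A minimal sketch of what that looks like in code, assuming a hypothetical user-profile model (the class and field names here are invented for illustration, not part of Cavoukian's framework):

```python
from dataclasses import dataclass

# Hypothetical sketch: privacy by design means the most private
# option is the default, and the user must opt IN to any sharing.
@dataclass
class UserProfile:
    name: str
    email: str
    # Privacy-respecting defaults: nothing is shared unless the
    # user explicitly changes these settings.
    profile_public: bool = False
    allow_phone_lookup: bool = False
    share_with_advertisers: bool = False

    def public_view(self) -> dict:
        """Return only the fields the user has agreed to expose."""
        if not self.profile_public:
            return {}
        return {"name": self.name}  # email is never exposed

profile = UserProfile(name="Alice", email="alice@example.com")
print(profile.public_view())  # empty until the user opts in
```

Contrast this with the Contact Importer case above, where lookup by phone number was effectively available by default and had to be switched off.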
One of the fundamental areas where privacy issues slip in is the coding process. Insecure coding leads to software flaws that provide a way for data to leak or be exploited.
In 2020, the number of common vulnerabilities and exposures (CVEs) in code increased by 5.6%, and 37 billion data records were exposed. Many of these data were personal and sensitive.
In the six months to July 2021, the number of CVEs was already more than half of the 2020 figure.
With CVEs increasing, we should expect more data breaches of the same scale.
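To make the link between coding flaws and privacy leaks concrete, here is a minimal, hypothetical example: an API handler that returns a whole database record to the caller, and a data-minimised fix that uses an explicit allow-list. The record and function names are invented for illustration.

```python
# Hypothetical illustration of a common privacy flaw: returning an
# entire stored record to the client instead of only the fields needed.
user_record = {
    "id": 42,
    "username": "alice",
    "email": "alice@example.com",   # personal data
    "password_hash": "redacted",    # should never leave the server
    "date_of_birth": "1990-01-01",  # sensitive data
}

def get_user_insecure(record: dict) -> dict:
    # Flaw: leaks every field, including sensitive ones.
    return record

def get_user_minimised(record: dict) -> dict:
    # Fix: an explicit allow-list of fields the caller may see.
    allowed = {"id", "username"}
    return {k: v for k, v in record.items() if k in allowed}

print(get_user_minimised(user_record))  # {'id': 42, 'username': 'alice'}
```

The allow-list approach fails closed: a new sensitive field added to the record later stays private unless a developer deliberately exposes it.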
The problem with privacy is that it requires attention across the entire system. Amongst the many variables that impact good privacy are:
- The proliferation of devices (including the IoT): In 2022, there are around eight billion mobile subscriptions. The number of IoT devices (including industrial IoT) is likely to reach around 37 billion by 2025. Data flows across all of these devices, and as the numbers grow, so does the probability of a data leak or misuse of data.
- Visibility of data: All these devices lead to data flowing more freely; data is out there, but how do you contain it if you don't know where it is? Visibility across disparate endpoints is a major factor in enforcing consent and data privacy.
- Appetite and understanding: Putting structures in place to attain privacy is a financial burden, and one that needs a nuanced understanding at the organizational level. Privacy is a discussion that must run from the board level down.
- Insecure coding practices: Poor privacy by design leads to inherent privacy issues in a system. Like a Russian doll, one error opens onto another, often exacerbating the flaw as the system is used.
The numbers speak for themselves. A 2022 study from Entrust on consumers' views on privacy had some interesting findings:
- 79% of consumers are “somewhat concerned” about data privacy.
- 64% said their concern or awareness about data privacy has increased over the past 12 months.
- The reason for these concerns in 60% of consumers was due to security breaches, with 48% being worried because of targeted ads based on their online behavior.
- Only 21% of consumers trust established global brands to keep their personal information secure.
Where once people might have said, "I have nothing to hide, so why should I care about privacy?", research now shows that people do care.
Apps that leak data or have less-than-respectful privacy practices are no longer tolerated. The WhatsApp privacy update is a recent example: WhatsApp was held to account for privacy changes that may have allowed its parent company, Facebook, to access the personal data of WhatsApp users.
Privacy matters. It is important because it reflects the trust level of relationships. It is important because it helps to protect against the misuse of data. It is important because good privacy is the foundation of the right to control your digital life.
Five ways that you can help improve your own privacy:
- Use a VPN and private browsing: browsers such as Chrome collect browsing data and use it to market to you. Use a private browser, and for additional protection, use a VPN.
- Set social media privacy: make sure your social media account settings are configured to give you as much privacy as possible.
- Don’t overshare data: if you don't need to give information when completing forms or on social media posts, etc., then don't.
- Check mobile permissions: some apps can access contacts and photos on your mobile. Make sure the permissions are set to off if you do not wish to share.
- Avoid using public Wi-Fi: public Wi-Fi does not protect data traffic and can be used to steal credentials and other sensitive data. Avoid public Wi-Fi and/or use a VPN to encrypt web traffic.