What does it mean to have privacy today, and how is it understood in a world where its meaning is increasingly uncertain? A century ago, privacy was largely associated with physical boundaries (the walls of a home or the closing of a door). Today, it is shaped by invisible data trails, predictive algorithms, and surveillance systems embedded in everyday life. Individuals continue to protect their physical spaces and personal relationships, yet their personal information circulates freely across platforms, applications, and data brokers they may never encounter.

This disconnect raises a fundamental question: when privacy is valued, what exactly is being protected?

Scholars have long argued that privacy cannot be reduced to a single definition. Westin framed it as control over personal information¹, while Warren and Brandeis famously described it as the "right to be let alone"². More recent accounts, such as Solove's, emphasise that privacy includes confidentiality, data security, appropriate use, and protection against harmful information processing³. Marx further argues that privacy and surveillance are not opposites but intertwined systems of information control that can either protect or endanger individuals depending on context⁴. Regan extends this view by emphasising that privacy is not merely an individual preference but a collective social value that shapes the conditions for autonomy and democratic participation⁵.

The importance of privacy is further underscored by its legal status. It is recognised as a fundamental human right under Articles 7 and 8 of the EU Charter of Fundamental Rights⁶ and within the United Nations' human rights framework. Yet a clear contradiction remains: while privacy is strongly protected in law, modern technologies and data-driven economic models continue to erode it in practice.

As Zuboff argues, contemporary digital systems extract behavioural data and transform it into predictive products, turning users into sources of raw material within what she terms "surveillance capitalism"⁷. Privacy therefore sits at the centre of a broader struggle between convenience and control, individual autonomy and corporate power.

What's Actually Driving Privacy Erosion?

The value of personal data

Why has personal data become so valuable? According to Zuboff, in the early 2000s many forms of residual data (e.g. spelling mistakes, typing speed, time spent on websites) were dismissed as "data exhaust": incidental by-products of online activity, stored passively and assumed to have little meaningful use⁷.

This perception began to change when Google's founders realised that large volumes of residual user data could be transformed into valuable insight. Rather than viewing behavioural traces as meaningless waste, Google analysed patterns in search queries and user interactions to refine its ranking algorithms, creating feedback loops in which the search engine learned directly from human behaviour⁷.

In the twenty-first century, this logic spread across the digital economy, leading some to argue that data has become more valuable than oil. Unlike oil, which is finite, personal data is continuously generated through everyday digital interactions⁸. Zuboff describes the extraction of data beyond what is needed to improve a service as "behavioural surplus": data that contains predictive insight exploited for profit, with individuals treated not as the product themselves but as sources of raw material⁷.

Although companies often justify the collection of behavioural surplus as a way to improve services, both the derived outputs and the raw data are frequently monetised through sale to third parties for purposes unrelated to the original service, generally without the knowledge of the individuals involved⁹. The Cambridge Analytica case illustrates this dynamic: data from millions of Facebook users was harvested via a third-party application and shared with external actors to construct psychological profiles for targeted political advertising, without most users providing informed consent¹⁰.

The behavioural surplus model helps explain how many digital platforms can present themselves as "free" while remaining highly profitable.

Can individuals make good decisions about privacy?

Even if many users are at least partially aware that their data is being collected, why do they continue to give up their privacy so readily?

Pokémon Go offers a useful case study. The app, which became a global phenomenon shortly after its release, presents itself as an augmented-reality game in which users move through the real world to collect virtual creatures and earn rewards¹¹. While the game appeared harmless, Zuboff argues that it functioned as a large-scale behavioural experiment. Businesses could pay to place lure modules near their premises, effectively purchasing increased footfall in addition to collecting data. From this perspective, Pokémon Go did not simply predict where players might go; it actively shaped their movement⁷.

However, this interpretation can be challenged for underestimating user agency. Participation remained voluntary, and many users were motivated by enjoyment, social interaction, and physical activity rather than ignorance or coercion. The game can therefore be understood as a reciprocal exchange in which users knowingly trade some degree of data and attention for entertainment and free access. Privacy erosion does not always result from hidden exploitation; it can emerge from conscious trade-offs in which immediate benefits are prioritised over abstract future risks.

TikTok offers a further illustration. It is widely understood that the platform collects extensive user data to train its recommendation systems, yet this awareness has done little to limit its popularity¹². Users are drawn to instant entertainment and social connection, while long-term risks remain distant. Social pressure reinforces this pattern, particularly when platforms become dominant and participation begins to feel necessary rather than optional. In some cases, refusing consent limits functionality or access altogether, turning consent into a condition of participation in digital life¹³.

Alongside this, individuals experience what is often described as "consent fatigue": confronted repeatedly with privacy notices and cookie banners that are complex and difficult to understand, many simply click "accept" in order to access the service¹⁴.

Some individuals appear indifferent to privacy concerns entirely, feeling it is already "too late" to protect their data¹⁵. As Solove argues, this often reflects a narrow understanding of privacy as complete secrecy. Privacy is not about hiding information but about how personal data is handled, including whether it is used appropriately and protected from harm¹⁶.

Taken together, these dynamics suggest that while individuals can make privacy decisions that reflect their immediate interests, those decisions are often constrained by limited understanding, behavioural nudging, and structural pressures that undermine genuinely informed choice.

Is it possible to maintain privacy today?

Given the multiple definitions of privacy, there is no simple answer.

As Regan argues, privacy is not limited to individual control over personal information; it shapes the conditions under which individuals exercise autonomy and participate in social and political life⁵. The digital economy, however, places structural limits on such control. Many online services rely on large-scale data extraction, profiling, and behavioural prediction, making consent frequently procedural rather than meaningful. Refusing data collection can lead to exclusion from essential social, professional, or civic platforms¹⁷. Moreover, many core social functions, such as medical care, depend on the collection and processing of personal information¹⁸.

A further challenge arises in the trade-off between privacy and security, where surveillance is often justified as necessary for protection. As Marx argues, privacy and surveillance are interconnected systems of information control that can either safeguard individuals or place them at risk, depending on who holds power¹⁹. Practices such as security cameras in public spaces or identity checks at airports are rarely questioned, as their legitimacy rests on assumptions of proportionality, purpose limitation, and institutional trust²⁰.

There are also cases in which governance design delivers clear public benefits while limiting privacy risks. Transport for London, for example, uses free Wi-Fi to analyse aggregated and depersonalised data on passenger movement, enabling congestion monitoring and improved journey planning. Because the data is processed in a way that prevents individual identification, useful insights can be generated without continuous or personalised surveillance²¹.
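The depersonalised-aggregation idea behind the TfL example can be sketched in a few lines of Python. This is a minimal illustration, not TfL's actual pipeline: the identifiers, station names, salt, and suppression threshold below are all hypothetical. Device identifiers are replaced with salted one-way hashes so raw identifiers never leave the collector, counts are aggregated per station and hour, and any cell with too few devices to hide an individual is suppressed.

```python
import hashlib

# Hypothetical raw records: (device identifier, station, hour of day).
RAW_RECORDS = [
    ("aa:bb:cc:01", "Oxford Circus", 8),
    ("aa:bb:cc:02", "Oxford Circus", 8),
    ("aa:bb:cc:03", "Oxford Circus", 8),
    ("aa:bb:cc:04", "Bank", 8),
]

SALT = b"rotate-this-secret-daily"  # rotating the salt prevents long-term tracking
MIN_COUNT = 3  # suppress cells too small to conceal an individual

def pseudonymise(device_id: str) -> str:
    """One-way salted hash: the raw identifier is never stored downstream."""
    return hashlib.sha256(SALT + device_id.encode()).hexdigest()[:12]

def aggregate(records):
    """Count distinct pseudonymous devices per (station, hour),
    dropping any cell below the suppression threshold."""
    seen = {}
    for device, station, hour in records:
        seen.setdefault((station, hour), set()).add(pseudonymise(device))
    return {
        cell: len(devices)
        for cell, devices in seen.items()
        if len(devices) >= MIN_COUNT
    }

print(aggregate(RAW_RECORDS))
# The Bank cell (one device) is suppressed; only the Oxford Circus count survives.
```

The useful signal (congestion per station per hour) is preserved while the output contains no row that describes a single identifiable person.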

Where technical design fails to constrain data collection, responsibility shifts onto individuals, making education a critical component of privacy protection. Although GDPR grants enforceable rights to access, correct, and in some cases delete personal data²², these protections have limited impact if individuals are unaware of them or unable to exercise them effectively.

Under these conditions, the central question is no longer whether privacy can be fully maintained, but whether societies are willing to define and enforce the boundaries that keep it functionally relevant.

Does Regulation Actually Work?

Why do we need laws to protect privacy in the first place? As the previous sections have shown, personal data has become highly profitable. When companies generate revenue by collecting and analysing personal information, there is little incentive to limit these practices voluntarily²³. Privacy protection therefore becomes a form of market failure: what is profitable for firms often conflicts with what is best for users²⁴.

This problem is intensified by a structural power imbalance between individuals and corporations. Consent alone provides a weak foundation for privacy protection²⁵ ²⁶. Legal rules and regulation are therefore required to set boundaries, limit abuse, and rebalance power.

The EU's GDPR is the most prominent example of such regulation, and similar approaches have emerged in the United Kingdom and, to a more limited extent, in the United States²⁷. However, these frameworks also impose significant compliance burdens. Survey evidence from PwC suggests that most global companies incur GDPR compliance costs exceeding $1 million annually, with many spending over $10 million²⁸. In some cases, these costs are passed on to consumers or make it harder for smaller firms to enter the market.

Concern over privacy is not confined to Western democracies. In China, data protection law places clear limits on how companies collect and use personal information, while granting the state a strong role in enforcement. Under the Personal Information Protection Law, violations can result in fines of up to 5% of annual turnover or RMB 50 million²⁹. Although this model differs from liberal democratic approaches, it reflects a shared recognition that unchecked data collection creates power imbalances requiring legal constraints.

Despite their ambition, privacy laws often struggle to deliver meaningful protection in practice. Regulations such as GDPR promise accountability and individual rights, yet their effectiveness is limited by uneven enforcement, under-resourced regulators, and practical barriers to compliance. Legal uncertainty further complicates matters: key concepts such as "legitimate interest" or "undue delay" remain open to interpretation, often requiring costly legal advice and producing inconsistent outcomes³⁰. These challenges are compounded by cross-border data flows and the rapid development of technologies such as AI, which create enforcement gaps and regulatory lag³¹ ³². Smaller firms often struggle with these demands due to limited resources, while larger companies are better positioned to absorb regulatory costs and navigate complex legal environments strategically³³.

Taken together, these factors suggest that while regulation is necessary, it is insufficient on its own.

Conclusion

Protecting privacy is less about stopping data collection altogether and more about setting clear rules for how data is used, for what purposes, and within what limits. The main challenge is deciding where the line between protection and intrusion should be drawn, and who has the power to define and enforce that line in societies that increasingly rely on data.

Laws, enforcement, international cooperation, technical safeguards, and education all play important roles, but none of them is effective on its own. Privacy protection should therefore be understood as an ongoing effort to manage power between individuals, governments, and data-driven organisations, rather than as a single rule that can solve the problem.

Instead of aiming for complete privacy (which is unrealistic), a more practical goal could be data minimisation: limiting data practices to what is necessary and proportionate. This can be supported through techniques such as data sanitisation³⁴, end-to-end encrypted communication³⁵, and on-device data processing³⁶, all of which reduce the amount of personal data that is stored, exposed, or shared. Another approach is to treat personal data as something of economic value, where individuals are compensated for its use.
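Data minimisation can be made concrete with a short sketch. The event below and its field names are hypothetical, chosen only to illustrate the principle: direct identifiers are dropped entirely, the IP address is truncated to its network prefix so no single host is identifiable, and the timestamp is coarsened to the hour, which is sufficient for traffic-trend analysis.

```python
import ipaddress
from datetime import datetime

# Hypothetical raw event as a service might receive it.
raw_event = {
    "user_email": "alice@example.com",
    "ip": "203.0.113.42",
    "timestamp": "2024-05-01T09:17:33",
    "page": "/pricing",
    "device_fingerprint": "f8a2c1",
}

# Fields the analytics task actually needs; everything else is discarded.
KEEP = {"ip", "timestamp", "page"}

def minimise(event: dict) -> dict:
    out = {k: v for k, v in event.items() if k in KEEP}
    # Truncate the IP to its /24 network: coarse location, no single host.
    net = ipaddress.ip_network(out["ip"] + "/24", strict=False)
    out["ip"] = str(net.network_address)
    # Coarsen the timestamp to the hour: enough to see traffic trends.
    ts = datetime.fromisoformat(out["timestamp"])
    out["timestamp"] = ts.replace(minute=0, second=0).isoformat()
    return out

print(minimise(raw_event))
```

The design choice is that minimisation happens at the point of collection, before storage, so the detailed data never exists to be breached, sold, or subpoenaed.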

Whatever approach is taken, transparency remains a basic requirement. Without it, consent loses its meaning, power imbalances remain, and privacy risks becoming something that exists only in law rather than in everyday life.

References

  1. Judith DeCew. Privacy (Stanford Encyclopedia of Philosophy). 2018. https://plato.stanford.edu/entries/privacy/
  2. Warren and Brandeis. The Right to Privacy. 1890. https://groups.csail.mit.edu/mac/classes/6.805/articles/privacy/Privacy_brand_warr2.html
  3. Daniel J. Solove. "A Taxonomy of Privacy." University of Pennsylvania Law Review 154.3 (2006), pp. 477–564.
  4. Gary T. Marx. "Surveillance Studies." International Encyclopedia of the Social & Behavioral Sciences, 2nd ed. Oxford: Elsevier, 2015, pp. 733–741.
  5. Priscilla Regan. Privacy and the Common Good: Revisited. Cambridge: Cambridge University Press, 2015.
  6. Gloria González Fuster. Study on the Essence of the Fundamental Rights to Privacy and to the Protection of Personal Data. EDPS, Nov. 2023.
  7. Shoshana Zuboff. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books, 2019.
  8. Dr Beenish Shakeel. Is Your Personal Data Worth More Than Oil in Today's Economy? Medium, Aug. 2025.
  9. Shoshana Zuboff. Surveillance Capitalism and the Challenge of Collective Action. New Labor Forum, Jan. 2019.
  10. BBC News. "Cambridge Analytica: Facebook boss summoned over data claims." Mar. 2018.
  11. Web Wise. What is Pokemon Go? Explainer Guide. July 2016.
  12. Aparajita Bhandari and Sara Bimo. "Why's Everyone on TikTok Now?" Social Media + Society 8.1 (Mar. 2022).
  13. Anja Stevic. "Under Pressure? Longitudinal Relationships between Different Types of Social Media Use, Digital Pressure, and Life Satisfaction." Social Media + Society 10.1 (Jan. 2024).
  14. Safna. Consent Fatigue: What Is It and What Should Businesses Do? CookieYes, June 2025.
  15. Lauren Hendrickson. Digital Identity Myths You Probably Still Believe. Identity.com, July 2025.
  16. Daniel J. Solove. "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy." San Diego Law Review 44.3 (2007), pp. 745–772.
  17. Jonathan Zong and J. Nathan Matias. "Data Refusal from Below." ACM Journal on Responsible Computing 1.1 (Mar. 2024).
  18. Utrecht University. Data Privacy or Data Protection?
  19. Gary T. Marx. The Kaleidoscope of Privacy and Surveillance. 2015.
  20. Kevin Macnish. "An Eye for an Eye: Proportionality and Surveillance." Ethical Theory and Moral Practice 18.3 (Aug. 2014).
  21. TfL. TfL to use depersonalised Wi-Fi data collection technology. TfL Press Release, 2019.
  22. Intersoft Consulting. Art. 17 GDPR — Right to erasure ("right to be forgotten"). GDPR Info, 2013.
  23. Kieren Williams. How much your data is worth — and how to stop people profiting from it. Sky News, Dec. 2025.
  24. American Antitrust Institute. Privacy & Antitrust at the Crossroads of Big Tech. 2021.
  25. Simon Gompertz. "Customers 'bewildered and fearful' about use of their data." BBC News, Sept. 2016.
  26. Josh Fuqua. The Illusion of Consent: Rethinking Privacy Online. GSU Law Review, Apr. 2025.
  27. DLA Piper. Data Protection Laws in the United Kingdom. Data Protection Laws of the World, 2024.
  28. PwC. A privacy reset — from compliance to trust-building. 2021.
  29. PricewaterhouseCoopers. Privacy: Searching for guidance in China's legal landscape. PwC China Compass, 2021.
  30. Helena Vieira. Why is GDPR compliance still so difficult? LSE Business Review, July 2025.
  31. OECD and Korea Development Institute. Case Studies on the Regulatory Challenges Raised by Innovation and the Regulatory Responses. OECD Publishing, 2021.
  32. Ishmeet Matta. The Role of Regulatory Bodies in AI Governance and Oversight. Sogeti Labs, Oct. 2024.
  33. FreshFields. Rising risks and shifting rules for international data transfers. 2026.
  34. NCSC. Secure sanitisation of storage media. Sept. 2016.
  35. IBM. End-to-end encryption. Sept. 2021.
  36. EDPS. On-device artificial intelligence. Sept. 2025.