Surveillance in Augmented Reality: Snapchat, Zuboff, and the Ethics of the AR Gaze

Essay · December 2024 · SUST 5100: Ethics and Technology, Schulich School of Business

Snapchat’s Augmented Reality tools — Lens Studio and Spectacles — represent a category of innovation that deserves far more scrutiny than it receives. These tools are marketed as instruments of creativity and self-expression. Their actual function is something else: the systematic extraction of biometric, behavioural, and location-based data in service of Snap Inc.’s advertising business.

This essay argues that Snapchat’s AR tools are not neutral innovations. They are instruments of surveillance capitalism — a term Shoshana Zuboff uses to describe the exploitation of human experience as raw material for data extraction, behavioural prediction, and commercial profit. Drawing on Zuboff’s framework and Naomi Klein’s critique of commodified identity, I examine how AR technologies deepen asymmetries of power, erode individual autonomy, and prioritize profit over ethics.

The Technology and Its Ethical Stakes

Lens Studio is a creator platform where developers design AR filters that interact with users’ facial features, movements, and behaviours. Spectacles are AR-enabled smart glasses that capture video and audio while integrating digital overlays into physical environments. Both require continuous access to biometric and geolocation data to function.

Snapchat markets these tools as promoting creativity and real-time engagement. What they actually do is extract significant quantities of user data — biometric facial scans, emotional responses to AR filters, usage frequency, geospatial movements — and feed that data into predictive algorithms that drive targeted advertising. Users’ experiences fuel the system. The experience of creativity is the product being sold.

Three ethical questions follow. How does Snapchat's AR technology operationalize surveillance capitalism? Is user consent meaningful when data extraction is obscured by design? And are Snap Inc.'s mitigation efforts substantive or merely performative?

Behavioural Surplus and the AR Lens

Zuboff's central concept — behavioural surplus, the data exhaust of user activity that exceeds what is needed to deliver a service and is instead claimed for prediction and profit — is key here. AR lenses require facial biometric scans to function. Those scans are stored, analyzed, and repurposed to generate predictive insights that go well beyond the filter itself. Spectacles, with their recording capabilities, capture location-based data and real-time behaviours, producing an even richer data pool. The asymmetry is the point: corporations hold exclusive control over data flows, while users remain unaware of the true scope of what is being extracted.

Klein’s critique of technological commodification deepens this analysis. Modern technology doesn’t just commodify behaviour — it commodifies identity. Snapchat’s AR tools blend personal identity with corporate branding through sponsored lenses and filters. When a user applies a Nike or Coca-Cola filter, their face becomes an advertising canvas. Self-expression becomes a vehicle for corporate gain. And the psychological implications are real: AR filters that perpetuate unattainable beauty standards contribute to body dysmorphia and mental health struggles, particularly among young users.

The Performative Safety Response

Snap Inc. has introduced tools like Family Center and content reporting mechanisms. These address surface-level risks — harmful content, illegal imagery — but not the systemic issue. The core ethical problem is not the content users see. It is the extraction of personal data to fuel engagement-driven business models. Snap’s mitigation efforts leave that untouched.

Zuboff identifies this as a hallmark of surveillance capitalism: a strategy designed to maintain public trust while ensuring exploitative practices remain unchallenged. The rapid expansion of Snapchat’s AR features — with engagement metrics as the primary design priority — reinforces this critique. Users, particularly teenagers, are drawn into immersive AR experiences with little awareness of the underlying trade-off: participation in a system that commodifies their behaviour, identity, and privacy.

What Real Solutions Require

Remedies must go beyond the performative. Four directions matter.

Transparent data ownership. Users should have clear visibility into how their AR data is collected, stored, and monetized. Consent mechanisms must require explicit, real-time user approval before any data is extracted or shared — not buried in terms of service designed to be ignored.

Independent algorithmic auditing. Third-party audits of Snapchat’s predictive systems can ensure compliance with ethical standards and prevent the misuse of biometric and behavioural data. Accountability requires external scrutiny, not just internal policy.

Ethical AR design. AR features should be designed in collaboration with psychologists and ethicists, not just engineers optimizing for engagement. Filters that promote positive self-image are possible. The choice not to prioritize them is a design decision, not a technical constraint.

Regulatory safeguards. Governments need data privacy laws specific to AR technologies — laws that establish what biometric data can and cannot be collected, stored, and used commercially. The current regulatory environment has not kept pace with the capabilities of these tools.

Conclusion

Snapchat’s AR tools are a useful case study precisely because they are presented as benign — fun, creative, generative. That framing obscures what they actually are: infrastructure for surveillance capitalism. The ethical concerns are not hypothetical. The power asymmetries are real, the data extraction is active, and the mitigation efforts are insufficient.

Real solutions require a paradigm shift: transparency, data ownership, ethical design, and regulatory oversight — not as PR strategy, but as structural requirement.


Written for SUST 5100: Ethics and Technology, Schulich School of Business, York University. December 2024.
