Ana Catarina De Alencar
Technology Law | AI Governance & Compliance | IT, SaaS & Commercial Contracts | Data Protection (GDPR, EU AI Act) | Intellectual Property
May 12, 2025
Imagine a world where staying alive depends on your subscription plan. An existential “freemium” model, where health, consciousness, and the most essential aspects of life are mediated by digital platforms. This is the premise of Common People, the first episode of the seventh season of Black Mirror. Although fictional, the dystopia is anchored in observable dynamics of contemporary relationships between technology, markets, and subjectivity.
The narrative follows Amanda, a schoolteacher diagnosed with a brain tumor, whose consciousness is transferred to the servers of a corporation called Rivermind. Her husband, Mike, accepts the offer of an innovative technology that promises to extend her life in a digital environment. But this extension depends entirely on the family’s financial ability to maintain a monthly subscription. Rivermind offers the service in tiers: basic, plus, and premium, each granting a different level of access and imposing its own restrictions. Life begins to operate under the logic of consumption and service segmentation, a logic the brief sketch below tries to make concrete.
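To show how banal that segmentation really is, here is a minimal, purely hypothetical sketch (in TypeScript) of the kind of tier-based feature gating such a service would run on. The tier names follow the episode, but the specific features, prices, and the entitlement helper are illustrative assumptions, not details from the show.

```typescript
// Purely illustrative sketch of tier-based feature gating in a subscription
// service. Tier names echo the episode; the features, prices, and the
// entitlement helper are hypothetical assumptions, not canon.

type Tier = "basic" | "plus" | "premium";

interface TierPolicy {
  monthlyFee: number;         // illustrative pricing
  adFree: boolean;            // whether sponsored content is injected
  coverageRadiusKm: number;   // how far from the server the service keeps working
  wakingHoursPerDay: number;  // how long the user stays "awake"
}

const POLICIES: Record<Tier, TierPolicy> = {
  basic:   { monthlyFee: 300, adFree: false, coverageRadiusKm: 80,       wakingHoursPerDay: 16 },
  plus:    { monthlyFee: 500, adFree: true,  coverageRadiusKm: 800,      wakingHoursPerDay: 20 },
  premium: { monthlyFee: 800, adFree: true,  coverageRadiusKm: Infinity, wakingHoursPerDay: 24 },
};

// The gating logic itself is trivial: what the user may do is a direct
// function of the tier purchased and whether the last invoice was paid.
function entitlement(tier: Tier, paidThisMonth: boolean): TierPolicy | "suspended" {
  return paidThisMonth ? POLICIES[tier] : "suspended";
}

console.log(entitlement("basic", true));    // ads on, limited range, limited waking hours
console.log(entitlement("premium", false)); // "suspended": miss a payment, lose everything
```

The point of the sketch is how little stands between “premium” and “suspended”: access to the most basic functions of a life is reduced to a boolean attached to a payment record.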
The episode’s central concern is not merely the existence of the technology itself, but how it is implemented within an exploitative business model. Rather than representing an innovation aimed at social well-being, the solution provided by Rivermind symbolizes a platform project that commodifies consciousness itself. The dematerialization of subjectivity is accompanied by its recoding into a monetizable asset.
The platformization of life
There’s no denying that today’s technology makes our lives easier, but what are the risks of relying completely on technology for our daily existence? And when we become the product of that technology, how does that impact our rights as humans?
The concept of the “platformization of life” helps make sense of this phenomenon. More than mere technical intermediaries, platforms become socio-economic infrastructures that shape our ways of living. By concentrating access, data, and essential services, they begin to replace classical institutional structures, both in practice and in the way we perceive them.
Imagine this: you go to a birthday party and, when it’s time to open the presents, a teenage girl begins to present each gift as if she were in a TikTok livestream. She places her palm behind each item and displays it in slow motion to the guests, mimicking influencer behavior. Someone in the room, puzzled by the scene, asks: “Are we in a live stream right now?” The teenager doesn’t understand the question. For her, TikTok has unconsciously shaped her behavior to the point where acting as if she were inside a platform feels completely normal. Life is being platformized moment by moment, as our subjectivity is gradually captured and molded by technology.
An institutional void: where rights go to die
In the episode, no regulatory mechanism, legal mediation, or institutional guarantee is anywhere to be seen. The asymmetry between user and service is total, and the judicial system is entirely absent. This institutional void suggests a scenario of hyper-privatization, in which even the most basic rights, such as healthcare, privacy, and cognitive liberty, are subject to adhesion contracts and opaque algorithms.
In the show, no character turns to lawyers, associations, or courts. Rivermind operates under a regime of absolute power, with no accountability. This critique mirrors certain aspects of contemporary reality, particularly in the United States, where healthcare plans are notoriously expensive, exclusionary, and complex, often leaving citizens indebted or forced to mortgage their assets to access essential treatments.
In this institutional void, the company infringes even further on Amanda’s rights. Her progressive loss of autonomy is illustrated poignantly when she begins to involuntarily insert adverts into her speech. This represents a symbolic and cognitive violation, raising serious questions about the boundaries between advertising, subjectivity, and neuro-rights. Neuro-rights form an emerging field that seeks to protect mental integrity and cerebral privacy in the context of neural technologies and brain-computer interfaces. In Amanda’s case, her digitized consciousness is invaded by commercial stimuli she cannot control, turning her mind into ad space. The absence of consent and legal redress highlights the urgent need to design regulations specific to the era of cognitive technologies.
The slippery slope from vulnerability to exploitation
As the story continues, the situation only worsens when Mike, in an effort to keep Amanda’s subscription active, begins livestreaming himself on an “extreme entertainment” platform, performing self-degrading acts for an anonymous audience. This phenomenon reflects real-world practices seen on digital platforms, such as “NPC livestreams” or broadcasts involving extreme eating or sexual behavior.
Platforms often fail to intervene promptly against such harmful content, and viewers frequently respond with amusement rather than concern. Some argue that these livestreams and other forms of online activity offer financial relief to people in vulnerable situations: many women, for instance, report having transformed their financial lives for the better through platforms like OnlyFans, where they share intimate content.
Yet these activities, framed as opportunities, often operate as exploitative cycles. People struggling to survive turn to degrading digital labor, which not only imposes psychological costs but can also severely compromise their future prospects. Once such information becomes public, being labeled an OnlyFans model or a compulsive gambler, for example, social stigma and algorithmic profiling can drastically limit one’s chances of securing traditional employment or reintegrating into society, much like what happens to Mike in the episode.
Structural lock-in: the monopoly trap
The absence of competition in the episode underscores a scenario of extreme monopoly. Rivermind is the only available option, and its terms are non-negotiable. This is a case of structural lock-in, in which users are not only dependent on the service but are trapped by it. In our contemporary reality, we see similar trends with dominant tech companies that centralize power over data, infrastructure, and interfaces, making them nearly impossible to replace.
On the regulatory front, there have been recent efforts to curb the power of platforms. The European Union, for example, has adopted important legislative frameworks such as the Digital Services Act (DSA), the Digital Markets Act (DMA), and the EU AI Act. The DSA aims to hold platforms accountable for the societal risks their systems pose, promoting transparency and the protection of fundamental rights. The DMA targets anti-competitive practices and seeks to ensure that large platforms do not abuse their market dominance. The AI Act, in turn, imposes risk-based obligations on providers and deployers of AI systems. Yet these initiatives face challenges in a global environment marked by deregulation and a new geopolitical “tech race”, particularly among China, the US, and the EU, over control of digital technologies.
We’re only human after all
In the episode’s conclusion, Amanda, reduced to a functional fragment of code, eventually requests that her existence be terminated. However, her decision is not one of pure despair but of lucid recognition of the condition imposed on her: there is no viable future for her within that system.
Common People is not an indictment of technology itself, but a sophisticated examination of the dangers of allowing technical systems, coupled with deregulated market forces, to replace institutional structures of care, protection, and justice. It invites us to think about neuro-rights, antitrust regulation, digital sovereignty, and the urgent need to build legal and political frameworks capable of safeguarding human dignity, and humanity as a whole, in the face of ongoing technological transformation.
This article was written by Ana Catarina De Alencar and Lindsay Langenhoven, two enthusiasts of technophilosophy, law, and AI ethics.