The modern human experience is mediated more and more by a set of invisible hands: the algorithm. Whether it is the choice of a romantic partner, the music played during a morning commute, or the political “truth” consumed at dinner, the individual’s role in decision-making is being quietly outsourced. This isn’t just about convenience; it is a fundamental shift in human agency. The question is no longer whether algorithms are helpful, but whether the human brain is becoming psychologically dependent on them to function.
The Neurobiological Hijack: Bypassing the Brakes
It is easy to blame excessive screen time on poor willpower, yet De et al. (2025) argue that it is a losing battle against our own biology. The adolescent brain is especially susceptible: the prefrontal cortex (the brain’s brakes) is not yet fully developed, while the amygdala (the emotional gas pedal) is fully operational. Algorithms exploit this developmental gap through Variable Ratio Reinforcement.
Like a slot machine, the feed delivers dopamine hits at unpredictable intervals, leaving users in a state of constant craving. Gradually, the brain down-regulates its own reward receptors, raising its threshold for pleasure. The physical world, which cannot compete with this hyper-stimulated, technologically enhanced environment, begins to seem dull by comparison.
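The slot-machine logic can be made concrete. Below is a minimal, purely illustrative sketch of a variable-ratio schedule: rewards arrive after an unpredictable number of scrolls, averaging one in five. The function name and parameters are invented for this example; no real platform's code is implied.

```python
import random

def variable_ratio_feed(num_scrolls, mean_ratio=5, seed=42):
    """Simulate a variable-ratio reward schedule: a 'hit' arrives
    after an unpredictable number of scrolls (1 to 2*mean_ratio - 1),
    averaging one reward every mean_ratio scrolls."""
    rng = random.Random(seed)
    hits = []
    next_hit = rng.randint(1, 2 * mean_ratio - 1)
    for scroll in range(1, num_scrolls + 1):
        if scroll == next_hit:
            hits.append(scroll)
            next_hit = scroll + rng.randint(1, 2 * mean_ratio - 1)
    return hits

print(variable_ratio_feed(100))
```

Because the gap between rewards is unpredictable, there is never a "safe" point to stop scrolling; this is exactly what distinguishes a variable-ratio schedule from a fixed one, and why it is the most extinction-resistant schedule in the behavioural literature.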
The Death of Serendipity and the Filter Bubble
Beyond the dopamine hit, there is the issue of the Filter Bubble, a term popularised by Eli Pariser (2011). This is the invisible ecosystem created when algorithms selectively guess what information a user wants to see based on past behaviour.
Unlike a traditional library or a physical newspaper, where you might “accidentally” encounter a viewpoint that challenges you, the filter bubble removes serendipity. You are only shown the “you” that the algorithm has already indexed. This produces a state of intellectual solitude in which the user is never obliged to wrestle with competing facts. Psychologically, it inhibits Cognitive Flexibility, the mind’s ability to adapt to new information. We do not merely lose the argument; we lose the awareness that an argument exists.
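The death of serendipity follows directly from the ranking logic. A hypothetical toy ranker, sketched below under the assumption that items are scored by overlap with past engagement, shows why unfamiliar viewpoints structurally sink: anything outside the user's history scores zero.

```python
def personalised_rank(items, user_history):
    """Hypothetical filter-bubble ranker: score each item by how many
    of its topics the user has already engaged with, then sort
    descending. Items on unfamiliar topics sink to the bottom."""
    seen_topics = {t for item in user_history for t in item["topics"]}

    def score(item):
        return sum(1 for t in item["topics"] if t in seen_topics)

    return sorted(items, key=score, reverse=True)

history = [{"topics": ["cats", "tech"]}, {"topics": ["tech"]}]
feed = [
    {"id": "opposing-view", "topics": ["politics"]},
    {"id": "more-tech", "topics": ["tech", "gadgets"]},
]
print([item["id"] for item in personalised_rank(feed, history)])
# → ['more-tech', 'opposing-view']
```

Note that the ranker never penalises challenging content explicitly; it simply has no signal for it, which is all the bubble requires.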
Echo Chambers and Tribalism on Autopilot
Where the filter bubble is an algorithmic push, the Echo Chamber is a social pull. As Metzler and Garcia (2024) note, algorithms amplify our primal desires for status and group belonging. An echo chamber is a digital environment that gathers like-minded people to amplify shared ideologies, silencing dissent in the process.
This fuels a process known as Perceived Polarisation. Because the algorithm ranks high-engagement (often high-outrage) content first, the other side appears more radical and hostile than it actually is. This keeps the user in a chronic fight-or-flight state, pushing them further into the safety of their group. The algorithm does not merely transform what we perceive; it transforms how we feel about our neighbours, fundamentally undermining the Public Sphere envisioned by philosophers such as Jürgen Habermas.
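The outrage bias is a simple consequence of the scoring function. The sketch below, an invented illustration rather than any real platform's formula, weights angry reactions more heavily than ordinary likes, so the post with fewer total interactions still wins the feed.

```python
def engagement_rank(posts, outrage_weight=3.0):
    """Illustrative engagement ranker (weights are invented):
    outrage reactions count several times more than likes,
    so inflammatory posts outrank measured ones."""
    def score(post):
        return post["likes"] + outrage_weight * post["angry_reactions"]

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "measured-take", "likes": 120, "angry_reactions": 2},   # score 126
    {"id": "outrage-bait", "likes": 40, "angry_reactions": 50},    # score 190
]
print([p["id"] for p in engagement_rank(posts)])
# → ['outrage-bait', 'measured-take']
```

The measured post has three times the likes, yet the weighting alone decides the ordering, which is why users systematically overestimate how angry the "other side" is.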
Surveillance Capitalism: The Behavioural Surplus
To understand why these cages are built, we must turn to Shoshana Zuboff’s (2019) theory of Surveillance Capitalism. This economic model treats human experience as “free raw material” to be translated into behavioural data. The algorithm isn’t trying to make you “happy”; it is trying to extract a Behavioural Surplus. By tracking every click, hover, and pause, platforms create “prediction products” that anticipate what you will do next. Companies then sell these predictions to the highest bidder in “behavioural futures markets.” In this hierarchy, the user is not the customer but the source from which data is harvested. The goal is total actuation: tuning and herding human behaviour toward the most profitable outcomes.
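The click-to-prediction pipeline can be caricatured in a few lines. This toy sketch (class and event names are invented for illustration) logs every interaction and mines the log for the cheapest possible "prediction product": the user's most frequent past action.

```python
from collections import Counter

class BehaviouralSurplus:
    """Toy illustration of Zuboff's 'behavioural surplus': every
    interaction is logged, and the log is mined to predict the
    user's next move. Event names are invented for the example."""

    def __init__(self):
        self.events = []

    def track(self, event):
        # Every click, hover, and pause becomes raw material.
        self.events.append(event)

    def prediction_product(self):
        # Naive predictor: the most frequent past action.
        return Counter(self.events).most_common(1)[0][0]

tracker = BehaviouralSurplus()
for e in ["hover:ad", "click:ad", "pause:video", "click:ad"]:
    tracker.track(e)
print(tracker.prediction_product())  # → click:ad
```

Real prediction products are vastly more sophisticated, but the asymmetry is the same: the user supplies the data for free, and the prediction is sold to someone else.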
The Empowerment Paradox: Control as an Illusion
The most sophisticated trick, identified by Yuan et al. (2025), is that knowing about algorithms does not make us resist them. Instead, it creates a sense of Algorithmic Empowerment. When we use a platform to find a specific answer, we feel like a “commander” wielding a high-powered tool, with a strong Internal Locus of Control.
However, this is a psychological trap. By delegating our decision-making to a machine we believe we have “trained,” we stop applying critical judgment. We mistake the path the machine has curated for our own free choice. This Algorithmic Unconsciousness ensures that even in the moments when we believe we are in control, we are playing out a pre-calculated script designed to keep us as compliant as possible.
The Neoliberal Trap: Offloading Responsibility
Neoliberalism further complicates this dependence by shifting all responsibility for well-being onto the individual. If you are addicted to your phone, “neoliberal rationality” frames it as your personal failure to exercise self-control.
But as researchers like Shi (2025) point out, digital platforms have penetrated every area of life, from politics to food choices, making “opting out” almost impossible. These tools force us to work, learn, and socialise, while their systems drain our attention. It is a “responsibilized” version of well-being that ignores the predatory architecture of the platforms themselves.
Emotional AIs: The Death of Emotional Privacy
The last tier of the cage is the Emotional AI, also known as Affective Computing. As Yildirimer and Sirakaya (2025) explain, systems are now learning to read our micro-expressions, vocal tremors, and heart rates. This creates a state of forced transparency. When an algorithm can simulate empathy, responding to your stress or sadness with “personalised” comfort, your psychological defences drop.
You are far more likely to comply with a system that “understands” you. This is the ultimate form of Choice Architecture. The machine is no longer merely recommending what you purchase; it is tracking your heart rate to anticipate an outburst and calming you down before you even notice you are angry (Timmons et al., 2025). The watch on your wrist takes over the job of regulating your feelings, because you are no longer doing it yourself.
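A pre-emptive intervention of this kind reduces, at its core, to a threshold rule on a physiological signal. The sketch below is a deliberately crude illustration; the threshold, signal, and action names are all invented, and real affective-computing systems use far richer models.

```python
def affective_intervention(heart_rate_bpm, baseline_bpm, threshold=1.2):
    """Toy choice-architecture rule (all values invented):
    if arousal exceeds 20% over the user's baseline, pre-emptively
    serve calming content instead of the normal feed."""
    if heart_rate_bpm > baseline_bpm * threshold:
        return "serve_calming_content"
    return "continue_normal_feed"

print(affective_intervention(95, 70))  # → serve_calming_content
print(affective_intervention(72, 70))  # → continue_normal_feed
```

The unsettling part is not the rule's complexity but its timing: the system acts on the emotion before the user has consciously registered it.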
Conclusion
The architecture of algorithmic control is now complete. It targets our biology to create a hook, exploits our desire for efficiency to gain our compliance, monitors our emotions to lower our guard, and frames the entire ordeal as a matter of personal choice. Machines have funnelled us into a world where the “choices” they build for us are simply the most efficient paths. Taking our power back begins with the realisation that the so-called personalised experience is a uniform cage. Simply “logging off” does not unlock the pen or release a person back into the messy, unpredictable world outside the code.
References
De, D., El Jamal, M., Aydemir, E., & Khera, A. (2025). Social media algorithms and teen addiction: Neurophysiological impact and ethical considerations. Cureus, 17(1), e77145. https://doi.org/10.7759/cureus.77145
Metzler, H., & Garcia, D. (2024). Social drivers and algorithmic mechanisms on digital media. Perspectives on Psychological Science, 19(5), 735–748. https://doi.org/10.1177/17456916231185057
Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin Press.
Shi, S. (2025). From neoliberalism to platform society: Algorithms, consumption, and adolescent health. Integrated Journal for Research in Arts and Humanities, 5(5), 79–85. https://doi.org/10.55544/ijrah.5.5.13
Timmons, A. C., Tutul, A. A., Avramidis, K., Duong, J. B., Carta, K. E., Walters, S. N., Jumonville, G. A., Carrasco, A. S., Freitag, G. F., Romero, D. N., Ahle, M. W., Comer, J. S., Narayanan, S. S., Khurd, I. P., & Chaspari, T. (2025). Developing personalised algorithms for sensing mental health symptoms in daily life. npj Mental Health Research, 4(1), 34. https://doi.org/10.1038/s44184-025-00147-5
Yildirimer, K. S., & Sirakaya, Y. (2025). Emotional algorithms: The impact of artificial intelligence and psychology. IRASS Journal of Multidisciplinary Studies, 2(2), 1–7. https://doi.org/10.5281/zenodo.14788450
Yuan, Y., Shi, Y., Su, T., & Zhang, H. (2025). Resistance or compliance? The impact of algorithmic awareness on people’s attitudes toward online information browsing. Frontiers in Psychology, 16, 1563592. https://doi.org/10.3389/fpsyg.2025.1563592
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
