Algorithmic Determinism
How AI became the new symbolic order
What does it mean to be an individual?
To have your own style, character, personality traits, habits, and preferences.
To speak in your own words, to form your own thoughts, to make your own decisions.
It’s something that is deeply tied to our idea of free will.
The capacity to choose, to define who you are, to shape your own identity. The belief that our decisions are born from our own intuition and perception of the world.
Over the past century, this question of how identity is formed has sparked countless debates across fields such as philosophy, psychology, and behavioral science.
Each discipline has tried to understand the tension between individual autonomy and the forces that shape our choices, from social structures and institutions to the technologies that mediate our attention and desires.
Today, with AI reaching into most aspects of our lives (whether we notice it or not), how do we begin to make sense of what it means to be “you”?
Today, I propose some ideas under the following context:
Algorithmic determinism
We are told we are unique, yet our digital footprints are analyzed, categorized, and predicted with startling precision. Are we truly making choices, or are we simply choosing between pre-selected options optimized to capture our attention and modify our behavior?
But most importantly, do we actually care?
My aim is to expand on the idea that we are living under the conditions of an algorithmic ‘Big Other’: a symbolic and technical system that both reflects and shapes our sense of self, industrializing identity under the illusion of choice.
The Illusion of Personalization
Perhaps the most visible expression of this tension is advertising.
Personalized advertising and recommendation systems are an ever-present force, one that feels more pervasive than ever.
Data privacy is a thing of the past. We have all by now blindly agreed to hundreds of terms and conditions, accepted thousands of cookies, and happily filled out our details for countless apps and websites.
To some extent, we all understand how algorithms and advertisements manipulate our habits and preferences. By now, most of us share the intuition that there are some underlying economic and psychological incentives that make AI a trillion-dollar industry.
But what is this abstract figure pulling the levers behind the scenes of our mind?
There’s no secret anymore; we all know that big companies, institutions, and governments use our data to “personalize” and “optimize” our human experience.
Yet what I want to provoke is that it may not simply be any one of these forces that govern and pull these levers. It could be something deeper, a symbolic order that unites them, something that has been best described as “the Big Other”.
Enter Lacan, Žižek, and “The Big Other”
The idea of the “Big Other” was first conceptualized by psychoanalyst Jacques Lacan in the early 1950s.
It refers to a kind of symbolic authority, a hypothetical observer or structure through which our sense of self is mediated.
The Big Other is not any specific person, nor institution, but rather an imagined entity that gives us meaning and validation for our actions, whereby our sense of self is always mediated by an external symbolic order.
Okay, it sounds kind of abstract, I agree, but to put it simply:
We can’t really point to one single force that makes us act the way we do.
There is no obvious unified person or authority telling us how to act or what to do.
Yet, we still act as if there is.
For instance, you don’t really know why you refrain from screaming in public, or why you feel a subtle discomfort breaking certain social norms. No one explicitly told you not to do it; there’s just this intuitive sense that it’s not okay.
That feeling, that sense of an invisible boundary, that is the Big Other at work.
Slavoj Žižek takes this further.
He argues that even if we claim not to believe in the Big Other, we still act as if it exists.
We follow social scripts, seek validation, and conform to invisible standards. The Big Other doesn’t have to be real; it only needs to be believed in.
This is where Žižek’s idea becomes eerily relevant to our digital age.
It offers a way to lightly psychoanalyze our behaviour in the sphere of AI and technology.
The Algorithmic Big Other
These thoughts were recently given concrete form in an article by Leon A. Salter and Mohan J. Dutta (2025), which develops the idea of “the algorithmic big Other.”
One of the paper’s claims is that workers (and people more broadly) are increasingly governed not by humans, but by algorithms.
It draws on Žižek’s concept of ideological fantasy:
People act as if they are free, even when they consciously know they are not.
Within the context of targeted advertising, we enjoy the illusion that we are in control of our preferences, choices, and consumer habits; yet intuitively, we know and accept that this is an illusion.
In this framing, “the big Other” is the web of complex machine-learning algorithms that we accept.
We know we are not free in our online choices and preferences, but we enjoy the illusion of control.
And maybe that’s why we don’t care.
Because it’s easier to maintain the illusion of choice than to confront the discomfort of not having one.
The Comfort of Control
Comfort is inherently more desirable than freedom.
Since we have already brought up the Slovenian provocateur and philosopher Slavoj Žižek, let me restate one of his most infamous arguments:
“Humans do not actually want true freedom.”
The Sublime Object of Ideology (1989)
Žižek suggests that while we may be consciously cynical about systems of power (we know very well we’re being manipulated, yet we go along with it anyway), we nonetheless need those structures to maintain a sense of identity and direction.
We all want to be part of a culture with certain customs and a state with certain laws. We do not want the burden of absolute chaos, so we cling to these systems of power.
So, in an age when we are inundated with a chaos of information, do we inherently want some sort of symbolic order to guide our choices?
As we are increasingly overwhelmed by a constant stream of ads, memes, news, and noise, perhaps what we truly crave is not freedom, but order.
A symbolic framework that filters and guides our choices.
We don’t want to navigate the infinite sprawl of the digital world on our own.
We want to see the world through a kind of tunnel, where excess noise is filtered out and meaning feels restored.
And that’s why, perhaps, we don’t mind that the algorithm has become our new Big Other.
So What Does This Mean for You?
Well, to put it quite frankly: not much.
We’ll continue to exist and evolve through technology, just as we always have. It’s not as if we can reverse the changes. We adapt, the world moves on, and new systems take shape around us.
You must be thinking, “Okay Marko, so why the hell bring it up?”
Because I think it’s important to learn how to observe our choices, behaviour, and habits; to step back and look at our life from a distance.
To deconstruct how you interact with the world, and in doing so, to make some sense of it.
These ideas really press against the notion of free will, a theme I will continue to provoke and explore.
My point is not to make you feel that your engagement with the world is futile, but simply to challenge your ego.
What makes you “you,” in a world where identity is constantly mediated, optimized, and sold back to you?
If we can begin to make sense of these questions, I believe we can gain a better appreciation of the world around us and realize that beneath our differences, we share a common paradigm: we are all shaped by the same invisible architectures.
I hope that didn’t sound too sensationalist. There is, of course, some major existential dread that will follow right alongside these questions.
But I guess that’s also part of the reason you decided to read to the end of this post.
If you enjoyed reading or listening to this newsletter, please like and consider sharing it on Instagram, X, or Facebook.
Much love, yours truly, Markothehuman