Humans in the Loop: AI, Labour, and the Politics of Representation in Digital Culture
Algorithmic Culture and Cinematic Resistance
We attended a film screening organized by Dr. Dilip Barad, and it turned out to be much more than a simple viewing session. It became an academic and reflective experience. The discussion around the film encouraged us to move beyond passive watching and begin analyzing cinema as a serious cultural text. I started observing narrative techniques, visual framing, symbolism, and character positioning more carefully.
This screening helped me understand that films are not just entertainment; they are powerful cultural documents that reflect social realities, political structures, and human emotions. The experience sharpened my critical thinking and deepened my understanding of how cinema engages with contemporary issues—especially technology and power.
Task 1 – AI, Bias & Epistemic Representation
A Critical Reflection on Humans in the Loop
Introduction: The Myth of Neutral Technology
In today’s digital world, Artificial Intelligence is often described as logical, objective, and free from human error. It is presented as a neutral machine that simply processes data. However, Humans in the Loop challenges this comforting belief. The film argues that AI systems are not independent from society; rather, they are shaped by the same cultural values, inequalities, and power structures that exist in the real world.
Instead of showing AI as a futuristic miracle, the film reveals it as a human-made system deeply influenced by ideology. It makes us question: If humans create and train AI, how can it ever be free from human bias?
Through documentary-style realism and thoughtful cinematic construction, the film exposes how algorithms inherit social prejudices. It suggests that technology does not stand outside culture—it grows from it.
1. The Illusion of Objectivity in AI
The phrase “human in the loop” technically refers to human supervision in AI training. But in the film, this idea expands into something philosophical: humans are never outside the system. They design it, feed it data, correct it, and silently maintain it.
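The technical sense of the phrase can be sketched as a simple feedback cycle: a model's low-confidence predictions are routed to a human annotator, whose judgment quietly becomes part of the system's output. The sketch below is purely illustrative and is not drawn from the film; every function and filename in it is hypothetical.

```python
# Illustrative sketch of a "human in the loop" annotation cycle.
# All names and data here are hypothetical stand-ins.

def model_predict(item):
    """Stand-in for a trained model: returns (label, confidence)."""
    guesses = {"cat.jpg": ("cat", 0.95), "blur.jpg": ("dog", 0.51)}
    return guesses.get(item, ("unknown", 0.0))

def human_label(item):
    """Stand-in for a human annotator correcting the machine."""
    ground_truth = {"blur.jpg": "cat"}
    return ground_truth.get(item, "unknown")

def annotate(items, threshold=0.8):
    """Route low-confidence predictions to a human reviewer."""
    labels = {}
    for item in items:
        label, confidence = model_predict(item)
        if confidence < threshold:
            # The "loop": human judgment silently corrects the system,
            # then disappears into the final, machine-attributed output.
            label = human_label(item)
        labels[item] = label
    return labels

print(annotate(["cat.jpg", "blur.jpg"]))
# {'cat.jpg': 'cat', 'blur.jpg': 'cat'}
```

Note that nothing in the final output records which labels came from the machine and which from the human, which is precisely the invisibility the film dramatizes.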
At first, AI appears powerful and efficient, trained on massive datasets. But the film gradually reveals that these datasets are not universal; they reflect dominant social perspectives. What is included in data, and what is excluded, becomes a political decision.
From a film studies perspective, this connects to the concept of representation. Just as cinema does not show reality directly but constructs it through framing and editing, AI constructs knowledge through classification and filtering. Both systems select, organize, and interpret reality rather than simply reflecting it.
Visually, the film reinforces this idea by showing workers labeling images and moderating online content. These workers, often from economically vulnerable backgrounds, are responsible for teaching machines how to “see” and “understand.” Yet they remain invisible in the global narrative of technological innovation. This contrast is powerful—it exposes the gap between those who build AI and those who profit from it.
2. Algorithmic Bias as a Social Problem
One of the film’s strongest arguments is that bias in AI is not just a technical glitch. It is rooted in culture.
When AI systems misidentify faces or reinforce stereotypes, the problem does not lie only in coding errors. It lies in the data used for training—data that may overrepresent certain groups while ignoring others. The film shows that algorithms learn from society, and if society is unequal, the system will reproduce that inequality.
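The mechanism can be shown with a deliberately toy example (my own illustration, not the film's): a "model" that simply memorizes label frequencies will extend the dominant group's pattern onto groups it has barely seen. All names and numbers below are invented for the sketch.

```python
from collections import Counter

# Toy sketch of how skewed training data yields skewed predictions.
# The "model" memorizes the most common label per group and falls
# back to the overall majority for groups it has barely seen.

training_data = (
    [("group_a", "approved")] * 90 +   # heavily represented
    [("group_b", "approved")] * 2 +    # barely represented
    [("group_b", "rejected")] * 8
)

overall_majority = Counter(
    label for _, label in training_data
).most_common(1)[0][0]

def predict(group, min_examples=20):
    seen = [label for g, label in training_data if g == group]
    if len(seen) < min_examples:
        # Underrepresented groups inherit the dominant pattern.
        return overall_majority
    return Counter(seen).most_common(1)[0][0]

print(predict("group_a"))  # "approved"
print(predict("group_b"))  # also "approved", though its own data says otherwise
```

The error is not a coding bug: every line behaves as designed. The inequality lives in the composition of the data, which is the film's point.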
This connects to the idea of ideology. In cultural theory, ideology shapes how reality is understood. It influences which voices are centered and which are marginalized. Similarly, AI systems reflect the dominant values embedded in their data sources.
The film contrasts glossy images of advanced technology with intimate scenes of human labor. This editing choice highlights the contradiction between corporate claims of neutrality and the hidden human effort behind AI systems. The message becomes clear: technology carries the imprint of social power.
3. Whose Knowledge Is Valued?
Another important theme in the film is epistemic hierarchy—the idea that some forms of knowledge are treated as more important than others.
In technological environments, measurable data and statistical outputs are considered legitimate knowledge. However, local experiences, cultural understanding, and emotional intelligence are often ignored. The film shows how data annotators correct AI errors, yet their insights are rarely acknowledged publicly.
This reveals a deeper imbalance. Those at the top, developers and corporate leaders, are celebrated as innovators. Meanwhile, the workers who train and refine the system remain invisible. Their knowledge is necessary but undervalued.
By focusing on these workers, the film challenges dominant narratives about technological progress. It shifts our attention from machines to humans, reminding us that intelligence, artificial or otherwise, depends on human contribution.
4. Technological Systems as Apparatus
Film theory teaches us that cinema operates through an apparatus—a structure that shapes perception. Cameras, editing, and projection systems influence how audiences interpret reality.
Similarly, AI functions through its own apparatus: algorithms, datasets, interfaces, and platforms. These structures determine what is visible and what is ignored.
The film cleverly reflects this idea by showing screens within screens. Workers stare at digital interfaces while algorithms process images in the background. This layered visual design reminds viewers that all knowledge is mediated. There is no pure, untouched reality—only interpretations shaped by systems.
By drawing parallels between cinema and AI, the film suggests that both are cultural technologies. Both shape perception. Both carry ideology.
5. Representation and Global Inequality
Representation is not only a cinematic concern—it is also central to AI. When certain communities are underrepresented in datasets, they are misrepresented by machines. This can lead to discrimination and exclusion in real life.
The film also highlights global inequality. Workers from economically disadvantaged regions train AI systems used by powerful corporations. This reveals a digital form of exploitation, where labor from the margins supports technological advancement at the center.
The imbalance is not only economic but intellectual. Those who control algorithms shape definitions of truth and knowledge. This reflects the broader connection between power and knowledge—those who define categories often define reality itself.
6. Human Responsibility and Possibility of Change
Despite its critical tone, the film does not portray humans as helpless. Instead, it emphasizes responsibility.
The concept of “human in the loop” becomes symbolic. It suggests that ethical intervention is still possible. Workers who question guidelines or notice unfair classifications demonstrate that resistance exists within the system.
This idea is hopeful. It reminds us that technology is not destiny; it is shaped by choices. If bias is socially produced, it can also be socially challenged.
Conclusion: Rethinking Progress
Humans in the Loop ultimately encourages us to rethink the meaning of technological progress. AI is not an independent force moving beyond humanity. It is deeply connected to human labor, cultural values, and power structures.
The film serves as a form of cinematic resistance. By revealing hidden labor and questioning neutrality, it invites viewers to become more critical digital citizens. It reminds us that behind every algorithm lies a human decision.
For me, this screening was not just an academic task. It changed the way I see technology. I now understand that innovation must be examined ethically and socially, not only technically. Cinema, like AI, shapes how we understand the world—and both demand critical awareness.
Task 2 — Labour & the Politics of Cinematic Visibility
Invisible Hands, Visible Screens: Labour in Humans in the Loop
Introduction: Rethinking Automation and Human Presence
In popular imagination, Artificial Intelligence is often associated with speed, autonomy, and automation. AI systems are portrayed as machines that function independently, minimizing the need for human involvement. Humans in the Loop disrupts this narrative by uncovering the hidden human labour that quietly sustains these systems. Rather than focusing on advanced technology alone, the film brings attention to data annotators: individuals who label images, filter content, and train algorithms, yet remain absent from mainstream discussions of innovation.
Through careful cinematic choices, the film critiques the conditions of labour under digital capitalism. Drawing upon ideas from Marxist film theory, cultural studies, and representation theory, this reflection examines how the film makes invisible labour visible, questions its social value, and exposes the unequal power structures embedded in technological production.
1. Seeing the Unseen: Cinematic Focus on Digital Labour
One of the most striking aspects of the film is its deliberate focus on labour that usually remains unseen. Data annotation, a task typically confined to private screens and isolated rooms, is transformed into a central cinematic subject.
The camera frequently settles on prolonged shots of workers sitting in front of computers. The glow of the screen dominates their faces, symbolizing both participation in a global digital system and personal isolation. The repetition of gestures (clicking, scrolling, tagging) creates a visual rhythm that emphasizes monotony. Unlike traditional representations of labour involving physical movement, this work appears static, yet mentally draining.
From a Marxist perspective, this repetition reflects alienated labour. Workers perform fragmented tasks without access to the final outcome of their effort. Their contribution is absorbed into large technological systems where ownership and recognition are absent. The film’s slow pacing allows viewers to experience this sense of temporal exhaustion, mirroring the workers’ own reality.
Sound design also plays a crucial role. The absence of dramatic music and the prominence of mechanical sounds—keyboard taps, mouse clicks—underscore the silence surrounding this labour. By avoiding spectacle, the film makes a political choice: it grants seriousness to work that society often overlooks.
2. Emotional Labour Behind the Screen
The film does not limit its exploration to physical routines; it also captures the emotional strain involved in digital labour. Many data annotators are tasked with reviewing disturbing or sensitive content, yet their psychological well-being is rarely considered within corporate frameworks.
Instead of exaggerated emotional scenes, the film adopts restraint. Subtle expressions of fatigue, discomfort, and detachment convey the emotional burden of the work. Interviews reveal strict productivity targets and constant monitoring, reinforcing how digital capitalism prioritizes efficiency over mental health.
This reflects the commodification of emotional labour. Workers are expected to suppress personal reactions while engaging with harmful material. Their emotions, like their time, become resources to be managed. Cultural Marxist theory helps us understand this process: capitalism does not only extract physical labour—it also exploits attention, emotion, and psychological endurance.
Visually, workers are often framed within enclosed domestic or workspaces. These settings suggest how digital labour collapses the boundary between private and professional life. The global digital economy enters personal spaces, extending control beyond traditional workplaces.
3. Hierarchies of Labour and Social Recognition
A recurring concern in the film is the unequal recognition of labour. While software engineers and corporate leaders are celebrated as the faces of innovation, data annotators remain anonymous and underpaid.
This reflects a clear class division within technological production. From a Marxist standpoint, those who own and design systems gain economic and symbolic power, while those performing repetitive tasks are pushed to the margins. The film exposes this imbalance by contrasting corporate language about progress with the everyday realities of workers.
Identity also plays a crucial role. Many of the workers portrayed belong to economically vulnerable communities, often from the Global South. Their labour is essential to global AI systems, yet their identities are rendered invisible. This selective visibility reinforces existing global inequalities.
Culturally, society tends to value intellectual and creative labour over repetitive or service-oriented work. By centering data annotators, the film challenges this hierarchy. The simple act of giving these workers screen time becomes a political intervention—an attempt to restore dignity to labour that has been culturally devalued.
4. Digital Capitalism and Fragmented Work
The film also reveals how digital capitalism restructures labour itself. Work is divided into small, disconnected tasks distributed across platforms and time zones. Workers rarely interact with one another, reducing collective awareness and solidarity.
This fragmentation is reflected in the film’s editing. Rapid shifts between locations, screens, and individuals create a sense of disconnection. The narrative does not follow a single protagonist but instead mirrors the scattered nature of digital labour networks.
From a Marxist film theory perspective, this fragmented structure exposes how labour becomes abstract. Human effort is converted into data, stripped of identity, and absorbed into seamless interfaces. Labour becomes invisible precisely because it is hidden behind the smooth surface of technology.
The film also dismantles the myth of full automation. While corporations promote AI as replacing human effort, the film reveals that automation relies on continuous human input. This contradiction exposes how technological narratives often mask exploitation.
5. From Visibility to Critical Awareness
The film invites viewers to engage on multiple levels. First, it fosters empathy by humanizing workers whose labour is usually reduced to statistics. Close framing, personal testimonies, and everyday details encourage emotional connection.
However, the film does not stop at empathy. It pushes toward critique. By revealing the gap between corporate narratives and lived experience, it asks viewers to question the economic and ideological systems behind technological development.
Most importantly, the film gestures toward transformation. By making invisible labour visible, it challenges dominant ideas about intelligence, automation, and progress. Visibility becomes political—what is seen gains value, recognition, and legitimacy.
From a cultural theory perspective, representation shapes understanding. By foregrounding marginalized labour, the film contributes to a broader rethinking of technology as a human, ethical, and social process rather than a purely mechanical one.
Conclusion
Humans in the Loop reframes Artificial Intelligence not as an autonomous force but as a system sustained by human effort. Through its cinematic strategies, the film exposes the labour, emotion, and inequality hidden beneath digital interfaces. It reminds viewers that behind every “smart” system are human hands—often unseen, undervalued, yet indispensable.
For me, this film reshaped how I understand both cinema and technology. It demonstrated how films can function as tools of resistance by revealing what power prefers to keep invisible. In doing so, it urges us to reconsider whose labour we recognize and whose we ignore in the digital age.
Task 3 — Film Form, Structure, and Digital Meaning
Nature, Screens, and Systems of Control in Humans in the Loop
Introduction: When Form Thinks for the Film
Cinema is not merely a vehicle for information; it is a way of thinking. In Humans in the Loop, the film’s critique of digital culture and artificial intelligence is embedded as much in its visual and structural design as in its spoken content. The documentary does not simply explain how human labour supports AI systems—it enacts this reality through its cinematic form.
By reading the film through the lenses of Structuralism, Film Semiotics, and Formalist narrative theory, we can see how images, sounds, and editing patterns function as arguments. The recurring contrast between natural environments and screen-dominated interiors becomes a visual language that articulates concerns about identity, labour, and technological abstraction.
1. Structuralism and Film Semiotics: Meaning Through Opposition
Structuralist theory proposes that meaning arises from relationships rather than isolated elements. In cinematic terms, this means that images and sounds gain significance through contrast and repetition.
The film is built around two dominant visual systems:
Open, natural environments
Enclosed, digitally mediated workspaces
These systems function as binary oppositions, a key concept in structuralist thought. Natural spaces are associated with openness, unpredictability, and lived human experience. Digital interiors, on the other hand, signify regulation, categorization, and control.
By repeatedly cutting from wide outdoor shots to cramped interiors filled with screens, the film establishes a symbolic tension. Human life appears expansive and fluid, while digital systems compress that complexity into data points. Through this opposition, the documentary suggests that algorithmic systems can never fully contain human reality.
2. Nature and Screens: Visualizing Philosophical Conflict
The contrast between natural imagery and digital environments is not decorative; it is thematic.
Outdoor scenes often unfold in long takes, with soft light and ambient sound. These moments feel unhurried, allowing human presence to exist without measurement. Nature here symbolizes subjectivity, emotion, and continuity.
In contrast, scenes inside digital workplaces are visually restricted. Frames are tight, lighting is artificial, and screens dominate the composition. Movement is minimal, reinforcing a sense of confinement.
From a semiotic perspective, these choices encode meaning:
Nature signifies embodied, experiential knowledge
Digital interfaces represent abstract, processed knowledge
The film rarely blends these spaces harmoniously. Instead, it places them in friction, implying that digital systems reshape—not simply reflect—human perception and experience.
3. Framing Labour: Camera as Ideological Tool
Formalist analysis draws attention to how technique influences interpretation. In this film, camera framing plays a decisive role in shaping our understanding of digital labour.
Close-up shots isolate workers within the frame, often lit by the glow of screens. This visual strategy personalizes labour while simultaneously showing how identity is filtered through technology. Facial expressions—fatigue, concentration, hesitation—become sites of meaning.
The frequent use of static shots further intensifies this effect. The camera’s stillness mirrors the repetitive nature of data-labelling work. Time feels slowed, forcing viewers to experience duration rather than efficiency. This creates empathy while also exposing alienation.
4. Editing and Structure: Fragmentation as Ideology
The film’s editing style reflects the logic of digital systems themselves. Short, segmented sequences (screen activity, interviews, environmental shots) are assembled into a fragmented narrative.
This structure echoes the organization of digital labour: divided into microtasks, distributed across workers, yet unified by algorithms. From a structuralist viewpoint, the form itself communicates ideology. Fragmentation becomes a cinematic representation of digital capitalism.
At key moments, however, the film resists this rhythm. Extended shots interrupt the fragmented flow, allowing human presence to exist outside algorithmic tempo. These moments feel quietly oppositional, emphasizing the difference between machine time and human time.
5. Sound Design: The Quiet Machinery of Digital Work
Sound plays a subtle but crucial role in shaping meaning. The film avoids heavy background music, allowing silence to dominate many scenes. This absence creates an atmosphere of emotional flatness and repetition.
When sound does appear, it is often diegetic: keyboard taps, mouse clicks, notification tones. These noises are amplified, forming a mechanical rhythm. Although the setting is quiet, the soundscape recalls industrial labour, suggesting that digital work is a contemporary form of factory production.
Natural sounds (wind, birds, distant voices) stand in stark contrast to these clicks. This auditory opposition reinforces the broader tension between organic life and technological systems.
6. Experiencing Labour Through Form
The documentary does not simply describe digital labour; it makes the audience feel it.
Slow pacing encourages contemplation
Repetitive visuals emphasize monotony
Confined framing generates emotional closeness and restriction
Juxtaposition produces philosophical unease
According to formalist theory, meaning is inseparable from form. Here, cinematic techniques embody the film’s critique. Viewers do not merely understand digital alienation intellectually; they experience it sensorially.
Workers are presented as both individuals and data processors. The alternation between personal testimony and abstract screen imagery captures this dual existence, revealing how identity is reshaped within algorithmic systems.
7. Human–AI Relations as Structural Design
AI in the film is never personified. Instead, it appears as an invisible framework—interfaces, rules, metrics, and workflows. This absence is deliberate.
Rather than focusing on machines as characters, the film foregrounds systems. Its narrative structure mirrors algorithmic logic: input, processing, output. Human labour becomes the connective tissue within this cycle.
This approach aligns with structuralist thought, where systems—not individuals—produce meaning. The film itself becomes a model of the technological processes it critiques.
Conclusion: Form as Critique
Taken together, Humans in the Loop functions not only as a documentary about artificial intelligence but as a philosophical examination of digital culture.
Earlier tasks revealed how the film challenges the myth of technological neutrality by exposing bias and power embedded within AI systems. It shows that algorithms are shaped by social hierarchies rather than objective intelligence.
The film also foregrounds invisible labour, revealing how digital capitalism depends on human effort while erasing it from public narratives. By centering marginalized workers, it disrupts celebratory discourses of automation.
Finally, through its formal design (visual oppositions, controlled framing, fragmented editing, and restrained sound), the film transforms cinema into critique. Its structure mirrors the very systems it interrogates, making form itself a mode of political and philosophical expression.
References:
Alonso, D. V. (2026). Imagining AI futures in mainstream cinema: Socio-technical narratives and social imaginaries. AI & Society. https://doi.org/10.1007/s00146-026-02880-7
Anjum, N. (2026). Aranya Sahay’s Humans in the Loop and the politics of AI data labelling. The Federal. https://thefederal.com/films/aranya-sahay-humans-in-the-loop-oscar-adivasidata-labelling-jharkhand-ai-tribal-216946
Apparatus: Film, Media and Digital Cultures of Central and Eastern Europe (ongoing academic journal). (n.d.). Retrieved February 15, 2026, from https://en.wikipedia.org/wiki/Apparatus_(journal)
Barad, D. (2026, January). Humans in the loop: Exploring AI, labour and digital culture [Blog post]. https://blog.dilipbarad.com/2026/01/humans-in-loop-film-review-exploring