A Simple Social Media Experiment

The gaze—this focused, eager, or curious way of seeing—has always given power to the viewer. Just like the invention of perspective in painting changed how images were created for the eye, social media feeds are now designed to pull our attention inward, making us feel like everything is arranged for us to look at.

Dalí’s work with the “six virtual corneas” shows many reflections all pointing back to one person, which is very similar to how our screens work today. Everything seems to come toward us. Our phone becomes a small hole through which the whole world arrives, and inside that hole, we become the center. It’s private—we hold it in our hands, close to our body—but at the same time, we are never really alone, because others are always watching, posting, reacting, and appearing on the other side.

This is why I began with the gaze: it helps show how social media and our phones create this strange situation where we are both completely by ourselves and constantly surrounded.

Humans have become both the subject and the object, while machines—algorithms—are now shaping and enforcing our behavior. What we do online is constantly monitored, predicted, and nudged, turning our actions into data that controls how we experience the digital world. In this system, visibility and participation are no longer just our choice—they are guided, limited, and sometimes even forced by the very technology we created.

For Foucault, the feeling of always being seen—even when we don’t notice it—changes how we behave. Social media takes this even further: we choose to be visible, we perform for visibility, and visibility becomes a kind of pressure. The TikTok moment shows how easily the trap appears the second another gaze enters the scene.

Building on this, the Orb by World makes the same point in a more personal and invasive way. The Orb is a biometric device that scans a person’s iris and face to generate a unique “World ID,” verifying them as a real human. Unlike code or algorithms alone, the Orb is a physical object that sees us, turning our bodies into data. Just like the Autonomy Cube reveals the hidden infrastructure of the Internet, the Orb exposes how emerging technologies are not just abstract systems—they are tangible devices controlling, verifying, and mediating our presence in the digital world. 

 

Now such devices, with data created from our flesh, confirm whether we are human. Aren’t the machines looking back at us?

 

Trevor Paglen's ImageNet Roulette

Social Experiment (cont.)

This aligns with a posthuman perspective: as N. Katherine Hayles writes in How We Became Posthuman, “The posthuman view privileges informational pattern over material instantiation.” We are no longer just users of technology; we are already hybrids of flesh and code, co-produced by the systems we interact with. Our digital identities, behaviors, and even social interactions are inseparable from the infrastructures—both physical and algorithmic—that mediate them.

In this sense, CAPTCHAs, Instagram verifications, the Autonomy Cube, and the Orb are all part of a continuum. They reveal how our bodies, actions, and presence are folded into informational patterns that machines can read, respond to, and even control. We exist not as purely human agents in a digital world, but as material-informational beings, constantly negotiated between flesh and algorithm.

From there I started thinking about how for older generations, photographs were physical objects; they were real and limited and unchanging. They lived in albums and boxes and frames. Looking at them meant taking a pause, having a quiet moment of contemplation. Holding, say, a framed photo of someone's grandparents makes you look closely at the details, imagine their lives, feel a real connection to the past.

Unlike digital images, which we scroll through quickly and endlessly, these physical photos make the memory tangible. Seeing a photo meant paying attention and being present. A frame isn’t just something to hold a picture in place; it’s a space for thinking, remembering, and feeling near history.

This idea of the “Social Media Reflex” also connects to Amalia Ulman’s Excellences & Perfections (2014). Ulman spent four months carefully curating an Instagram profile that documented the life of a wannabe it-girl in LA—complete with staged events like a (fake) boob job and public apology. Nearly 90,000 followers became invested in her story, only to discover it was entirely a hoax.

 

Her project highlights how we participate in the construction of online identities, responding to cues, narratives, and aesthetics, often without realizing it. Just as I noticed patterns in my own posts—mirror selfies versus regular selfies, the unconscious adjustments of the Social Media Reflex—Ulman’s experiment shows how social media can shape both the way we present ourselves and the way we perceive others, blurring the line between reality and performance.

Instagram Face or

Tiktok body?

The Body I Post

"Visibility is a trap"

- Michel Foucault

Towards the end I got a bit experimental with this approach to data, memory, and materiality: I drew the only memory I have in my brain of someone close who isn’t alive anymore, someone who had no digital trace.
Then I erased the person out of it and photographed the sketch. I didn’t post it.

 

Now I just wonder at its vulnerable state, this close to being forever included in this data-driven world.

This research examines the intersection of social media, surveillance, and emerging technologies like AI, which work together to inform contemporary experiences of the body. By tracing connections among various artworks, digital interventions, social experiments, and personal archives, the project examines how these systems collectively influence visibility, self-presentation, and identity in the digital age.

Our digital archives
“produce memory as flow rather than stock.”
-José van Dijck

Completely Automated Public Turing test to tell Computers and Humans Apart

@trulyauthentic.exe

@trulyauthentic.exe

@glitchinginrealtime

Thank you.

C A P T C H A

The body is in a constant loop of self-presentation.
Posing is self-imposed, guided by metrics and feedback.
Beauty standards are co-created by algorithms and globalized instantly.
Performance is continuous, habitual, and data-driven.

 

And once we’ve posted on social media, it’s not just “out there”: it becomes part of a permanent archive. Platforms like Facebook, Instagram, and even YouTube hold all the content we’ve shared over time, making it accessible to us whenever we want to revisit it.

 

Features like “On This Day” in photo apps show how algorithms stretch time, highlight certain moments, and recontextualize them, constantly reshaping the narrative of who we are.

@glitchinginrealtime

Andy Picci’s video Let Me Out makes this idea even clearer. In the work, Picci’s face becomes stuck inside a filter—trapped inside his own digital image. He struggles to escape the version of himself created by social media. Once our bodies enter these systems of constant visibility, it becomes hard to tell where our real selves end and the performed, filtered versions begin.

Face to Facebook (2011)

Face to Facebook pushes the idea of visibility even further. By stealing one million Facebook profiles and reorganizing them through facial-recognition software, the artists showed how easily our online identities can be taken, sorted, and reused without our control. What was once shared for friends suddenly became part of a dating site made by strangers.

One woman whose photo appeared on the fake dating site even began receiving unwanted emails from men she had never met. Her private image, taken from her own profile, had been turned into an invitation she never wrote.

The strong reactions—lawsuits, threats, and media outrage—show how uncomfortable people become when they lose ownership of their own face. The project reveals a central point of this research: once our images enter digital systems, visibility is no longer just something we choose—it becomes something that can easily happen to us.

Inspired by these ideas, I conducted a small experiment on Instagram. I created two accounts: one linked to my real identity, and another with nothing to connect it to me. Right away I noticed a stark difference. The first account (@trulyauthentic.exe) worked smoothly: posting, scrolling, and interacting was easy.

For the second (@glitchinginrealtime), many checks stood in the way. I had to prove that I was human with CAPTCHAs, share my phone number, enter OTPs, and even submit a video of my face.

What began as a simple act of observing my behavior soon developed into a more profound introspection. Technology that was supposed to make things easier now felt controlling. Features that used to be convenient, like automatic logins or friend suggestions, created barriers for people without established digital identities.

That made an important point: the systems we built for connection and convenience can also limit us. Online, our visibility and activity are not just about choice; they are controlled by rules and algorithms. What used to feel easy now comes with many checks, showing how our digital presence is managed in ways we don’t always notice.

Extending further into the workings of this data-creating technology, Trevor Paglen’s ImageNet Roulette (with Kate Crawford) exposes biases embedded in AI training datasets. As Paglen notes, “We want to show how layers of bias and racism and misogyny move from one system to the next… to let people see the work that is being done behind the scenes, to see how we are being processed and categorized all the time.”

Continuing my social media experiment, I started noticing patterns in the posts I made for both accounts. The kinds of photos I chose, the tone of the captions, even the way I framed myself began to shift depending on which account I was using. This made me look back through my own archives, trying to see if there was a particular pattern I kept repeating—certain poses, styles, or moods that seemed to “fit” each identity.

This project reminds us that we are not neutral users of technology. From our flesh to our actions, from images to behaviors, machines process, classify, and even judge us—often in ways shaped by hidden biases. Together, these examples reveal how digital systems not only mediate our presence but also encode social and cultural assumptions, showing how deeply intertwined human identity and algorithmic power have become.

A photo frame of my grandparents

Who Controls Whom Now?

For Autonomy Cube, Trevor Paglen (with Jacob Appelbaum) created a clear Plexiglas cube that contains computer hardware. It acts as a public Wi‑Fi hotspot, but instead of a normal connection, it routes all traffic through the Tor network, anonymizing users.

For me, though, it draws focus to the physicality of the system: the actual hardware, the wires, and the machines behind everything we take for granted online. It’s a reminder that behind every smooth digital interaction there is a tangible, complex system at work, quietly shaping and enabling our experiences. And these are the same kinds of systems now confirming our identities.

A video of a couple recording a few TikTokers who are recording themselves for another audience, then shying away once they realize they are being watched.

Further, I wanted to dig deeper into the materiality aspect and understand how it affects the narrative through my personal archive. To juxtapose, I chose photos of my grandparents again, from a trip a few months back in which they revisited their childhood homes.

 

Here I am interested in how these digital pictures will now set the narrative for our future viewings. For example, the screenshot below is from our family chat, where the comments are already discussing the future narratives around them.

This means we don’t just show our bodies —
we train them for visibility.

Looking closer at the posts, I noticed something subtle but consistent. In the four images I examined, my mirror selfies all had me looking down, while in actual selfies my head tilted slightly up. This small difference made me realize how my body responds differently depending on context, audience, or platform—a kind of unconscious adjustment. I’ve started calling this the “Social Media Reflex”: the instinctive ways we position, frame, or perform ourselves online, often without realizing it, shaped by the imagined gaze of others and the systems we post into.

Upon further thought, I became more interested in understanding the materiality of a photo and its effect within those personal archives. So I pulled up these photos, taken recently during their revisit to their childhood homes. How these digital images will be consumed from here on is something that really interests me.