This Controversial Artist Matches Influencer Photos With Surveillance Footage
‘The Followers’ uses artificial intelligence and facial-recognition technology to comment on the surveillance state
It’s an increasingly common sight on vacation, particularly in tourist destinations: An influencer sets up in front of a popular local landmark, sometimes even using props (coffee, beer, pets) or changing outfits, as a photographer or self-timed camera snaps away. Others mill around, sometimes watching. But often, unbeknownst to everyone involved, another device is also recording the scene: a surveillance camera.
Belgian artist Dries Depoorter is exploring this dynamic in his controversial new online exhibit, The Followers, which he unveiled last week. The art project places static Instagram images side-by-side with surveillance-camera footage of the photoshoots in question.
September 12, 2022
On its face, The Followers is an attempt, like many other studies, art projects and documentaries in recent years, to expose the staged, often unattainable ideals shown in many Instagram and influencer photos posted online. But The Followers also tells a darker story: one of increasingly worrisome privacy concerns amid an ever-growing network of surveillance technology in public spaces. And the project, as well as the techniques used to create it, has sparked both ethical and legal controversy.
To make The Followers, Depoorter used EarthCam, a network of publicly accessible webcams around the world, to record a month’s worth of footage at tourist attractions like New York City’s Times Square and Dublin’s Temple Bar Pub. He then enlisted an artificial intelligence (A.I.) bot, which scraped public Instagram photos taken in those locations, and facial-recognition software, which paired the Instagram images with the recorded surveillance footage.
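Depoorter hasn’t published his code, but the matching step he describes—comparing faces found in Instagram photos against faces found in webcam frames—can be sketched in general terms. Face-recognition systems typically reduce each face to a numeric embedding vector and treat two faces as the same person when the vectors fall within a distance threshold. The short vectors, IDs and threshold below are illustrative assumptions, not his implementation:

```python
# Hypothetical sketch of the matching step described above: compare face
# embeddings from Instagram photos against embeddings from webcam frames.
# Real pipelines produce high-dimensional vectors (e.g. 128-d) from a
# face-recognition model; toy 3-d vectors are used here for illustration.
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_faces(instagram_faces, webcam_faces, threshold=0.6):
    """Pair each Instagram face with its closest webcam face under a threshold.

    Both arguments map an ID to an embedding vector. Returns a list of
    (instagram_id, webcam_id, distance) tuples for likely matches.
    """
    matches = []
    for insta_id, insta_vec in instagram_faces.items():
        best_id, best_dist = None, float("inf")
        for cam_id, cam_vec in webcam_faces.items():
            d = euclidean(insta_vec, cam_vec)
            if d < best_dist:
                best_id, best_dist = cam_id, d
        if best_id is not None and best_dist < threshold:
            matches.append((insta_id, best_id, best_dist))
    return matches

# Toy data: in practice these embeddings would come from a face-recognition
# model run over scraped photos and recorded webcam frames.
instagram = {"post_1": [0.1, 0.9, 0.3], "post_2": [0.8, 0.2, 0.5]}
webcam = {"frame_a": [0.12, 0.88, 0.31], "frame_b": [0.5, 0.5, 0.5]}
matches = match_faces(instagram, webcam)
```

The unsettling point the project makes is how little this logic requires: once public footage and public photos are in hand, the pairing itself is a straightforward nearest-neighbor search.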
Depoorter calls himself a “surveillance artist,” and this isn’t his first project using open-source webcam footage or A.I. Last year, for a project called The Flemish Scrollers, he paired livestream video of Belgian government proceedings with an A.I. bot he built to determine how often lawmakers were scrolling on their phones during official meetings.
“The idea [for The Followers] popped in my head when I watched an open camera and someone was taking pictures for like 30 minutes,” Depoorter tells Vice’s Samantha Cole. He wondered if he’d be able to find that person on Instagram.
Public reaction to the project has been mixed; some have praised Depoorter for drawing attention to the modern surveillance state, while others have criticized the project as a flippant demonstration of how easy it is to access livestream footage and pair it with facial-recognition software. Many of these critics have urged the artist never to make the A.I. he developed public.
“Please don't ever release this, make it publicly available or sell it to someone who doesn't need it.” — Josh W (@welfordian), September 12, 2022
“Art does many great things, including stir generative discussions and debate about life as we know it,” Francesca Sobande, a digital media scholar at Cardiff University, tells Input’s Chris Stokel-Walker. “However, art projects can also have harmful effects. Such harms should not be brushed aside in discussions about art and the technology that is sometimes central to it.”
Depoorter tells Hyperallergic’s Rhea Nayyar that he won’t be releasing the software. Still, he says, “I’m only one person. I have limited access to data, cameras … Governments can take this to another level.”
The Followers has also hit some legal snags since going live. The project was originally up on YouTube, but EarthCam filed a copyright claim, and the piece has since been taken down. Depoorter tells Hyperallergic that he’s attempting to resolve the claim and get the videos re-uploaded. (The project is still available to view on the official website and the artist’s Twitter.)
Depoorter hasn’t replied directly to much of the criticism, but he tells Input he wants the art to speak for itself. “I know which questions it raises, this kind of project,” he says. “But I don’t answer the question itself. I don’t want to put a lesson into the world. I just want to show the dangers of new technologies.”