16 June 2015
Do we know where our images end up online? Storyteller & Technologist Natalie Kane explores the digital ghosts that haunt the web.
The photographs we take of ourselves, or have taken of us, are the visual record of our existence in the world. However, we are finding that occasionally they will go on to live lives without us, in darker, or more abstracted places than we initially intended. Our images become involuntary nomads, displaced across the internet to places we don’t predict, permit, or have knowledge of.
I remember the proposed change last year to Instagram’s terms of service, under which the images you took would automatically become the company’s property, allowing it to sell them on to third parties for use in advertising. The public backlash was enormous, with many leaving the service for rival image-sharing websites, justifiably upset at this abuse of their images. The main fear, aside from the breach of intellectual property rights, was that our faces, our friends, our memories would be associated with something we hadn’t consented to. They would be displaced from our own, mediated context into another world with its own environment, all out of our control.
We don’t really know where our images are going when we take them; often we don’t think that far. We don’t see the extent to which our images can be abstracted until we find them in the wrong place. When we take photographs of ourselves, or allow a photograph of ourselves to be taken, we don’t imagine it will be transferred to a place it doesn’t belong, seen by people we never imagined would see it. If we live online, and openly so, then this is where the others live – the replicated ghosts of ourselves, with new meanings, new territories, and new troubles.
James Bridle recently wrote on the phenomenon of ‘Render Ghosts’ for Electronic Voice Phenomena – the people who exist on the giant, glossy billboards encasing development sites. These stock-image renderings are the pixellated humans used by architects to show how spaces will look when there are people in them. As James explains, Render Ghosts ‘live inside our imaginations, in the liminal space between the present and the future, the real and the virtual, the physical and the digital.’ They live the lives predicted for us by the plans of the buildings we construct, permanently maintaining their pixellated grins.
As Bridle says, we do not know who these people are, or where, when, and why they surrendered their image. Did they know they would become part of an idealised city? We can only guess that they were once part of a stock image or brochure, eventually taken out of context and transplanted across the world into buildings, plazas, parks. In the summer of 2013, James went hunting for these ghosts, to find out what it feels like to be part of a network, ‘endlessly reproduced, endlessly pixelated.’ He found nothing except an empty house, and ended up in the desert outside Albuquerque. Only the images remain, out of context, and out of mind.
We may never be sure where the photographs we upload end up, and we certainly can’t anticipate the fate of images we never knew existed in the first place. In our work at Lighthouse, we have acted as executive producer for the latest round of BFI shorts, a scheme supporting the production of new work by emerging talent. One of these shorts has become particularly relevant to our work exploring the world post-PRISM, looking at the hidden networks that exist just below the surface. In Stephen Fingleton’s short film SLR (to be released online in January 2014), we encounter the world of voyeur pornography, an online community orchestrated by those who take explicit photographs of women when they least expect it. The protagonist is seen harvesting hundreds of images from a forum, lurking on the periphery, until he is forced to confront the network head on when a particular set of images is uploaded. I won’t spoil it for you, but the rapid acceleration from passively to actively participating in a network of exploitation is chilling, and timely.
Like the images obtained through RAT (Remote Access Tool) networks, the digital ghosts in SLR are entirely unaware that any image exists until it is exposed, as in the case of Miss Teen USA. This is where the hidden network they belong to becomes visible. These women aren’t looking for images of themselves, and most image-search algorithms aren’t sophisticated enough to identify a person from various angles without a pre-existing, similar image to work from. When we don’t think we’re being watched, we turn to a different public performativity, one we aren’t used to identifying in the images we self-curate and place online. Here we are vulnerable, particularly as women, without the tools or means to protect ourselves.
Earlier this year, we learned of one of the unseen, or at least least-anticipated, problems of sharing images of ourselves online. Student Rehtaeh Parsons, after months of bullying both on and offline, took her own life. Her image was circulated across media channels, blogs, and social media sites, where it was eventually collected into the data banks of an image-scraping algorithm. Her photos appeared months later in a Canadian dating advert on Facebook. Her family, and those who recognised her, were horrified, and rightly so. When things like this happen, we imagine there is something, or someone, to prevent such behaviour; we don’t anticipate that the decision was governed by an algorithm operating blindly, instructed to gather images of women from a specific demographic.
We’re already feeling the impact of the thousands of ‘ghost’ profiles that exist online, unable to be taken down by anyone other than the user. Every year I am reminded of a friend’s death by Facebook’s cheery, unknowing suggestion to wish them a Happy Birthday. Many of our friends still do. We collectively experience the ghost in the machine, indirectly encountering and interacting with the social network activity of the dead. ‘X also likes this’, when in fact the statement belongs firmly in the past tense.
Interestingly, Andrew Ennals, the man who discovered the ads, blamed a flaw in the algorithm for Rehtaeh’s appearance in the Canadian dating ads, rather than the motivations behind its use. As you may or may not know, Facebook actively uses profile pictures ‘in connection with commercial, sponsored, or related content (such as a brand you like) served or enhanced by us’. The company responsible for the ad featuring Rehtaeh, ionechat.com, is yet to comment.
Algorithms do not know the context of a photograph; they don’t understand, or pre-empt, the consequences of their own function. They do not share our faulty methodology: the algorithm is blameless, and it is we, its creators, who are essentially at fault. As Andrew Sliwinski said, ‘Algorithms are not created without a point of view. They carry the perspectives and motivations of their authors.’ This isn’t to say that the creators of this algorithm planned for, or knew, that one day their code would throw up the image of a dead girl; it is probably fair to say they simply never looked that far ahead.
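That blindness can be made concrete. Here is a minimal, entirely hypothetical sketch of such a scraping step – the function name, fields, and filter criteria are invented for illustration, not taken from any real system – showing how the author’s point of view lives in the filter, while the code itself has no notion of context or consent:

```python
# Hypothetical sketch: a 'blind' image-selection step. The demographic
# filter encodes its author's perspective; the code cannot know whether
# a photograph's subject consented, or how the image entered the dataset.

def scrape_ad_images(profiles, target_gender="female", age_range=(18, 25)):
    """Return image URLs whose profile metadata matches the chosen filter."""
    lo, hi = age_range
    return [
        p["image_url"]
        for p in profiles
        if p.get("gender") == target_gender and lo <= p.get("age", 0) <= hi
    ]

profiles = [
    {"image_url": "a.jpg", "gender": "female", "age": 21},
    {"image_url": "b.jpg", "gender": "male", "age": 22},
    {"image_url": "c.jpg", "gender": "female", "age": 40},
]
print(scrape_ad_images(profiles))  # selects a.jpg only, purely on metadata
```

Nothing in the function can tell whether an image’s subject is alive, consented, or would be horrified to appear in an advert; that judgement, if it exists anywhere, sits with whoever wrote the filter.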
We also can’t tell where the code originated, or how innocently it was created. At best, we can guess that the code for pulling and repurposing women’s profile pictures was once a developer’s solution to some unknown, local problem. That it has since gained a darker, exploitative purpose was unpredicted, but inevitable. The algorithm is fast becoming an appropriated technology, reaching a point where solutionist thinking in programming is failing. This is not a reason to stop creating new things; however, we are now more sure than ever that our bodies leave ghosts.
This article originally featured on Futures Exchange, a collection on Medium, edited by Frank Swain.