'The Algorithms are listening, we need to listen back.'
31 July 2014
On Tuesday 15th July, Programme Assistant Natalie Kane spoke at South London Gallery for WYSIWYG?, an evening of talks which asked what will happen to art in the digital age.
As arts organisations move further into the digital, we are progressively surrendering our cultural material to computational systems. As a result, I’ve been looking at the breaks, the oddities and the weak signals of digitally mediated culture, particularly where algorithms are involved, to see what we need to watch for, or learn from.
Over the last few years we have become more and more aware of the role algorithms play in our understanding, absorption and filtering of culture. Yet we rarely know what algorithms really are; as the joke goes, ‘algorithm’ is the word an engineer uses for something they can’t be bothered to explain. An algorithm is a black box, and we aren’t privy to its workings: we can see what goes in and what comes out, but never what happens inside.
Algorithms have had a significant impact on the way we experience culture online. From Spotify suggestions to Amazon recommendations, Twitter feeds to social media timelines, algorithms mediate how we interact with the world around us. As Christopher Steiner says, ‘We are not always shaping the algorithms, they are shaping us. They shape our culture, they shape the way we see, they shape what we hear, they shape how we live.’ The culture we experience is mediated by the cumulative and the collective; often, what we see is what we get. Algorithms shape us by letting us see ourselves not as we are, but as the algorithm thinks we are, which in turn influences and contributes to our culture – because the algorithm is now, whether you like it or not, part of culture.
So, if algorithms are ‘an invisible architecture that underpins almost everything that is happening’, as Kevin Slavin observes, what does this mean for curators and arts institutions in the age of the algorithm? If the algorithm is always listening, and responding, what does that mean for us as the supposed filter of culture?
I’d like to draw your attention to the algorithm as curator.
Take the latest exhibit at the 9/11 Memorial Museum in New York. Overwhelmed by the responsibility, museum director Alice Greenwald elected to have an algorithm curate the exhibition, scraping news stories and content from across the globe to display in a multimedia, digital exhibit.
Why? In an interview with NPR’s Aarti Shahani, Alice Greenwald said that ‘she couldn’t entrust a single person to hand-select a few moments and stick together a story. An algorithm, on the other hand, finds correlation.’
The algorithm is logical, objective and clean, and deals only in truths. This is the assumption we make of the algorithm, and it holds on its own terms, because that is how algorithms are designed to work. But having to deal with difficult subject matter shouldn’t mean we have to find the most logical way of dealing with it (we are human, after all), and although the exhibit presents a collective account of what happened, it loses its emotional weight in its quest for objectivity.
However, this objectivity is not quite what we find in this particular exhibit. As Andrew Sliwinski remarked on Twitter, ‘Algorithms are not created without a point of view. They carry the perspectives and motivations of their authors’ – in this case, the motivations of their curator. The data set fed in to create this exhibit was dictated, and curated, by a human, with their own faulty methodology and biases. That the articles and reactions it threw up were almost entirely in English tells you that it came from an arguably narrow experience. This is not an objective record of events; it is a record of a specific experience, by a specific group of people. Stepping away from responsibility by applying technology supposedly removes your accountability, and stops any blame from finding you: when any breaks do happen, they are not, and cannot be, your fault, but that of the algorithm and the engineer. Of course, this is not the case, as the algorithm, and the engineer, are just doing what you’ve told them to.
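The point about curated inputs can be made concrete with a toy sketch. The code below is entirely hypothetical – it has nothing to do with the museum’s actual software – but it shows a ‘correlation-finding’ curator of the simplest possible kind: it ranks items purely by word overlap with the rest of its corpus. Its seemingly objective selection is wholly determined by the hand-picked, English-only corpus a human gives it.

```python
from collections import Counter

def curate(corpus, top_n=3):
    """Toy 'correlation-finding' curator (hypothetical, for illustration):
    ranks each item by how common its words are across the whole corpus.
    The output can only ever reflect what the corpus already contains."""
    word_sets = [set(doc.lower().split()) for doc in corpus]
    # Corpus-wide word frequencies: the 'correlations' being found.
    all_words = Counter(w for ws in word_sets for w in ws)
    # Score each document by the frequency of its words elsewhere.
    scores = [sum(all_words[w] for w in ws) for ws in word_sets]
    ranked = sorted(zip(scores, corpus), reverse=True)
    return [doc for _, doc in ranked[:top_n]]

# A hand-curated, English-only corpus: the 'objective' selection below
# is entirely determined by this human choice of input.
articles = [
    "eyewitness account from lower manhattan",
    "eyewitness account from brooklyn",
    "memorial service in lower manhattan",
    "global reaction to the attacks",
]
print(curate(articles, top_n=2))
```

Whatever this curator selects, it can only ever surface the perspective already baked into its inputs – here, the most ‘correlated’ items are simply the ones that resemble the rest of a narrow corpus.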
One of the issues that artists and institutions wrestle with is the alleged ‘tyranny’ of the curator, the omnipotent, omniscient rule of The Cultural Dictator. Of course, I’m being hyperbolic here, but it is a conversation that has run throughout art for centuries, and is still important to consider. So how can artists use digital platforms and new technologies to circumvent that? Introducing Cameron MacLeod’s Curatron.