AND Hack: The Results!
by Gabrielle Jenks, Director, Abandon Normal Devices
This week AND hosted Anatomy Of The Archive, a two-day intensive lab as part of This Way Up 15. We invited participants to work collaboratively to interrogate the way we use, access and interact with moving image archives.
Here are the resulting projects developed by our seven teams:
Shelly and Neil developed #ReviewerBot 1.0 – a physical robot designed to roam the foyer and corridors of a cinema before screenings, spouting reviews for specific films. The idea is that the robot will become another voice added to the many audience discussions that precede a film viewing, and an ambient intervention in this space. The bot will generate content for its speeches by collecting and analysing existing online reviews for the film from archives like IMDb.
Audiences will be able to interact with the bot by using the hashtag #ReviewerBot to tweet their own expectations prior to watching a film – this will directly affect the bot’s positive or negative position on the film, and the ‘character’ of the robot they will encounter at the cinema.
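A minimal sketch of how the bot’s stance might be derived – the word lists, weighting and function names here are illustrative assumptions, not the pair’s actual implementation:

```python
# Hypothetical sentiment word lists – a real bot would use a proper
# sentiment model over reviews scraped from IMDb.
POSITIVE = {"brilliant", "gripping", "masterpiece", "loved"}
NEGATIVE = {"dull", "boring", "mess", "hated"}

def score_text(text):
    """Return +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def bot_stance(reviews, tweets):
    """Combine archive reviews with #ReviewerBot tweets into one position.

    A positive total makes the bot enthusiastic; a negative one makes it sour.
    """
    total = sum(score_text(t) for t in reviews + tweets)
    if total > 0:
        return "positive"
    if total < 0:
        return "negative"
    return "neutral"
```

Because audience tweets are scored alongside the archived reviews, a flood of excited hashtag posts could flip the bot’s ‘character’ before a screening.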
Chris and Ian’s project was a study of the ever-expanding archive and a provocation on the integrity of the data within it. Archives are not flat structures. Media is stored at different levels of retrievability depending on its age and importance. Older, unpopular artefacts are pushed further and further into the depths until they are forgotten. Current social media favours ‘now’, pushing recent, popular content to the top of the pile.
Who or what determines the position of content within the archive? This project is a visual representation of that question. By combining multiple data sources – from Cornerhouse, the BFI and IMDb – Chris and Ian created an algorithm that uses a film’s year, number of reviews, number of screenings and popularity rating to produce a single ‘freshness’ rating. This freshness, the film’s position within the archive, is represented visually by manipulating the film’s poster image to create glitchy compression artefacts. Viewing an item increases its freshness, reducing the glitch and pushing the media further up the levels of the archive.
Primarily a provocation on the integrity of the expanding archive, this project helps us to visualise what may have been forgotten. Taken further it could be developed into a more visually rich installation, tailored to specific archives or aggregating many.
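The pair’s exact formula wasn’t published, but a toy version of the idea might weight recency against review count, screenings and rating, then map the score onto a JPEG quality level (heavier compression artefacts for staler films):

```python
def freshness(year, n_reviews, n_screenings, rating, now=2015):
    """Hypothetical 'freshness' score in [0, 1] – 1 is fresh, 0 is forgotten.

    The weighting is a guess at the concept, not Chris and Ian's actual
    algorithm. `now` defaults to 2015, the year of the lab.
    """
    age = max(now - year, 0)
    raw = (n_reviews + n_screenings) * rating / (1 + age)
    return min(raw / 100.0, 1.0)

def jpeg_quality(f):
    """Map freshness to a JPEG quality setting: stale items get glitchier."""
    return max(1, int(f * 95))

def view(f, boost=0.05):
    """Viewing an item nudges its freshness up, reducing the glitch."""
    return min(f + boost, 1.0)
```

The poster manipulation itself could then be as simple as re-saving the image at the computed quality with an imaging library, letting the compression artefacts stand in for depth in the archive.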
Vicky Sherrat and Abi Goodman
This project was inspired by Dylan Marron, who watched major films and then edited them to show just the parts where people of colour speak – often reducing movies to just a few seconds. Vicky and Abi were interested in doing the same thing with gender representation in film. Rather than analysing and editing the footage manually, as Marron does, they created a program that scans the metadata in film scripts. This analysis created new datasets that reveal the proportion of dialogue spoken by women in films. In terms of archive, their approach is a way to sift through metadata to reveal trends and hidden narratives, which can be shown easily with graphics.
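A minimal sketch of the kind of script scan involved – the screenplay format and the character-to-gender mapping here are simplified assumptions; a real pipeline would draw the mapping from cast metadata:

```python
def dialogue_by_gender(script_lines, gender_of):
    """Count words of dialogue per gender in a simplified screenplay format.

    Assumes character cues are lines in ALL CAPS, and that the lines
    following a cue (until a blank line) are that character's dialogue.
    `gender_of` maps character names to 'f' or 'm'.
    """
    counts = {"f": 0, "m": 0}
    speaker = None
    for line in script_lines:
        stripped = line.strip()
        if not stripped:
            speaker = None  # blank line ends the dialogue block
        elif stripped.isupper() and stripped in gender_of:
            speaker = stripped
        elif speaker:
            counts[gender_of[speaker]] += len(stripped.split())
    return counts

def female_share(counts):
    """Proportion of dialogue words spoken by women."""
    total = counts["f"] + counts["m"]
    return counts["f"] / total if total else 0.0
```

Run across many scripts, a function like `female_share` yields exactly the sort of dataset that can be charted to expose trends across the archive.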
Florence Okoye and Tim Shaw @Tim4Shaw
Florence and Tim focussed on a location-based (and therefore non-linear) experience of the archive, and on how its users can contribute to it. They chose film footage of Manchester in 1961 from the BFI Britain on Film archive, found the location and revisited the site. On their journey to, from and at the site, they collected sound, data and moving image (a micro-archive) as a way of recording this place ‘now’.
They then overlaid this data with the original archive footage on a website. These activities demonstrate that an archive can be a continuation of various materials through time, not just a database of the past. Their work presents a non-traditional archive experience that closely connects the film to place, but also allows others to add to it and build archives that tap into the user as curator, without the barrier of needing expert knowledge.
Paul McManus and Nikos Stylianou
Paul and Nikos are software developers from the BFI, and in their project they asked big data to reveal its secrets. They are interested in how you can take awkwardly formatted archive databases and make them show relationships between the content. They worked exclusively with the BFI’s internal database, made up of 800,000 films and TV programmes and 1.8 million people and organisations.
During the hack they took that data out of its awkward, siloed format and made the pieces of data ‘talk’ to each other. This meant they could get answers to many film-related questions, such as: who collaborated with Alfred Hitchcock the most? Or, how many horror films were released in the 1980s?
Paul and Nikos can now find out the answers to all sorts of questions relating to the archive, and see their work as being useful for other developers working with archives who might want to find out, for example – just how many degrees of separation are there between Dolly Parton and Wes Anderson?
Hwa and Sandi’s hack was the start of a longer project which, on a practical level, tackles the issue of how to share personal archives and how to digitise them for the future.
The archive they worked with was Sandi’s personal moving image archive, which documents LGBTQ and black communities in Liverpool over more than 30 years, from the 1970s onwards. Sandi’s personal documentation of an underground scene captures stories that were not represented in mainstream media, covering local cultural festivals and gay club nights alongside intimate and frank conversations within the communities.
In the wider project, three short films will be made as outputs. During the hack Sandi and Hwa Young started to reformat some of the personal archives and created a short film which contrasts the Google image result representation of LGBTQ black communities with one woman’s frustration with mainstream representation.
Cristina Tarquini @CristinaTarquin and Dominik Koller
Cristina and Dominik wondered: what if an archive changed what it shows you based on your emotions? This led to some other questions about the future of creativity and art – What is the value of art created by a machine? Is creativity exclusive to humans, or can computers be genuinely creative? Is emotion a key part of creativity, and can machines be emotional?
To simplify these big questions for a two-day hack, they created a program that shows an image drawn from an ‘archive’ and reads your face to see whether you smile at it. Based on this emotional feedback, it goes on to show similar or different images from the archive, aiming to make you smile. As the size of the chosen archive increases, the algorithm can select emotionally effective images more reliably. The principle works with a variety of content, including moving images – anything that creates an emotional response in the viewer.
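The feedback loop can be sketched very simply. Here the smile detection is replaced by a boolean input and image similarity by a single tag per image – both loud simplifications of what the real program (which reads your face) would do:

```python
import random

def pick_next(archive, current, smiled):
    """Choose the next image to show based on whether the viewer smiled.

    `archive` maps image ids to a tag (a crude stand-in for real image
    similarity). A smile keeps the program among similar images; no
    smile makes it jump to a different kind of image.
    """
    tag = archive[current]
    if smiled:
        pool = [i for i in archive if archive[i] == tag and i != current]
    else:
        pool = [i for i in archive if archive[i] != tag]
    return random.choice(pool) if pool else current
```

In the real installation the `smiled` flag would come from a face-reading step (e.g. a webcam and a smile classifier), and a larger archive gives the loop more candidates to steer towards.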
While this is a basic example, it brings up important questions about the nature and value of art in a world where computers might become as creative as we are. And for archive, it explores the possibility of an archive changing what it shows you based on your emotions.