A large volume of audio-visual content online
Artists, museums and the heritage sector are creating ever-increasing amounts of audio-visual content. One of the biggest issues facing the museum and heritage sector over the coming years will be how to manage that content and distribute it to the public and to researchers.
Tate is collaborating with the Department of Computing at Goldsmiths College, University of London, and the Goldsmiths Leverhulme Media Research Centre, as part of the Metadata Project, to produce an open-source application for tagging, searching and retrieving audio-visual content online.
Intuitive and collaborative tagging and searching
The application aims to demonstrate the potential for an intuitive search engine that allows new forms of tagging and searching audio-visual content: time-based, intensity-based and collaborative. Users will be able not only to tag videos as whole entities, but also to tag any moment within a video or audio file. They will also be able to set the intensity of a tag: a moment in a video or audio file can therefore be described as, for example, ‘very much about Jean-Luc Godard’ or ‘a little bit about Jean-Paul Belmondo’. Users will tag and set the intensity of tags in collaboration with other users, along the lines of a wiki for tags.
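As a rough illustration of the idea, a time-based, intensity-weighted tag could be represented along the lines of the Python sketch below; the class, field names and values are hypothetical and do not describe the project's actual data model.

from dataclasses import dataclass

@dataclass
class MomentTag:
    # Hypothetical data model, for illustration only.
    media_id: str        # identifier of the video or audio file
    tag: str             # free-text label, e.g. 'Jean-Luc Godard'
    time_seconds: float  # the moment in the file the tag refers to
    intensity: float     # 0.0 ('a little bit about') to 1.0 ('very much about')
    user: str            # contributor, so tags can be negotiated wiki-style

# One user marks a moment as strongly about Godard; another marks a
# nearby moment as slightly about Belmondo.
tags = [
    MomentTag('film-001', 'Jean-Luc Godard', 754.0, 0.9, 'alice'),
    MomentTag('film-001', 'Jean-Paul Belmondo', 760.5, 0.3, 'bob'),
]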
Users will also be able to search content in very detailed ways. They will be able to jump directly to relevant moments and see which moments other users were most passionate about. Topic-specific heat maps and time-dynamic tag clouds will provide a new experience of using audio-visual content. Tagging and retrieval will be unified in one intuitive user experience.
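Continuing the illustrative sketch above, a topic-specific heat map could be approximated by summing tag intensities into fixed-width time bins and surfacing the ‘hottest’ bins for a query tag; the functions below are likewise hypothetical and only suggest how such retrieval might work.

from collections import defaultdict

def heat_map(all_tags, query_tag, media_id, bin_seconds=10.0):
    # Sum intensities of the query tag into fixed-width time bins.
    bins = defaultdict(float)
    for t in all_tags:
        if t.media_id == media_id and t.tag == query_tag:
            bin_start = (t.time_seconds // bin_seconds) * bin_seconds
            bins[bin_start] += t.intensity
    return dict(bins)

def hottest_moments(all_tags, query_tag, media_id, top_n=3):
    # The time bins users were most passionate about, hottest first.
    bins = heat_map(all_tags, query_tag, media_id)
    return sorted(bins.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Jump straight to the moments most strongly tagged 'Jean-Luc Godard'.
print(hottest_moments(tags, 'Jean-Luc Godard', 'film-001'))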
A user-centred tool
Ultimately, the project aims to develop a user-centred tool that will allow audiences and academics to quickly tag, search, retrieve and play results drawn from large volumes of long-play content, as well as collectively negotiate the meaning of that content.