Date – 2018-2019
Type – Interactive display using eye tracking
Location – Lausanne
Client – Plateforme 10
Credit – photos: Yannick Luthy, Gionna Mottua
From October 2018 to January 2019, INT was in residence at the Musée de l'Elysée to develop a sound narration device driven by eye movement. Unlike a traditional audio guide, the system aims to provide information about the work and its context according to where the visitor is looking.
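The core mechanism of such a device can be sketched as mapping gaze positions to regions of interest on the photograph, and starting a narration track once the gaze dwells on a region long enough. This is a minimal illustrative sketch, not the residency's actual implementation: the region names, coordinates, and dwell threshold are all assumptions.

```python
# Hypothetical sketch: mapping gaze positions to image regions that can
# trigger audio narration. Region names, coordinates, and the dwell
# threshold are illustrative assumptions.

# Regions of interest on the photograph, as (x_min, y_min, x_max, y_max)
# in normalised image coordinates (0.0 to 1.0).
REGIONS = {
    "foreground_figures": (0.10, 0.40, 0.55, 0.95),
    "artefacts_table":    (0.55, 0.50, 0.95, 0.90),
    "background":         (0.00, 0.00, 1.00, 0.40),
}

DWELL_THRESHOLD = 0.8  # seconds of sustained gaze before narration starts


def region_at(x, y):
    """Return the name of the region containing the gaze point, or None."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


def narration_trigger(gaze_samples, sample_dt):
    """Scan a stream of (x, y) gaze samples; return the first region
    fixated for longer than DWELL_THRESHOLD, or None."""
    current, dwell = None, 0.0
    for x, y in gaze_samples:
        region = region_at(x, y)
        if region == current and region is not None:
            dwell += sample_dt
            if dwell >= DWELL_THRESHOLD:
                return region
        else:
            current, dwell = region, 0.0
    return None
```

In practice a real system would also debounce the noisy eye-tracker signal and allow narration to be interrupted when the gaze moves away, but the dwell-then-trigger loop above captures the basic idea.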
This residency is part of a research project conducted by the three museums of PLATEFORME 10 under the initiative of Engagement Migros. After exploring high-definition digitisation, the project continues to create tangible links between the visitor and the digitised content. Controllers such as eye tracking offer new possibilities for interaction and open the way to new forms of interactive content.
Beyond the purely technical aspect, the residency aims to explore a form of non-linear narrative. The gaze influences the information and the information influences the gaze. Who controls who? Can this principle be applied to present an artist's work? How to create a coherent scenario? How do visitors react to the device?
The first prototype presents REEF IDLIB, May 3, 2014, a work by Matthias Bruggmann. The image shows antique dealers cleaning antique pieces. It is part of the exhibition Un acte d'une violence indescriptible, presented in the basement of the Musée de l'Elysée. By exposing us to the layers of information surrounding a complex war, Matthias Bruggmann reveals stories underlying the conflict that do not appear at first glance.
From 16 October to 4 December, we anonymously recorded visitors' gaze paths across the photograph. The results of our experiment can be seen as both printed and interactive data visualisations (see images below).
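One simple way to turn such recordings into a visualisation is to aggregate the anonymised gaze samples into a grid-based heatmap and extract the most-viewed cells. The sketch below is a hypothetical illustration, assuming samples arrive as normalised (x, y) pairs; it is not the visualisation pipeline actually used in the project.

```python
# Hypothetical sketch: aggregating anonymised gaze recordings into a grid
# heatmap. The grid size and the normalised (x, y) sample format are
# assumptions for illustration.

def gaze_heatmap(samples, grid_w=32, grid_h=32):
    """Count gaze samples falling in each cell of a grid_w x grid_h grid.
    Samples are (x, y) pairs in normalised coordinates [0, 1]."""
    grid = [[0] * grid_w for _ in range(grid_h)]
    for x, y in samples:
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        grid[row][col] += 1
    return grid


def hotspots(grid, top_n=3):
    """Return the top_n (row, col, count) cells, most-viewed first."""
    cells = [(count, r, c)
             for r, row in enumerate(grid)
             for c, count in enumerate(row) if count > 0]
    cells.sort(reverse=True)
    return [(r, c, count) for count, r, c in cells[:top_n]]
```

A printed visualisation could then shade each cell by its count, while an interactive version could replay individual gaze paths over the photograph.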
How long do people stay to observe the artwork and listen to its hidden narrative? Which elements of the image intrigued and attracted visitors the most? How could we create interesting and meaningful data visualisations from the recorded data?
Thanks to Brian Bendahan for the sound recording and editing, Claire Nicolas for her voice, and Matthias Bruggmann for his involvement in the project.