Beyond hand-eye coordination: An exploration of eye-tracking and speech recognition as a navigation tool for interactive systems
Independent thesis, Advanced level (Degree of Master, One Year), 10 credits / 15 HE credits. Student thesis.
The human ability to see, listen and speak is naturally embedded in how we interact and communicate with each other. Yet we do not only interact with other humans; we also spend a lot of time interacting with computers. In our study we take embodied interaction as a starting point, drawing on people’s abilities from everyday life and applying them to computation in the form of eye-tracking and speech recognition. Previous research has mainly explored these input modalities separately, and little is known about their combination. We applied a qualitative approach consisting of free surfs, task-based evaluations and ten interviews, aiming to understand how people perceive this form of interaction and to discover potential contexts of use. The results indicate that people are positive towards combining eye-tracking and speech recognition for interacting with computers, but found it hard to imagine a rich set of contexts in which the combination could be used.
Place, publisher, year, edition, pages
2015, 25 p.
Informatik Student Paper Master (INFSPM), 2015.06
Keywords
embodied interaction, eye-tracking, speech recognition, eye gaze, google maps, midas touch
Identifiers
URN: urn:nbn:se:umu:diva-104882
OAI: oai:DiVA.org:umu-104882
DiVA: diva2:821426
Master's Programme in Human-Computer Interaction
Harr, Rikard, professor
Waterworth, John, professor