A test-bed for text-to-speech-based pedestrian navigation systems
2013 (English) In: Natural Language Processing and Information Systems: 18th International Conference on Applications of Natural Language to Information Systems, NLDB 2013, Salford, UK, June 19-21, 2013. Proceedings / [ed] Elisabeth Métais, Farid Meziane, Mohamad Saraee, Vijayan Sugumaran, Sunil Vadera, Springer Berlin/Heidelberg, 2013, 396-399 p. Conference paper (Refereed)
This paper presents an Android system to support eyes-free, hands-free navigation through a city. The system operates in two distinct modes: manual and automatic. In manual mode, a human operator sends text messages, which are realized via TTS in the subject's earpiece. The operator sees the subject's GPS position on a map, hears the subject's speech, and sees a 1 fps video feed from the subject's phone, worn as a necklace. In automatic mode, a programmed controller attempts to achieve the same guidance task as the human operator. We have fully built the manual system and have verified that it can be used to successfully guide pedestrians through a city. All system activities are logged into a single, large database. We are building a series of automatic controllers, which requires us to confront a set of research challenges, some of which we briefly discuss in this paper. We plan to demonstrate our work live at NLDB.
Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2013. 396-399 p.
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), ISSN 0302-9743 ; 7934
Computer and Information Science
Identifiers
URN: urn:nbn:se:umu:diva-83209
DOI: 10.1007/978-3-642-38824-8_47
ISBN: 978-3-642-38823-1 (Print)
ISBN: 978-3-642-38824-8 (Online)
OAI: oai:DiVA.org:umu-83209
DiVA: diva2:668728
18th International Conference on Applications of Natural Language to Information Systems, NLDB 2013, 19 June 2013 through 21 June 2013, Salford, UK