ABSTRACT
While a great deal of work has explored non-visual navigation interfaces using audio and haptic cues separately, little is known about combining the two. We investigate combinations of state-of-the-art interfaces for communicating direction and distance information through vibrotactile and audio music cues, restricting ourselves to interfaces realizable on current off-the-shelf smartphones. We use experimental logs, subjective task-load questionnaires, and user comments to examine how users' perceived performance, objective performance, and acceptance of the system varied across combinations. Perceived performance did not differ much between the unimodal and multimodal interfaces, though a few users commented that the multimodal interfaces added some cognitive load. Objectively, some multimodal combinations produced significantly less direction or distance error than some of the unimodal interfaces, especially the purely haptic one. Based on these findings we propose design considerations for multimodal haptic/audio navigation interfaces.
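To make the kind of cue mapping discussed here concrete, below is a minimal sketch (ours, not taken from the paper) of a common vibrotactile direction-encoding scheme: the signed deviation between the user's heading and the bearing to the target is mapped to a vibration pulse interval, so pulses come faster as the user turns toward the goal. All function names and parameter values are hypothetical.

```python
def bearing_error(heading_deg, target_bearing_deg):
    """Signed smallest angular difference in degrees, in [-180, 180)."""
    return (target_bearing_deg - heading_deg + 180) % 360 - 180

def pulse_interval_ms(heading_deg, target_bearing_deg,
                      min_ms=200, max_ms=1500):
    """Map |bearing error| (0-180 deg) linearly onto a vibration pulse
    interval: short intervals (rapid pulses) when on course, long
    intervals when facing away from the target."""
    err = abs(bearing_error(heading_deg, target_bearing_deg))
    return min_ms + (max_ms - min_ms) * err / 180.0
```

A smartphone implementation would feed compass/GPS readings into such a mapping and drive the vibration motor accordingly; distance could be encoded analogously, e.g. by pulse amplitude or a separate audio parameter.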
Non-Visual Navigation Using Combined Audio Music and Haptic Cues