DOI: 10.1145/2663204.2663243

Non-Visual Navigation Using Combined Audio Music and Haptic Cues


ABSTRACT

While a great deal of work has been done exploring non-visual navigation interfaces using audio and haptic cues, little is known about the combination of the two. We investigate combining different state-of-the-art interfaces for communicating direction and distance information using vibrotactile and audio music cues, limiting ourselves to interfaces that are possible with current off-the-shelf smartphones. We use experimental logs, subjective task load questionnaires, and user comments to examine how users' perceived performance, objective performance, and acceptance of the system varied across the different combinations. Users' perceived performance did not differ much between the unimodal and multimodal interfaces, but a few users commented that the multimodal interfaces added some cognitive load. Objective performance showed that some multimodal combinations resulted in significantly less direction or distance error than some of the unimodal ones, especially the purely haptic interface. Based on these findings we propose a few design considerations for multimodal haptic/audio navigation interfaces.
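To make the kind of cue encoding discussed above concrete, the minimal sketch below shows one plausible mapping from navigation state (relative bearing and distance to the next waypoint) to an audio pan value for music playback and a vibration pulse interval. This is purely illustrative: the function, the parameter names, and the specific mapping are assumptions of this sketch, not the cue designs evaluated in the paper.

```python
import math

def direction_distance_to_cues(user_heading_deg, bearing_to_target_deg,
                               distance_m, max_distance_m=50.0):
    """Hypothetical mapping from navigation state to audio/haptic cue parameters.

    Returns a stereo pan in [-1.0, 1.0] (audio music cue: pan the track toward
    the target) and a vibration pulse interval in milliseconds (haptic cue:
    pulse faster as the target gets closer).
    """
    # Relative bearing in (-180, 180]: negative means the target is to the left.
    rel = (bearing_to_target_deg - user_heading_deg + 180.0) % 360.0 - 180.0

    # Audio cue: pan the music left/right in proportion to the relative bearing,
    # saturating at +/-90 degrees.
    pan = max(-1.0, min(1.0, rel / 90.0))

    # Haptic cue: shorter pulse interval (faster pulsing) when the target is near.
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    pulse_interval_ms = int(1200 - 1000 * closeness)  # 1200 ms when far, 200 ms when near

    return pan, pulse_interval_ms

if __name__ == "__main__":
    # Target 45 degrees to the user's right, 10 m away.
    print(direction_distance_to_cues(user_heading_deg=0.0,
                                     bearing_to_target_deg=45.0,
                                     distance_m=10.0))
```

On an off-the-shelf smartphone, a pan value like this could drive the stereo balance of a music track and the pulse interval could drive a repeating vibration pattern, which keeps the audio and haptic channels available either separately or in combination.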



    • Published in

      ICMI '14: Proceedings of the 16th International Conference on Multimodal Interaction
      November 2014
      558 pages
      ISBN: 9781450328852
      DOI: 10.1145/2663204

      Copyright © 2014 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 12 November 2014


      Qualifiers

      • research-article

      Acceptance Rates

      ICMI '14 Paper Acceptance Rate: 51 of 127 submissions, 40%. Overall Acceptance Rate: 453 of 1,080 submissions, 42%.
