Non-Visual Mobile Text-Entry

Since the advent of Apple’s iPhone and its built-in accessibility features, blind people have gained increased access to mainstream mobile applications. However, the flat touchscreen surface poses challenges that are only partially solved. In particular, typing is still slow compared to what sighted people experience.

In this research project we are creating novel non-visual input methods for touch-based form factors, from tablets to smartwatches.

Project Details

Title: Non-Visual Mobile Text-Entry

Date: Jan 1, 2017

Authors: Hugo Nicolau, Tiago Guerreiro, Kyle Montague, João Guerreiro

Keywords: touchscreen, text-entry, blind, mobile


Related Publications


    • Hybrid-Brailler: Combining Physical and Gestural Interaction for Mobile Braille Input and Editing
    • Daniel Trindade, André Rodrigues, Tiago Guerreiro, and Hugo Nicolau. 2018. Hybrid-Brailler: Combining Physical and Gestural Interaction for Mobile Braille Input and Editing. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, ACM, 27:1–27:12. http://doi.org/10.1145/3173574.3173601
    • [ABSTRACT] [PDF] [LIBRARY] [VIDEO]
    • Braille input enables fast nonvisual entry speeds on mobile touchscreen devices. Yet, the lack of tactile cues commonly results in typing errors, which are hard to correct. We propose Hybrid-Brailler, an input solution that combines physical and gestural interaction to provide fast and accurate Braille input. We use the back of the device for physical chorded input while freeing the touchscreen for gestural interaction. Gestures are used in editing operations, such as caret movement, text selection, and clipboard control, enhancing the overall text entry experience. We conducted two user studies to assess both input and editing performance. Results show that Hybrid-Brailler supports entry rates as fast as its virtual counterpart, while significantly increasing input accuracy. Regarding editing performance, when compared with the mainstream technique, Hybrid-Brailler shows benefits of 21% in speed and increased editing accuracy. We finish with lessons learned for designing future nonvisual input and editing techniques.


    • Effect of Target Size on Non-visual Text-entry (Honorable Mention)
    • André Rodrigues, Hugo Nicolau, Kyle Montague, Luís Carriço, and Tiago Guerreiro. 2016. Effect of Target Size on Non-visual Text-entry. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, ACM, 47–52. http://doi.org/10.1145/2935334.2935376
    • [ABSTRACT] [PDF] [LIBRARY]
    • Touch-enabled devices have a growing variety of screen sizes; however, there is little knowledge on the effect of key size on non-visual text-entry performance. We conducted a user study with 12 blind participants to investigate how non-visual input performance varies with four QWERTY keyboard sizes (ranging from 15mm to 2.5mm). This paper presents an analysis of typing performance and touch behaviors, discussing its implications for future research. Our findings show that there is an upper limit to the benefits of larger target sizes between 10mm and 15mm. Input speed decreases from 4.5 to 2.4 words per minute (WPM) for target sizes below 10mm. The smallest size was deemed unusable by participants even though performance was on par with previous work.


    • Towards Inviscid Text-Entry for Blind People through Non-Visual Word Prediction Interfaces
    • Kyle Montague, João Guerreiro, Hugo Nicolau, Tiago Guerreiro, André Rodrigues, and Daniel Gonçalves. 2016. Towards Inviscid Text-Entry for Blind People through Non-Visual Word Prediction Interfaces. Inviscid Text-Entry and Beyond Workshop at Conference on Human Factors in Computing Systems (CHI).
    • [ABSTRACT] [PDF]
    • Word prediction can significantly improve text-entry rates on mobile touchscreen devices. However, these interactions are inherently visual and require constant scanning for new word predictions to actually take advantage of the suggestions. In this paper, we discuss the design space for non-visual word prediction interfaces and present Shout-out Suggestions, a novel interface to provide non-visual access to word predictions on existing mobile devices.
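As an illustration of the kind of prediction engine such an interface could speak aloud after each keystroke, here is a minimal sketch (an assumption for exposition, not the Shout-out Suggestions implementation): a frequency-ranked prefix completer over a toy corpus.

```python
# Illustrative sketch only: a minimal frequency-ranked prefix predictor of the
# kind a non-visual word-prediction interface could announce via speech.
# The corpus, function names, and ranking are assumptions, not the paper's code.
from collections import Counter

corpus = "the quick brown fox the quick fix the thin fog".split()
freq = Counter(corpus)  # word -> occurrence count

def predict(prefix, k=3):
    """Return the top-k completions for a prefix, ranked by corpus frequency."""
    matches = [w for w in freq if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -freq[w])[:k]
```

A real system would rank over a large language model or n-gram corpus; the non-visual challenge the paper addresses is when and how to surface these suggestions audibly without interrupting typing.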


    • Typing Performance of Blind Users: An Analysis of Touch Behaviors, Learning Effect, and In-Situ Usage
    • Hugo Nicolau, Kyle Montague, Tiago Guerreiro, André Rodrigues, and Vicki L. Hanson. 2015. Typing Performance of Blind Users: An Analysis of Touch Behaviors, Learning Effect, and In-Situ Usage. Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, ACM, 273–280. http://doi.org/10.1145/2700648.2809861
    • [ABSTRACT] [PDF] [LIBRARY]
    • Non-visual text-entry for people with visual impairments has focused mostly on the comparison of input techniques reporting on performance measures, such as accuracy and speed. While researchers have been able to establish that non-visual input is slow and error prone, there is little understanding of how to improve it. To develop a richer characterization of typing performance, we conducted a longitudinal study with five novice blind users. For eight weeks, we collected in-situ usage data and conducted weekly laboratory assessment sessions. This paper presents a thorough analysis of typing performance that goes beyond traditional aggregated measures of text-entry and reports on character-level errors and touch measures. Our findings show that users improve over time, albeit at a slow rate (0.3 WPM per week). Substitutions are the most common type of error and have a significant impact on entry rates. In addition to text input data, we analyzed touch behaviors, looking at touch contact points, exploration movements, and lift positions. We provide insights on why and how performance improvements and errors occur. Finally, we derive some implications that should inform the design of future virtual keyboards for non-visual input.


    • TabLETS Get Physical: Non-Visual Text Entry on Tablet Devices
    • João Guerreiro, André Rodrigues, Kyle Montague, Tiago Guerreiro, Hugo Nicolau, and Daniel Gonçalves. 2015. TabLETS Get Physical: Non-Visual Text Entry on Tablet Devices. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM, 39–42. http://doi.org/10.1145/2702123.2702373
    • [ABSTRACT] [PDF] [LIBRARY]
    • Tablet devices can display full-size QWERTY keyboards similar to the physical ones. Yet, the lack of tactile feedback and the inability to rest the fingers on the home keys result in a highly demanding and slow exploration task for blind users. We present SpatialTouch, an input system that leverages previous experience with physical QWERTY keyboards, by supporting two-handed interaction through multitouch exploration and spatial, simultaneous audio feedback. We conducted a user study, with 30 novice touchscreen participants entering text under one of two conditions: (1) SpatialTouch or (2) the mainstream accessibility method Explore by Touch. We show that SpatialTouch enables blind users to leverage previous experience, as they make better use of the home keys and perform more efficient exploration paths. Results suggest that although SpatialTouch did not result in faster input rates overall, it was indeed able to leverage previous QWERTY experience, in contrast to Explore by Touch.


    • B#: Chord-based Correction for Multitouch Braille Input
    • Hugo Nicolau, Kyle Montague, Tiago Guerreiro, João Guerreiro, and Vicki L. Hanson. 2014. B#: Chord-based Correction for Multitouch Braille Input. Proceedings of the 32Nd Annual ACM Conference on Human Factors in Computing Systems, ACM, 1705–1708. http://doi.org/10.1145/2556288.2557269
    • [ABSTRACT] [PDF] [LIBRARY]
    • Braille has paved its way into mobile touchscreen devices, providing faster text input for blind people. This advantage comes at the cost of accuracy, as chord typing over a flat surface has proven to be highly error prone. A misplaced finger on the screen translates into a different or unrecognized character. However, the chord itself gathers information that can be leveraged to improve input performance. We present B#, a novel correction system for multitouch Braille input that uses chords as the atomic unit of information rather than characters. Experimental results on data collected from 11 blind people revealed that B# is effective in correcting errors at the character level, thus providing opportunities for instant correction of unrecognized chords; and at the word level, where it outperforms a popular spellchecker by providing correct suggestions for 72% of incorrect words (against 38%). We finish with implications for designing chord-based correction systems and avenues for future work.
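The core idea of chord-level correction can be sketched in a few lines. This is a hedged illustration under assumed names and a tiny chord table, not the B# implementation: a Braille chord is modeled as a set of dot positions (1–6), and an unrecognized chord is mapped to the valid chord that differs in the fewest dots.

```python
# Illustrative sketch, not the authors' B# code: treat a Braille chord as a
# set of dot positions and correct a misrecognized chord by picking the
# closest valid chord. The letter table covers only a few letters (assumption).
BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5},
    "e": {1, 5}, "k": {1, 3}, "l": {1, 2, 3},
}

def chord_distance(c1, c2):
    """Number of dots by which two chords differ (symmetric difference)."""
    return len(c1 ^ c2)

def correct_chord(chord):
    """Return the letter whose canonical chord is closest to the input chord."""
    return min(BRAILLE, key=lambda letter: chord_distance(BRAILLE[letter], chord))
```

Working at the chord level like this preserves information (which dots were pressed, and where) that is lost once the input is collapsed to a character, which is what lets a system such as B# also re-rank candidate words.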


    • Mobile Text-Entry: The Unattainable Ultimate Method
    • Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2012. Mobile Text-Entry: The Unattainable Ultimate Method. Frontiers in Accessibility for Pervasive Computing Workshop at Pervasive.
    • [ABSTRACT] [PDF]
    • There is no such thing as an ultimate text-entry method. People are diverse, and mobile touch typing takes place in many different places and scenarios. This translates to a wide and dynamic diversity of abilities. Conversely, different methods present different demands and are adequate for different people and situations. In this paper we focus our attention on blind and situationally blind people; how abilities differ between people and situations, and how we can cope with those differences either by varying or adapting methods. Our research goal is to identify the human abilities that influence mobile text-entry and match them with methods (and underlying demands) in a comprehensive and extensible design space.


    • Blind People and Mobile Touch-based Text-entry: Acknowledging the Need for Different Flavors (Best Student Paper Award)
    • João Oliveira, Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2011. Blind People and Mobile Touch-based Text-entry: Acknowledging the Need for Different Flavors. The Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility, ACM, 179–186. http://doi.org/10.1145/2049536.2049569
    • [ABSTRACT] [PDF] [LIBRARY]
    • The emergence of touch-based mobile devices brought fresh and exciting possibilities. These came at the cost of a considerable number of novel challenges, which are particularly apparent for the blind population, as these devices lack tactile cues and are extremely visually demanding. Existing solutions resort to assistive screen-reading software to compensate for the lack of sight, yet not all information reaches the blind user. Good spatial ability is still required to form a notion of the device and its interface, along with the need to memorize the buttons' positions on screen. These abilities, like many other individual attributes such as age, age of blindness onset, or tactile sensibility, are often forgotten, as the blind population is presented with the same methods regardless of capabilities and needs. Herein, we present a study with 13 blind people consisting of a touchscreen text-entry task with four different methods. Results show that different capability levels have a significant impact on performance and that this impact is related to the different methods' demands. These variances acknowledge the need to account for individual characteristics and to leave room for difference, towards inclusive design.


    • BrailleType: Unleashing Braille over Touch Screen Mobile Phones (People’s Choice Award)
    • João Oliveira, Tiago Guerreiro, Hugo Nicolau, Joaquim Jorge, and Daniel Gonçalves. 2011. BrailleType: Unleashing Braille over Touch Screen Mobile Phones. In Human-Computer Interaction – INTERACT 2011: 13th IFIP TC 13 International Conference, Lisbon, Portugal, September 5-9, 2011, Proceedings, Part I, Pedro Campos, Nicholas Graham, Joaquim Jorge, Nuno Nunes, Philippe Palanque and Marco Winckler (eds.). Springer Berlin Heidelberg, Berlin, Heidelberg, 100–107. http://doi.org/10.1007/978-3-642-23774-4_10
    • [ABSTRACT] [PDF] [LIBRARY]
    • The emergence of touch screen devices poses a new set of challenges regarding text-entry. These are more obvious when considering blind people, as touch screens lack the tactile feedback they are used to when interacting with devices. The available solutions to enable non-visual text-entry resort to a wide set of targets, complex interaction techniques, or unfamiliar layouts. We propose BrailleType, a text-entry method based on the Braille alphabet. BrailleType avoids multi-touch gestures in favor of a simpler single-finger interaction, featuring few and large targets. We performed a user study with fifteen blind subjects to assess this method’s performance against Apple’s VoiceOver approach. BrailleType, although slower, was significantly easier and less error prone. Results suggest that the target users would have a smoother adaptation to BrailleType than to other more complex methods.


    • Proficient Blind Users and Mobile Text-entry
    • Hugo Nicolau, Tiago Guerreiro, Joaquim Jorge, and Daniel Gonçalves. 2010. Proficient Blind Users and Mobile Text-entry. Proceedings of the 28th Annual European Conference on Cognitive Ergonomics, ACM, 19–22. http://doi.org/10.1145/1962300.1962307
    • [PDF] [LIBRARY]

    • From tapping to touching: Making touch screens accessible to blind users
    • Tiago Guerreiro, Hugo Nicolau, and Joaquim A Jorge. 2008. From tapping to touching: Making touch screens accessible to blind users. IEEE Multimedia.
    • [ABSTRACT] [PDF]
    • Mobile phones play an important role in modern society. Their applications extend beyond basic communications, ranging from productivity to leisure. However, most tasks beyond making a call require significant visual skills. While screen-reading applications make text more accessible, most interaction, such as menu navigation and especially text entry, requires hand–eye coordination, making it difficult for blind users to interact with mobile devices and execute tasks. Although solutions exist for people with special needs, these are expensive and cumbersome, and software approaches require adaptations that remain ineffective, difficult to learn, and error prone. Recently, touch-screen equipped mobile phones, such as the iPhone, have become popular. The ability to directly touch and manipulate data on the screen without using any intermediary devices has a strong appeal, but the possibilities for blind users are at best limited. In this article, we describe NavTouch, a new, gesture-based, text-entry method developed to aid vision-impaired users with mobile devices that have touch screens. User evaluations show it is both easy to learn and more effective than previous approaches.


    • Mobile Text-entry Models for People with Disabilities
    • Tiago Guerreiro, Paulo Lagoá, Hugo Nicolau, Pedro Santana, and Joaquim Jorge. 2008. Mobile Text-entry Models for People with Disabilities. Proceedings of the 15th European Conference on Cognitive Ergonomics: The Ergonomics of Cool Interaction, ACM, 39:1–39:4. http://doi.org/10.1145/1473018.1473067
    • [PDF] [LIBRARY]

    • Acessibilidade Móvel: Soluções para Deficientes Visuais (in Portuguese)
    • Paulo Lagoá, Hugo Nicolau, Tiago Guerreiro, Daniel Gonçalves, and Joaquim Jorge. 2008. Acessibilidade Móvel: Soluções para Deficientes Visuais. Proceedings of the 3rd National Conference on Human-Computer Interaction (Interacção).
    • [ABSTRACT] [PDF]
    • Mobile devices play an important role in modern society. Their functionality goes beyond simple communication, now encompassing a wide range of features, whether for leisure or professional use. Interaction with these devices is visually demanding, making it difficult or impossible for blind users to control their own device. In particular, text entry, a task common to many applications, is hard to perform, since it depends on visual feedback from both the keyboard and the screen. Thus, through new text-entry systems that exploit the capabilities of blind users, the system presented in this article offers them the ability to operate different types of devices. Beyond common mobile phones, we also present an interaction method for devices with touch screens. Studies with blind users validated the proposed approaches for the various devices, which surpass traditional methods in performance, learnability, and satisfaction of the target users.

About

Hugo is an Associate Professor in the Computer Science and Engineering Department (DEI) of Instituto Superior Técnico, University of Lisbon, Portugal. He is also Vice-President of and a researcher at the Interactive Technologies Institute / LARSyS.
My Office

Instituto Superior Técnico,
Computer Science and Engineering Department,
Av. Rovisco Pais 1
1049-001, Lisbon
Portugal