How Do Blind People Use Digital Technology?

For many blind people, computers have become an integral part of everyday life. I'm not a digital native, but without a computer I wouldn't have completed my diploma, nor could I do my current job or write this article. It's true that blind people wrote books and did scientific work before computers existed. However, such work is much more difficult without computers and other digital tools. A blind person can write text on a typewriter, but he cannot check the text for typos himself. And the amount of content a student has to process today is hard for a blind person to handle without technical aids.

This article is a little technical, but I hope it's still understandable.

The Personal Computer

Let's pretend you've never seen a computer before. You are standing in front of a box made of metal and plastic with a display. A bunch of cables stick out of the device. You turn the thing on and numerous colorful symbols and texts appear. At first you can't do anything with it. Then someone puts a mouse in your hand. You start clicking around; you learn how to start programs, what they do, and how to work with them.

Things are not fundamentally different for blind people. However, they don't see the icons, they don't see the mouse cursor, and they don't know where to start. The task is to translate an interface created for sighted people into something blind people can understand.

How a Screen Reader Works

Blind people use assistive devices to operate their computers. The interface to the computer is the screen reader. The screen reader reads the screen content and outputs the information as speech or as Braille on a special Braille display. The terms screen reader and speech output are often used interchangeably, but they refer to programs with different tasks.

The screen reader is the interface between the operating system's mostly graphical user interface and the output medium, either speech or Braille. Other output media would also be conceivable: much information could be output as an acoustic signal or, on smartphones, as a vibration. Speech output, by contrast, is the program that assembles phonemes according to certain rules and outputs them as speech. Phonemes are the sounds that make up language.

The screen reader depends on what information the operating system provides. The appearance of an element is less important than its function: a scissors or trash-can icon may speak for itself as a graphic, but "cut" and "delete" are easier to understand. The so-called accessibility API is crucial for passing this information to the assistive software.

Operating systems are not "naturally" accessible. The information must be stored in the system in such a way that the assistive software can read it. The standard by which this information is created is the accessibility interface. All common operating systems now have such interfaces. The systems still have weaknesses of varying severity, but in principle they can be considered usable by disabled people.

However, the information must not only be stored in the operating system, but also in the individual programs. For blind people, every newly installed program can bring surprises in terms of accessibility. All major operating system manufacturers provide instructions on how to make programs accessible.
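
What "storing" this information looks like is easiest to show on the web, where the accessibility interface takes the form of so-called ARIA attributes; desktop interfaces such as Microsoft's UI Automation or Linux's AT-SPI work on the same principle. Here is a minimal sketch in TypeScript; the button and switch are invented for illustration:

```typescript
// Exposing role, name, and state through the web's accessibility
// interface (ARIA). The screen reader queries these semantics instead
// of interpreting pixels.

const deleteButton = document.createElement("button");
deleteButton.textContent = "🗑"; // the sighted user sees a trash-can icon
// The accessible name tells the screen reader what to announce instead:
deleteButton.setAttribute("aria-label", "Delete");
document.body.appendChild(deleteButton);

// A custom element has no built-in semantics, so role and state must be
// supplied explicitly, otherwise the screen reader stays silent:
const toggle = document.createElement("div");
toggle.setAttribute("role", "switch");        // what kind of element it is
toggle.setAttribute("aria-checked", "false"); // its current state
toggle.setAttribute("tabindex", "0");         // reachable by keyboard
toggle.textContent = "Notifications";
document.body.appendChild(toggle);
```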

Okay, but what does that mean exactly? Imagine you are writing a document in Word. You need to:

  • read the text,
  • write or change text yourself, and
  • access and use the menus.

That sounds obvious, and for you it is. For blind people it is not at all. If the accessibility interface information is missing, the screen reader does not know what to do with the program. The blind person then does not know whether he can read text, enter a command, or retrieve information.

So let's go back to Word. When a document is freshly opened, the screen reader receives the information that the focus is in a multi-line text field. The user can now write, change, or read text. Say he wants to change the font of a section of text. He highlights the text and presses the Alt key to bring up a menu; blind people control the computer using the keyboard alone. The screen reader now receives the information that the focus is in a menu. The user moves through the menu with the arrow keys, and the screen reader announces which element currently has focus and whether it can be invoked. At some point he lands on the font formatting menu and opens it. There he finds several so-called selection fields. He tabs into the font field and uses the arrow keys to select the desired font. Then he continues tabbing until he reaches a button, which is announced to him as a button labeled "OK". He presses Enter, and the formatting is done.
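
If you're curious what the pattern behind this walkthrough looks like in code, here is a sketch of the announcement loop a screen reader runs on every focus change. It uses the browser's focus events and the Web Speech API; real screen readers query the operating system's accessibility API instead, and their exact wording differs:

```typescript
// On every focus change: query the focused element's role, name, and
// state, compose an announcement, and speak it.

function describe(el: HTMLElement): string {
  const role = el.getAttribute("role") ?? el.tagName.toLowerCase();
  const name =
    el.getAttribute("aria-label") ?? el.textContent?.trim() ?? "unlabeled";
  const checked = el.getAttribute("aria-checked");
  const state =
    checked === null ? "" : checked === "true" ? "checked" : "not checked";
  return `${name}, ${role} ${state}`.trim();
}

// focusin bubbles, so one listener on the document covers every element.
document.addEventListener("focusin", (event) => {
  const announcement = describe(event.target as HTMLElement);
  speechSynthesis.speak(new SpeechSynthesisUtterance(announcement));
});
```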

I have described this process in detail here to give you a clear example. In reality, Word and similar programs have so-called keyboard shortcuts that allow experienced users to work much more quickly. An experienced blind user can in some cases be faster than a mouse user.

It should be clear by now that the screen reader doesn't just read the text on the screen. In addition to the on-screen text, it also outputs each element's type and its "state" or options. The blind person knows from experience what he can do with the element. A checkbox, for example, may be checked, unchecked, or partially checked. With this information, the user knows he can either change the box or leave it as it is.
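
The partially checked state is a good example of how the screen reader reports semantics rather than pixels. A small sketch; the "select all" checkbox is invented for illustration:

```typescript
// The three states of a checkbox as exposed to a screen reader. Native
// HTML checkboxes use the real `indeterminate` property for the third
// state; custom widgets express the same via aria-checked.

const selectAll = document.createElement("input");
selectAll.type = "checkbox";
selectAll.checked = false;      // announced as "not checked"
selectAll.indeterminate = true; // announced as "partially checked" or
                                // "mixed": some, but not all, items in
                                // an associated list are selected

// Equivalent state for a custom-built checkbox:
// aria-checked = "true" | "false" | "mixed"
```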

Every element on the screen has a task. Some are purely decorative, many are informational, and many are interactive, that is, they change something. For sighted people, this task becomes apparent through the visual design or through experience. In the course of your computer life you have learned that those bars at the edge of a program are used to move the visible section of the program. For blind people, what you see as a bar is not perceptible at first; this element has to be translated for them. So the sliding thingy becomes a "scroll bar". Just as you know from experience that you can use this bar to move the program section, the blind person knows the same when he hears "scroll bar".

A program is thus an arrangement of decorative, informational, and interactive elements. Menus are reached via the keyboard and their entries activated or deactivated. The screen reader can read and describe information, for example the formatting of a text or a table cell.

Blind people generally do not know how a program is structured graphically. Although they often know the complex menus of Word and the like almost by heart, they have no idea what the program looks like to sighted people. Whether this matters to them is debatable. However, it often means that training is much more difficult for blind people than for sighted people. There are numerous aids for sighted people, such as self-explanatory icons that are logically arranged in toolbars. For blind people there are often only menus and shortcuts.

The blind person works a lot with keyboard shortcuts, which lets him work almost as quickly as a sighted person. It's often even faster: while a sighted person has to select text with the mouse and click the "bold" button, the blind person simply selects the text with the Shift key, presses the keyboard shortcut, and is done. I always have to laugh when I hear sighted people moving from input field to input field with the mouse. Blind people do this much more elegantly; they don't have to take their hands off the keyboard to fill out a web form.

In the next sections we want to take a closer look at the output formats.

Speech output

As already mentioned, speech output renders information as speech. It receives that information from the screen reader, which can report a great deal about the user interface: not only which elements exist and what state they are in. A freshly installed screen reader also tells you what you can do with each element; this is intended to help beginners find their way around more quickly. Imagine reading a text out loud: hardly anyone would think of reading out commas, hyphens, and quotation marks. Blind people's screen readers, however, do this by default. That can make sense, since programmers, mathematicians, and proofreaders need every character announced. But it can also be very annoying when every punctuation mark is read out, so thankfully the output can be configured very finely.
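
As a sketch of what such a configuration might look like internally, here is a simple punctuation filter with three verbosity levels; the level names and character lists are illustrative, not taken from any particular screen reader:

```typescript
type Verbosity = "none" | "some" | "all";

// Which punctuation marks are spoken aloud at each verbosity level.
const spokenPunctuation: Record<Verbosity, RegExp> = {
  none: /(?!)/g,          // a regex that never matches: speak nothing
  some: /[?!]/g,          // only sentence-level marks
  all: /[.,;:?!"'()-]/g,  // every mark, e.g. for proofreading or coding
};

const names: Record<string, string> = {
  ".": "period", ",": "comma", ";": "semicolon", ":": "colon",
  "?": "question mark", "!": "exclamation mark", '"': "quote",
  "'": "apostrophe", "(": "left paren", ")": "right paren", "-": "dash",
};

// Replace each spoken mark with its name so the synthesizer reads it out.
function expandPunctuation(text: string, level: Verbosity): string {
  return text.replace(spokenPunctuation[level], (m) => ` ${names[m]} `);
}

// expandPunctuation("Wait, really?", "all")
// -> "Wait comma  really question mark "
```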

Many blind people prefer speech output over Braille as an output medium because it allows a much higher reading speed. Many set the speech rate 50 to 100 percent faster and get headaches when the program reads at normal speaking speed. Normal reading-aloud speed is usually around 150 words per minute; some blind users speed the screen reader up to 400 words per minute.
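
With the browser's Web Speech API you can try this yourself; the rate value below is illustrative:

```typescript
// Trying a higher speech rate. rate = 1 is the voice's normal speed;
// experienced screen reader users run far above it.

const utterance = new SpeechSynthesisUtterance(
  "The quick brown fox jumps over the lazy dog."
);
utterance.rate = 2.5; // allowed range is 0.1 to 10; support varies by voice
speechSynthesis.speak(utterance);
```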

During speech output, phonemes, i.e. parts of words, are put together at high speed. Today's speech synthesizers pay attention to the subtleties of speech melody: they raise the voice at a comma and lower it at a period.

There are essentially two forms of speech output. Synthetic voices used to be the first choice: the phonemes were generated synthetically, which kept the programs compact and the screen reader fast. It wasn't that long ago that computers weren't very powerful; more natural-sounding voices, based on recordings of human speakers, have only become practical in the last few years. The speech output most blind people use therefore sounds anything but natural.

Incidentally, the drivers of innovation in speech output are not blind people, who have long since gotten used to the artificial voices. Rather, demand from manufacturers of smartphones and telephone dialogue systems drives the research. Automated telephone services may be unpopular, but if they used the robotic-sounding voices blind people work with, no one would use them at all. Today, voice assistants are part of everyday life for many sighted smartphone users.

Braille on the computer

Braille is, next to speech output, the second important output medium. It is displayed on Braille displays, called "Braille lines" (Braillezeilen) in German. The German name fits well: these are devices that output information one line at a time, with the Braille formed by movable pins that are raised and lowered at lightning speed as needed. More information about Braille can be found in the chapter "Blind people and media". Today there are very different Braille displays: small ones for smartphones show eight to twelve characters, large ones up to 80 characters. That sounds like a lot, but it isn't. Braille needs more space than the same text in print letters, the screen reader outputs significantly more information than is visible on the screen, and a single line of text in Word alone can contain more than 100 characters.
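
The consequence can be sketched in a few lines: the display shows one fixed-width window of the current line, and the user "pans" through it piece by piece. The 40-cell width below is just a common size, not a standard:

```typescript
// A Braille display shows one fixed-width window of the current line;
// the user pans left and right to read beyond it.

function brailleWindows(line: string, cells: number = 40): string[] {
  const windows: string[] = [];
  for (let i = 0; i < line.length; i += cells) {
    windows.push(line.slice(i, i + cells));
  }
  return windows;
}

// A 100-character line needs three pans on a 40-cell display:
// brailleWindows("x".repeat(100), 40).length === 3
```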

Even a small Braille display with ten characters costs around 1,000 euros, while a large one with 80 characters costs 10,000 euros. The future of Braille will therefore depend crucially on whether these devices can be made significantly cheaper. Braille has lost importance in the digital age.

Haptic and acoustic feedback

Since the development of smartphones, haptic and acoustic feedback have also played a greater role. Since 2009, Apple has been selling its smartphones with a built-in screen reader. Android followed suit with version 4.0 and offers similar functions.

You need feedback to find out whether an interaction was successful. With smartphones, you can't always be sure whether you tapped the right symbol or whether the command was processed correctly.

That's why there is feedback for blind people in the form of sounds or vibrations. A swipe gesture or the activation of an element is accompanied by an appropriate sound; there is also an error sound for commands the device did not accept or understand. With a well-thought-out sound design, these tones are just as meaningful as graphical feedback. Acoustic feedback also plays an important role for blind people on desktop computers: a clicking sound when switching between folders, for example, or a warning tone when you try to type but no input field has focus.

Vibrations are not as diverse as sound, but they can also be useful. For example, on Android there is a vibration when you tap an icon. This can be useful for blind people who are hard of hearing or in noisy environments.
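
As a sketch of how such feedback can be produced, here is a short confirmation tone using the Web Audio API and a brief pulse using the Vibration API; the frequencies and durations are invented, and real screen readers ship carefully designed sound sets:

```typescript
// Play a short "earcon": a sine tone of a given pitch and length.
function playEarcon(frequency: number, durationMs: number): void {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.frequency.value = frequency; // e.g. a higher pitch for "success"
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationMs / 1000);
}

playEarcon(880, 80);    // short high beep: element activated
// playEarcon(220, 200); // longer low beep: command not accepted

// Haptic counterpart on devices that support it:
if ("vibrate" in navigator) {
  navigator.vibrate(50); // 50 ms pulse confirming a tap
}
```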

Smartphones and tablets

But how can blind people use touch surfaces at all? It's not that difficult. In fact, touchscreens offer blind people the opportunity to grasp the visual structure of websites, something they cannot do on a classic computer. For blind people, a website is a linear arrangement of elements: there is only "before" and "after", only "already heard" and "not yet heard". And of course they don't know what they haven't heard yet.

With a touchscreen, the blind person swipes a finger across the surface. When the finger passes over an element such as a button, that element is announced. This is called "explore by touch". When the user finds a clickable element, such as a link, the screen reader switches from exploration mode to command mode: he first hears that he has found the right symbol, and a double tap then activates it. In principle, blind people use touch gestures very similar to those of sighted people: where the sighted person swipes two fingers across a web page to scroll, the blind person uses three. Like other programs, the screen reader on the smartphone is operated using touch gestures.
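
Reduced to its core, such a gesture system is a mapping from recognized gestures to screen reader actions. A sketch; the gesture names follow the description above, and a real implementation would also track touch positions, timing, and finger count:

```typescript
// Gesture-to-action table of a VoiceOver-style touchscreen screen reader.
type Gesture =
  | "touch-explore"       // drag one finger: hear what is under it
  | "swipe-right"         // move to the next element
  | "swipe-left"          // move to the previous element
  | "double-tap"          // activate the focused element
  | "three-finger-swipe"; // scroll the page

const actions: Record<Gesture, string> = {
  "touch-explore": "announce element under finger",
  "swipe-right": "focus next element and announce it",
  "swipe-left": "focus previous element and announce it",
  "double-tap": "activate focused element",
  "three-finger-swipe": "scroll and announce new position",
};
```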

That sounds more complicated than it is. Once you get used to it, it's a breeze. Only writing a long text isn't particularly pleasant on a touchscreen, but that's no different for many sighted people.

Blind people on the World Wide Web

The Internet is one of the most important areas of computer use and is generally easy to use with screen readers. The screen reader works not from the visual appearance but from the structure of a website. Elements that the sighted user sees side by side are arranged linearly for the screen reader user.

While the sighted person can distinguish important elements such as navigation and text from decorative elements such as banners or advertising at a glance, the blind person first has to go through all the elements once in order to find their way around the website. The content of images, graphics, and animations remains invisible to blind people.

Like sighted people, blind people have their favorite websites that they know inside out and can therefore easily navigate. Websites that are rarely visited or have a lot of graphic or interactive content are difficult to use.

To the sighted viewer, a website appears as information segments placed next to and above one another: the banner is above the content, the content is next to the navigation, and so on. For the blind user, however, there is only linearity, that is, one element comes before or after another. This also applies to small touchscreens like the iPhone's. The blind person taps somewhere on the screen and lands on a specific element, for example a link. He then swipes to the right and reaches the next element, which might be a paragraph of text.
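
This linearization can be sketched as a simple depth-first walk over the page structure; real screen readers walk the browser's accessibility tree rather than the raw DOM, but the principle is the same:

```typescript
// Turn a two-dimensional page into a linear reading order by visiting
// elements depth-first and collecting the ones a reader would hear.

function linearize(root: Node): string[] {
  const items: string[] = [];
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_ELEMENT);
  for (let node = walker.nextNode(); node; node = walker.nextNode()) {
    const el = node as HTMLElement;
    if (["H1", "H2", "P", "A", "BUTTON"].includes(el.tagName)) {
      items.push(`${el.tagName.toLowerCase()}: ${el.textContent?.trim()}`);
    }
  }
  return items; // banner, navigation, heading, paragraph ... one after another
}

// linearize(document.body) on a simple page might yield:
// ["a: Home", "a: About", "h1: Welcome", "p: First paragraph", "a: Read more"]
```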

On well-structured websites, blind people can use certain commands to jump directly to specific areas such as the main content or a form. For a trained blind person, even complex web applications such as YouTube are no longer a big challenge. Unfortunately, many websites and applications are poorly programmed and difficult for blind people to use.
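
These jump commands work because well-structured pages mark their regions semantically, for example with <main>, <nav>, or <form> elements. A sketch of the jump itself:

```typescript
// Jump to a landmark region: find it, make it focusable, move focus.
// The screen reader then announces the new position.

function jumpToLandmark(selector: "main" | "nav" | "form"): void {
  const region = document.querySelector<HTMLElement>(selector);
  if (region) {
    region.tabIndex = -1; // make the region programmatically focusable
    region.focus();
  }
}

// jumpToLandmark("main"); // skip past banner and navigation to the content
```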

Occasionally I am asked whether we should develop special websites or apps for blind or otherwise disabled people. That sounds sensible at first. The focus of today's user interface development is on the visual level; accessibility and feedback via sound are often neglected, so blind people often have difficulty using such systems. However, special solutions for specific groups always have disadvantages. They tempt developers to neglect the accessibility of the mainstream versions. Special solutions are more expensive. They are developed more slowly or discontinued altogether if the developer goes bankrupt or is taken over by a competitor.

But it is also simply that disabled people must have the opportunity to exchange ideas with non-disabled people, and that becomes much harder if they use different systems. The charm of the iPhone lies not least in the fact that you can easily discuss apps with a sighted person. If I have a problem with my device, I can simply have a sighted person look at it; with a special device I would have to ask an expert, who would of course want money for the help.
