Some top tips for your website

16/01/2018


The uptake of mobile websites and apps means that we are doing more than ever before on our smart devices. Whether we are reading, studying, shopping or using social media, gone are the days when we need to sit at a computer. What follows are some top tips to make your website and app accessible.


Include a clear and logical heading structure

Including a clear heading structure means that everyone can identify the different sections of the page. Headings are of particular benefit to screen reader users, who can use keyboard shortcuts to jump between them. If no heading structure is available, it can take longer to find information on a web page or in an app.
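As an illustrative sketch (the headings and wording here are invented for the example), a logical heading structure starts at one main heading and nests levels without skipping:

```html
<!-- One main heading per page, then nested sections -->
<h1>DAC Blog</h1>

<h2>Our services</h2>
<h3>Screen reader testing</h3>
<h3>User testing</h3>

<h2>Contact us</h2>
<!-- Avoid jumping straight from <h1> to <h3>: skipped levels
     make the page outline harder to follow with shortcuts -->
```

Screen reader users can then move between sections with a single keystroke rather than reading the page line by line.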


Include clear link text and focus highlighting

Clear link text enables users to identify what will happen when a link is selected. Links such as ‘click here’ or ‘read more’ are not clear on their own. Consider ‘click here to find out more about DAC’ or ‘read more about our services’ instead. Including clear focus highlighting as users move through a series of links also helps low-vision and keyboard-only users keep track of their position on the page.
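A minimal sketch of the difference (the URL is invented for the example), with a simple focus highlight for keyboard users:

```html
<!-- Unclear: announced out of context, this tells the user nothing -->
<a href="/services">Read more</a>

<!-- Clear: the purpose is understood even from a list of links -->
<a href="/services">Read more about our services</a>

<style>
  /* A visible focus highlight as users tab through the links */
  a:focus {
    outline: 3px solid #005a9c;
  }
</style>
```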


Avoid relying on colour alone to convey information

If a form is submitted with mistakes, a clear error-handling process should be used. If the fields needing correction are only marked in red, for example, screen reader users and colour-blind users will not know what is required. Providing a text error message, and moving focus to it when the form is submitted, is a good solution.
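A sketch of one approach (the field name and message wording are invented for the example): describe the error in text, associate it with the field, and make the message focusable so script can move focus to it on submission:

```html
<!-- Relies on colour alone: only a red border marks the problem -->
<input type="email" style="border: 2px solid red;">

<!-- Better: a text message, programmatically associated and focusable -->
<label for="email">Email address</label>
<input type="email" id="email" aria-invalid="true"
       aria-describedby="email-error">
<p id="email-error" tabindex="-1">
  Error: please enter a valid email address.
</p>
```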

Latest updates to NVDA 2017.4

13/12/2017


By Mike Jones, Screen Reader Analyst (DAC)


Anyone who uses NonVisual Desktop Access (otherwise known as the NVDA screen reader) for Windows will be aware of some recent problems when using the software. What follows is an overview of the latest improvements and fixes if you update to the current version, including some improvements to how NVDA works with Mozilla Firefox.


The elements list

The elements list (Insert+F7) now includes menus for form fields and buttons. This is in addition to the previous menus for links, headings and landmarks, and will make identifying edit fields and buttons much more efficient and less problematic. Fieldset and legend elements have also been given more support.


Previously, NVDA would not announce the fieldset even where the developer had provided this information. After significant investigation, I have found that the form field menu within the elements list still does not support this area. However, some support is given when the user navigates using the ‘f’ key (next form field), the ‘r’ key (next radio button), or the tab key (next element). When using these keys, the fieldset and legend are announced for the first radio button on the page, but this does not extend to other radio buttons within that group. I also found that once I had navigated past the radio buttons and used shift+f to move to the previous form field, the fieldset and legend were announced for the last radio button in the group. Where multiple questions appear on a page, the fieldset and legend are announced for the first option of each group, meaning it is now far easier to distinguish between questions when using one of these key strokes.
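For reference, the kind of markup being tested here groups each question in a fieldset with a legend; the question and option wording below are invented for the sketch:

```html
<fieldset>
  <!-- NVDA 2017.4 announces the legend with the first radio
       button when navigating with 'f', 'r' or tab -->
  <legend>How did you hear about us?</legend>
  <label><input type="radio" name="source" value="search"> Search engine</label>
  <label><input type="radio" name="source" value="social"> Social media</label>
  <label><input type="radio" name="source" value="other"> Other</label>
</fieldset>
```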


This affects us when testing with NVDA, as it now means that to some extent we can test for fieldset and legend when using radio buttons, and so can directly compare with JAWS. However, I would caution against relying on this completely, because although this area has improved significantly in the last update, the legend is still not read for all radio buttons within a group. As such, I would now recommend the following methods of testing (when not browsing using the cursor keys).


Testing with NVDA and looking for headings

To test for heading levels, use the ‘H’ key, as at present heading levels are still not supported in the elements list (Insert+F7).


Form element labels

The user can now test for form element labels by using the ‘f’ key for all form elements, the ‘e’ key for edit fields and the ‘b’ key for buttons, and can also locate these items using the elements list.


Radio buttons/fieldset and legend

The user should test using either the tab key, the ‘r’ key or the ‘f’ key, remembering that only the first option of a group will announce the legend when cycling forward, and only the last option when cycling backwards. The fieldset and legend are still not supported within the elements list, so this should be avoided for testing purposes.


A note on NVDA and Firefox

NVDA 2017.4 appears to have partially fixed the issue announced in November relating to Firefox, and NVDA now appears to work in some instances. We would advise keeping in touch with the latest developments on this, and other news relating to NVDA, by visiting NV Access’s blog: NVDA’s In Process Blog (external link).

The future of Artificial Intelligence: A future for all

07/08/2017


We now live in a world where artificial intelligence and assistive technology are more accessible than ever before. In my previous post, ‘The update round up’, I highlighted some of the new updates to Apple, Windows, Android and iOS, and how each offering will improve access to content on mobile and desktop devices for various user groups. What about the day-to-day usage of artificial intelligence though? It’s actually closer to hand than we think.


Artificial intelligence (AI) is fast becoming the norm in our daily lives. The first thing to identify is that it doesn’t just help people who have additional access requirements; all users, regardless of whether or not they use assistive hardware or software, benefit from AI. If you have ever asked a virtual assistant such as Siri, Google, Alexa or Cortana to do something, you’re using AI. The technology is also developing to learn what we use most, and to adapt to our digital habits. So if you frequently use Cortana to open apps or set reminders, it will become familiar with these tasks, and any others you use.


The use of AI can be incorporated into apps, something which is on the increase with updates to the various desktop and mobile operating systems. This means that any third-party app installed on a device will be able to take advantage of AI, as long as the developer has included this functionality when producing the app. One app for iOS aimed at supporting blind and low-vision users is Seeing AI. The app has various features including document scanning, a barcode reader, and the ability to share information via the iOS share sheet. This means the app can identify items from the camera roll, allowing users to add names for individuals in a picture, such as relatives. So the use of AI is increasing as the updates and overall development of technology continue.


Additional Resources

To learn more about AI, including the Seeing AI app, visit the following pages. *Note* The Seeing AI app is not available in the UK App Store at the time of writing; when it is, I will be giving it a good run through. The Seeing AI app for iOS (external link). The Cortana website (external link). All about Siri (external link). All about the Google Assistant (external link).

The icing on the cake: The difference between AA and AAA compliance

31/07/2017


Introduction

Achieving a level of compliance for your app or website means that, as far as the Web Content Accessibility Guidelines (WCAG) are concerned, your offering is accessible to as many user groups as possible, including those who require assistive technology to get online. The terms assistive technology, and even accessibility, can mean different things to different people, and here at the Digital Accessibility Centre (DAC) we offer level AA and AAA accreditation for our clients depending on their requirements.

What do the different levels mean?

  1. Single A is viewed as the minimum level of requirement which all websites, apps, and electronic content such as documents should adhere to.
  2. Double A is viewed as the acceptable level of accessibility for many online services, and should work with most assistive technology now widely available on desktop and mobile devices, or available as a third-party installation.
  3. Triple A compliance is viewed as the gold standard level of accessibility, which provides everything for a complete accessible offering, including all the bells and whistles which make the difference between a very good experience and an excellent one.

In his post Why do we need WCAG Level AAA? (external link), Luke McGrath points out that attempting to reach AAA can even cause failures against some AA criteria. Trying to meet AAA will mean that your website is the best it can be; however, the additional implementation may not be possible if budget is a concern, and working through a particular problem may push back a go-live date. A good example is found below, which highlights how the difference between AA and AAA affects end users.


One key difference between AA and AAA concerns screen reader users navigating the page. If a screen reader user is viewing a list of links and hears their software announce ‘click here’ or ‘read more’, this will pass at the lower level if the links are associated with surrounding text in a paragraph or list. This means the link would be surrounded by text like ‘to read the DAC blog click here’, with ‘click here’ being the link. While it is possible to read the information using another method of navigation, such as reading the entire paragraph rather than just a set of links, the link text would be ambiguous when moving through all the links to find the required content. So including the icing (clear link text in this instance) would make the link easier to understand no matter what method of navigation is being used.
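As a sketch of the two approaches (the URL is invented for the example):

```html
<!-- Passes at the lower level: the purpose is clear from the sentence -->
<p>To read the DAC blog <a href="/blog">click here</a>.</p>

<!-- The ‘icing’: the link text alone describes its purpose,
     so it still makes sense in a bare list of links -->
<p><a href="/blog">Read the DAC blog</a></p>
```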


As shown above, moving to AAA where at all possible will create the best experience for all users; however, AA is accepted as a very good commitment to accessibility. For more information feel free to get in touch, or check the following link: Web Content Accessibility Guidelines 2.0 (WCAG2, external link).

The update round up: what’s new in accessibility when the updates are released?

17/07/2017


Introduction

It’s that time of year again when we all look forward to the regular updates of iOS, Android and Windows, and wonder what changes are ahead when the new updates are introduced. What can we expect from the assistive technology though, and in particular, what improvements are the big players planning in relation to their built-in software?


The latest updates from Apple

iOS 11 comes with many exciting features; among the big accessibility improvements is the one-handed keyboard, adding another option to an already feature-rich OS. Other offerings include automatic image scanning, where VoiceOver (the built-in screen reader on iOS) will attempt to scan an image for text and read it to the user. This, combined with the same scanning for unlabelled buttons, makes for interesting developments. For low-vision users, a new invert colours option and additional integration with third-party apps mean better contrast across more applications.


macOS users who experience difficulty using a physical keyboard will benefit from an on-screen keyboard in the September update of macOS. The keyboard will allow users to customise it to their requirements, although as with other updates we will need to wait and see what the final result will be. Many of us talk to Siri, but have you ever just wanted to type a message to Siri instead? Now you can: Siri will still provide audio feedback, so just type what you want if you can’t speak to Siri. Improved PDF support relating to tables and forms with VoiceOver is another feature in the new macOS, one which I am sure will be most welcome by VoiceOver users attempting to quickly access PDF and other documentation. Similarly to iOS, VoiceOver on the Mac will describe an image using a simple keyboard command, making it possible to interpret your photos; time will tell. Better navigation of websites which use HTML5 is also included in the update, meaning that VoiceOver will support the new standard and provide better navigation when, for example, tables are used in messages.


Apple Watch is also benefiting from a software update, including the ability to change the click speed of the button on the side of the watch. This means that users who have difficulty double-clicking, for example, can customise the click speed when they need to use Apple Pay or other such services. Apple TV will now support the use of braille displays. A braille display is a device which translates the on-screen print material into braille via Bluetooth or USB, allowing users to navigate and read content such as programme guides.

Windows

Improvements to Windows Narrator, the built-in screen reader on Windows devices, include a device learning mode, which announces what command is performed when a key is pressed on a keyboard or other input device. Narrator users will also experience a clearer and more unified user interface (UI), as improvements across all apps and devices will make Narrator easier to learn and use. Scan mode, used to quickly navigate a screen or web page, will be set to on by default, and its setting will be remembered across multiple apps to further improve the user experience. Narrator will also include a service which attempts to recognise images lacking alt (alternative) text, by using Optical Character Recognition (OCR) to identify the image.


The Magnifier will follow Narrator’s focus, making it easier for users who use both Narrator and magnification simultaneously. The desktop magnifier will include smoother fonts and images, as well as additional settings and the ability to zoom in or out using the mouse. Also included for low-vision users are new colour filters, which make it easier for people who have colour blindness or light sensitivity to use a Windows device.

Android

A new accessibility shortcut will be available for users running Android O. The feature is set to toggle TalkBack on and off by default, although it can be configured after set-up to control another accessibility service, such as magnification or switch access. The shortcut is performed by pressing the up and down volume buttons together on any compatible device, meaning it will be easier than ever to reach your required access option on Android O. When using Android O with TalkBack, a separate TalkBack volume has been introduced, enabling users to change the speech output volume separately from the media volume; a new slider at the top of the screen performs the same action when media is playing, so it is now easy to hear what TalkBack is announcing while listening to any media. For devices running Android O with a fingerprint scanner, TalkBack users can make use of customisable gestures performed on the fingerprint scanner. Multi-language support is another feature being developed for Android O, with Google’s text-to-speech software detecting and speaking the language in focus.


When running an Android O compatible device with an accessibility service active, such as magnification, users can use the accessibility shortcut whenever the Accessibility button is available. This means that, using the example of magnification, a user can tap the Accessibility button and use a specific gesture to change the screen magnification. To return to the previous (or default) setting, all the user needs to do is press the Accessibility button again.


For low-vision users who may not require the full features of TalkBack, or for users who have dyslexia, Select to Speak will be a useful feature. Select to Speak is a service which announces a selection of elements or text, with options to read a whole page, control the reading speed, and move to the previous or next sentence. As mentioned earlier, we will need to wait until the final updates are released in a couple of months, but the future is very interesting for built-in assistive technology.


Resources

To learn more about the latest updates, go to: The latest accessibility updates in iOS 11 from AppleVis (external link). The Microsoft Accessibility Blog (external link). The latest accessibility news about Android O (opens external link which contains a YouTube video).