What accessibility means
In September 2020, digital accessibility was signed into Irish law, in line with the EU’s Web Accessibility Directive. But the importance of providing accessible online experiences was on our radar before then too. Accessible online experiences enable all users to access the information they seek, regardless of ability.
This means that anyone:
- using a screen reading device can hear the content they need and navigate to the next page
- relying on captions on videos can read this content
- can understand the information we provide, regardless of nationality
- can access our content on their desired device
Enhancing accessibility means we are enhancing our user experience for everyone.
To ensure that our online information is accessible to as many people as possible, we carry out regular reviews and audits of our webpages. These reviews help us find colour contrast issues and any issues with hyperlinks. They also help us identify anything that might be making our content hard to understand or operate. In this way, we strive to design inclusive experiences.
The focus on inclusive design makes working on the digital team very rewarding. I consider myself lucky to contribute to projects that focus on making health information as digestible as possible, excuse the pun... It’s also great to work on a team that is driven by the same goal.
How this is done
Carrying out an audit of the HSE website may seem like an impossible task because of the amount of web pages it has. It certainly would be if we had to audit every single page manually.
To catch as many issues as possible, we carry out a number of different types of audits, using a combination of automated and manual accessibility tools and processes. We use both because automated tests only catch between 30 and 40 percent of the errors on a webpage. Automated tests, however, are more efficient than manual testing because they can review far more pages.
Automated accessibility audit tools
There are lots of different types of automated accessibility tools. We use a tool that generates accessibility reports for us. The reports highlight the issues present on each page and suggest fixes for them. The most common issues we encounter involve images or hyperlinks.
We also sometimes test with other automated testing platforms. This ensures we do not miss any errors that our main tool may not have picked up on.
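As a rough illustration of the kind of check an automated audit tool runs, the sketch below uses Python's built-in HTML parser to flag images with missing or empty alt attributes. It is a simplified, hypothetical example with made-up file names, not the tool we actually use.

```python
from html.parser import HTMLParser

class ImageAltChecker(HTMLParser):
    """Flags <img> tags that have no alt attribute, or an empty one."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            src = attrs.get("src", "(no src)")
            if "alt" not in attrs:
                self.issues.append(f"missing alt attribute: {src}")
            elif not (attrs["alt"] or "").strip():
                # An empty alt is valid for purely decorative images,
                # so this is a prompt to review, not an automatic failure.
                self.issues.append(f"empty alt (decorative?): {src}")

# Hypothetical page fragment for illustration
page = """
<img src="acne.jpg" alt="Red acne spots on white skin">
<img src="banner.jpg">
<img src="logo.png" alt="">
"""

checker = ImageAltChecker()
checker.feed(page)
for issue in checker.issues:
    print(issue)
```

A real audit tool runs hundreds of checks like this across thousands of pages, which is why automated passes scale so much better than manual ones.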
Manual accessibility audits
Running manual tests can take a bit more time. This is because several different types of tests need to be run to pick up on different types of accessibility errors.
The types of manual tests we run are for identifying issues with:
- colour contrast
- keyboard accessibility
- content at different levels of zoom
Colour contrast issues
Colour contrast issues can make content difficult to read by:
- users who are colour blind or who may have low vision
- users who are accessing information in different environments, such as outdoors on a mobile on a sunny day, when glare makes the screen harder to read
For these reasons, even when the automated tests do not report colour contrast issues, we still check for them manually. We make use of a number of colour blind simulation web browser extensions. If we discover any issues, we use colour contrast checking tools to fix them.
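For readers curious how contrast checking tools work under the hood, the sketch below implements the WCAG 2.1 contrast ratio formula: each sRGB channel is linearised, the channels are combined into a relative luminance, and the two luminances are compared. The colour values are invented for illustration, not our brand colours.

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an sRGB colour (0-255 channels)."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of lighter to darker luminance, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# White text on a mid-green background (hypothetical example colours).
# WCAG AA requires at least 4.5:1 for normal-sized text.
ratio = contrast_ratio((255, 255, 255), (0, 128, 0))
print(f"{ratio:.2f}:1")
```

Dedicated contrast checkers apply exactly this calculation, then compare the result against the 4.5:1 (normal text) and 3:1 (large text) thresholds.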
This is an example of bad colour contrast usage on our website. The white text on the green background is difficult for some users to read, as is the white writing on the images.
Example of bad colour contrast on the website.
This is an example of good colour contrast usage. The white text on the coloured background is large and bold, with sufficient contrast against the background colour.
Example of good colour contrast usage on the website.
Hyperlink text issues
An example of a hyperlink text issue is using the same link text, such as ‘Click here’, to link to different pages. Links of this type can confuse users of screen readers, who will hear the same link text repeated with no idea of where each link will bring them. When we encounter these kinds of link issues, we try to use the linked page’s title as the hyperlink text, or to make the link text more descriptive.
This is an example of bad hyperlink text. The link text is not descriptive enough.
Example of non-descriptive link text on the website.
This is an example of good hyperlink text. The link text is descriptive.
Example of descriptive link text on the website.
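The ‘Click here’ problem lends itself to an automated check. The sketch below, a hypothetical example with made-up URLs rather than our actual audit tool, records each link's visible text and flags both generic phrases and identical text pointing at different pages.

```python
from collections import defaultdict
from html.parser import HTMLParser

GENERIC_TEXT = {"click here", "here", "read more", "more", "learn more"}

class LinkTextChecker(HTMLParser):
    """Collects each link's visible text so we can spot generic phrases
    and identical link text that points at different pages."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = ""
        self.text = ""
        self.links = defaultdict(set)  # link text -> hrefs using that text
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href", "")
            self.text = ""

    def handle_data(self, data):
        if self.in_link:
            self.text += data

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            text = " ".join(self.text.split()).lower()
            if text in GENERIC_TEXT:
                self.issues.append(f"generic link text {text!r} -> {self.href}")
            self.links[text].add(self.href)
            self.in_link = False

# Hypothetical page fragment for illustration
page = """
<a href="/mental-health">Click here</a>
<a href="/vaccines">Click here</a>
<a href="/covid-19">COVID-19 information</a>
"""

checker = LinkTextChecker()
checker.feed(page)
for text, hrefs in checker.links.items():
    if len(hrefs) > 1:
        checker.issues.append(f"{text!r} links to {len(hrefs)} different pages")
print(checker.issues)
```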
Alternative text issues
Alternative text is a written description of an image that is added to an image tag. It enables users of assistive technology to get the same understanding from an image as users who do not use assistive technology. It can also provide context to users in rural areas when an image cannot load due to lack of bandwidth, or because of a weak connection.
Images with missing or bad alternative text can cause confusion for some users, particularly those on assistive technology.
An example of good alt text for an image of acne would be: “red acne spots on white skin”.
Keyboard accessibility issues
Although the majority of visitors to our website are mobile users, we still need to account for those who visit on larger devices. For example, some users might have motor disabilities and need the keyboard to navigate the website. If interactive elements are not highlighted when a user tabs to them, then they are not keyboard accessible, and users relying on this type of navigation will not be able to use our website. If we come across these issues, we need to make sure that our focus states are activated. Our design and development team look after this.
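One heuristic a script can apply before the manual tab-through testing is to flag clickable elements that are not natively focusable and have no tabindex, since a keyboard user can never reach them. This is a simplified, hypothetical sketch, not our team's actual process.

```python
from html.parser import HTMLParser

# Elements the browser puts in the tab order by default
NATIVELY_FOCUSABLE = {"a", "button", "input", "select", "textarea"}

class KeyboardChecker(HTMLParser):
    """Flags elements with click handlers that a keyboard user cannot
    tab to: not natively focusable and no tabindex attribute."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if ("onclick" in attrs
                and tag not in NATIVELY_FOCUSABLE
                and "tabindex" not in attrs):
            self.issues.append(f"<{tag} onclick> is not keyboard focusable")

# Hypothetical fragment: the <div> is mouse-only, the <button> is fine
page = '<div onclick="openMenu()">Menu</div><button onclick="send()">Send</button>'

checker = KeyboardChecker()
checker.feed(page)
print(checker.issues)
```

A check like this only catches one class of problem; visible focus styling still has to be confirmed by actually tabbing through the page.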
Content at different zoom levels
Sometimes users with low vision need to zoom in on content to be able to view it. To ensure that there are no issues with accessing our content at different levels of zoom, we carry out 2 tests:
Firstly, we view our content:
- within different browsers
- at different screen zoom levels, from 100 percent (default zoom level) to 300 percent
Secondly, we set, within the browser settings, our font size to the largest setting.
Testing our content at different screen zooms and font size settings allows us to check:
- that no elements, such as buttons, overlap our content at various zoom levels
- for any issues with line heights or character spacings as a result of enabling larger font settings
This is an example of issues with line height when a user enables the largest font size setting.
Example of non-dynamic line height affecting readability.
This is an example of our COVID-19 webpage that does not have any issues when a user adjusts the zoom settings.
Example of dynamic line height with no readability issues.
It is important to note that when we are testing pages manually, we test within different browsers, such as Chrome, Safari or Firefox. This is because there are nuances between browsers that affect how users of assistive technologies access content.
Manual accessibility tests with assistive technologies
The main assistive technologies that we test with are screen readers. Screen readers read aloud the contents of a screen, webpage or document, primarily for blind or low-vision users.
The screen readers we use in our accessibility testing are:
- NVDA - a free screen reader for PCs
- VoiceOver - a native screen reader on Apple devices (this includes Macs, iPhones, iPads and Apple Watches)
- TalkBack - a native screen reader on Android devices
Unfortunately, like web browsers, assistive technologies differ in how they parse webpages. Testing with more than one tends to pick up more errors.
People on screen readers navigate websites in much the same way as people who do not use screen readers. Sometimes they scan content by moving through the different headings. This allows them to skip to the relevant section more efficiently, rather than hearing the entire page read aloud. Sometimes they navigate by jumping from link to link. This is why it is important to ensure that our webpages:
- are properly structured, with the relevant tags in the markup, and
- that our link text is as descriptive as possible - sometimes users only hear the link text, and have not read the passage surrounding it.
For these reasons when we are testing on screen readers, we parse pages in 3 main ways:
- navigating pages using links
- navigating by landmarks, such as headings and lists
- parsing the contents of the page from top to bottom
The most common issues from these tests relate to link and alternative text. Sometimes we also come across issues with interactive elements, for example buttons that are not screen reader accessible.
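The page-structure requirement can be spot-checked in code as well as by listening. This hypothetical sketch walks a page's heading tags and flags any skipped levels, which can disorient screen reader users who navigate by landmarks.

```python
from html.parser import HTMLParser

class HeadingOutlineChecker(HTMLParser):
    """Flags skipped heading levels (e.g. an h2 followed by an h4),
    which break the outline that screen reader users navigate by."""

    def __init__(self):
        super().__init__()
        self.last_level = 0
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if self.last_level and level > self.last_level + 1:
                self.issues.append(f"h{self.last_level} jumps to h{level}")
            self.last_level = level

# Hypothetical page outline: the h3 level is skipped
page = "<h1>Acne</h1><h2>Symptoms</h2><h4>When to see a GP</h4>"

checker = HeadingOutlineChecker()
checker.feed(page)
print(checker.issues)
```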
Some users navigate websites using speech, for example users with motor disabilities. Speech is also great for users who may have a temporary disability, such as a broken arm. We test our webpages with Speech Recognition, a native speech tool on PC that relies on voice to interact with pages. It enables us to ensure that our links and interactive elements, such as buttons and form fields, are accessible to everyone. Sometimes we find elements that are not accessible by voice commands, and these need to be made voice accessible.
Our commitment to accessibility
Ensuring our website is as accessible as possible involves a lot of work and a shared commitment across our teams. It is a multi-disciplinary process that involves continuous engagement with our various stakeholders. We are always interested in improving our online experiences, and we show this commitment by partnering with experts and expert organisations. We also attend workshops and webinars to upskill and educate ourselves in this area.
If you have any comments or suggestions on how we can improve, we would love to hear from you. Send us an email at firstname.lastname@example.org.