Reading level – accessibility for web writers

The accessibility guidelines require content to be readable at roughly a grade 9 level. Where content is more complex than that, we should provide supplementary content (visuals, for example) or an easy-to-read summary.

Benefits of more readable content

Unfortunately a lot of web content is unnecessarily dense and complex. While this guideline aims to help people with learning disabilities like dyslexia, writing in a more readable style benefits everyone.

The problem is, there’s no quick and easy way of measuring readability.

Readability testing tools

The Web Content Accessibility Guidelines suggest that readability can be measured using readability formulas. Two readability testing tools are built into Microsoft Word: Flesch Reading Ease and Flesch-Kincaid Grade Level. To use them, you’ll need to enable them and then run the spell-checker. When the spell-checker has finished, Word shows the readability statistics.

Example readability test

To meet the accessibility guidelines, content needs a Flesch Reading Ease score over 50, or a Flesch-Kincaid Grade Level below grade 10. Here’s an example, using content from the website of Centrelink, an Australian government agency.
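For context, both scores are driven by the same two ratios: average words per sentence and average syllables per word. The sketch below implements the published Flesch formulas in Python, using a naive vowel-group syllable counter. Real tools (including Word) use dictionaries and exception rules for syllables, so this sketch’s scores won’t exactly match theirs.

```python
import re

def count_syllables(word):
    # Naive heuristic: one syllable per run of consecutive vowels.
    # Real readability tools use dictionaries and exception rules,
    # so counts (and therefore scores) will differ from Word's.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text):
    """Return (Reading Ease, Grade Level) for a plain-text passage."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    # The published Flesch formulas: both depend only on these two ratios.
    ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return ease, grade
```

A short sentence of one-syllable words scores high on Reading Ease and low on Grade Level, while a long sentence full of polysyllabic words scores the other way, which is exactly the pattern the tests below demonstrate.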

Interpreter and translation services

To help customers understand Centrelink services, Centrelink provides interpreters at no cost to customers.

Where necessary to support a claim, Centrelink also provides a free translation service for customer documents.

Interpreters contracted by Centrelink are covered by confidentiality provisions and a Code of Ethics, which means customers can be reassured that any information learned through an interview conducted by an interpreter will remain confidential.

Bilingual staff may be available in some Centrelink Customer Service Centres to help with brief customer enquiries. If an interpreter is not immediately available, Centrelink staff may use a telephone interpreter service to assist customers.

I copied this text into Word and ran its readability tests. This showed a Flesch Reading Ease score of 16.6 (too low) and a Flesch-Kincaid Grade Level of 15.8 (too high).

Readability test results (screenshot from Microsoft Word)

Retest after removing proper nouns

The guidelines let you remove proper nouns from content before testing. So I retested the content, using ‘X’ to replace the proper nouns and titles (so I was still testing sentences of the same length). The results: a Flesch Reading Ease score of 40.6 and a Flesch-Kincaid Grade Level of 12.5. The reading level is still a problem.

Readability test results for retested content (screenshot from Microsoft Word)

Rewrite with shorter words and sentences

To lower the reading level, the content needs shorter words and sentences. The example below has been rewritten and now has a Flesch Reading Ease score of 58.8 and a Flesch-Kincaid Grade Level of 8.3.

Free interpreter and translation services

To help you understand our services, we provide free interpreters. We can also translate your documents if you need to include them with a claim.

If our interpreters are not available we may use telephone interpreters. Some of our staff speak other languages and may also be able to help with short enquiries.

Our interpreters follow a Code of Ethics and must keep your information private.

Readability test results for rewritten content (screenshot from Microsoft Word)

Numerous problems with readability tools

While readability formulas claim to measure readability, their use is controversial. Why? They measure only two or three aspects of content: word length (in syllables), sentence length and, in some formulas, a count of uncommon words. Readability depends on far more than these factors.

So although you may need to use readability testing tools during an accessibility audit, you should understand their limitations. Here’s a list of concerns.

  • Scores for the same text differ when using different readability tools.
  • All words of the same length are treated equally, yet ‘agree’ is probably easier for many readers than ‘concur’.
  • Shorter words are treated as easier words, but ‘abide’ is probably harder for many readers than the longer ‘tolerate’.
  • Shorter sentences are always considered easier to read. However, a sentence of 20 words is not necessarily easier to understand than one of 25 words. It depends on the sentence structure and style. These are not considered by the formulas.
  • Use of passive voice, double negatives, nominalisations, noun strings, idioms, jargon, unfamiliar abbreviations and other writing problems is not factored into the formulas.
  • Length, structure and layout of the content are ignored. Yet long, poorly organised content with few headings is likely to be less readable than well designed content.
  • The use of graphics to communicate or support text-based content cannot be measured by readability formulas.
  • The degree of difficulty of certain concepts or topics is not given any weight.
  • Readers’ interest and motivation are not considered, nor is their existing knowledge of the topic.
  • Formulas don’t seem to work properly when you use a lot of dot points or tables, which is often the case on the web.

Despite these problems, some argue that readability testing helps them get agreement that content needs to be rewritten. It’s likely that a high grade level means content won’t be easy to read. However, the reverse is not true: a low grade level does not guarantee that content is easy to read.

So if you use a readability testing tool, be aware of its weaknesses.

A better approach

A better way of making content easier to read is to use plain language. Plain language is more than choosing short, simple words. It involves thinking about information design, writing style and the needs and skills of your target audience.

The plain language guidelines linked below include a range of writing and design techniques. And they recommend testing readability with users, rather than testing and tweaking to get the desired results from an algorithm.

References
