Personalised technical accessibility training: a case study

At: CSUN 2020

By Allison Ravenhall

Digital Accessibility Sensei

Twitter: @RavenAlly

12 March 2020

A different take on a11y training

This talk covers six stages:

  1. Players
  2. Problems
  3. Proposal
  4. Plan
  5. Execution
  6. Outcomes

Players

Who was involved in the training program?

Coles

Coles is one of the “big two” grocery retail chains in Australia. It has over 100,000 employees, over 800 grocery stores and the largest online grocery presence in Australia.

2014: Gisele Mesnage sues Coles

In 2014, Gisele Mesnage sued Coles because she could not use their online shopping site with her screen reader. Most disability discrimination cases in Australia are resolved at the mediation stage through the Australian Human Rights Commission. In this case, mediation was not successful and the case went to the Federal Court. The case was settled out of court. Part of the settlement was an ongoing undertaking by Coles to improve the accessibility of their website. Source: bit.ly/abc-gisele-coles

2017: Coles launches Quiet Hour

I’m happy to report that Coles has completely turned around its attitude towards accessibility. They now have several full-time accessibility staff, they have a panel of external accessibility partners, and Coles Online won “Corporate website of the year” at the 2019 Australian Access Awards. In 2017 Coles launched an initiative called “Quiet Hour”, where all stores reduce lighting and background noise for at least one hour per week to make shopping less overwhelming for people with sensory sensitivities. Source: bit.ly/abc-coles-quiet-hour

The Coles Digital team

The Coles team has one full-time accessibility manager and two accessibility analysts. The Coles app team that received the training included one business analyst, two iOS developers, two Android developers, and two testers.

The Intopia team

I (Allison) was responsible for designing and delivering the training. My colleague Ricky had the role of content creator.

What are the problems with traditional full-day classroom training?

Participant feedback #1

“A whole day of training is too long”

Participant feedback #2

“Not enough time for activities”

Client feedback

“High quality content, low recall”

Trainer observation #1

Managing different personalities

Trainer observation #2

Wide spread of a11y knowledge

What was the proposal that Coles put to Intopia?

Train the Coles mobile app team

Deliver a different style of training.

  • Multiple, shorter sessions
  • Role- and technology-specific content
  • 1 on 1 or small group delivery
  • Create reference materials that would “live on” after the sessions were delivered

How did we plan to deliver the training?

Step 1. Create a modular syllabus

For each job role (business analyst, developer, tester), I drew up a very high-level, modular syllabus. Only 6 or 7 items per role, like “What is digital accessibility?” and “Component-level requirements”.

Step 2. Flesh out each syllabus item with topics

For each module, like “Component-level requirements”, I created a detailed list of topics that I wanted to cover. The screenshot on this slide contains around 30 of the topics discussed in the “Accessible development - All platforms” module.

Step 3. Create a project plan

Once the modules and topics were set, it was time to plan out the session delivery timeline. Most participants received two sessions per week for about a month. It took me around three months in mid-2019 to deliver the program from start to finish.

Step 4. Create lesson plans

To me, the lesson plans were the core of the entire program. Each lesson plan contains these sections:

  • Goals
  • Attendees
  • Prerequisites
  • Equipment
  • Concepts, needs, techniques
  • Notes
  • Run sheet

The Concepts, Needs, Techniques section was the major portion of this four-to-six-page document. For every module topic, I described the concept in plain language. For example: “Focus styling needs to be visible and defined for all enabled interactive elements.” I then explained the human impact of the concept. For example: “Sighted keyboard users need visible focus styles in order to understand where their cursor is located on the screen.” Finally, I explained the technique, specific to the job role, that the trainee should use to satisfy the concept.
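To make the concept, impact, technique pattern concrete, here is a minimal sketch of how an Android developer on the team might make keyboard focus visible on a control. It is my own illustration rather than an excerpt from the actual lesson plans, and the function name, control and colours are placeholders.

```kotlin
import android.graphics.Color
import android.view.View
import android.widget.Button

// Hypothetical example: make keyboard focus visible on a button so that
// sighted keyboard users can see where their cursor is on the screen.
fun showVisibleFocus(button: Button) {
    button.isFocusable = true
    button.isFocusableInTouchMode = false // only take focus during keyboard/d-pad navigation

    // Swap the background colour when the button gains or loses focus.
    // A production app would use a themed focus state rather than hard-coded colours.
    button.setOnFocusChangeListener { view: View, hasFocus: Boolean ->
        view.setBackgroundColor(if (hasFocus) Color.YELLOW else Color.TRANSPARENT)
    }
}
```

In the real sessions, the technique portion was tailored to each role and platform, so the iOS developers and testers received different guidance for the same concept.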

Step 5. Create reference materials

Ricky took the concepts, user impacts and techniques from my lesson plans, and added more context to create reference materials. He linked every concept back to its related WCAG 2.1 success criteria and the corresponding Understanding and Techniques documents. He then scoured the Internet and curated a small set of high-quality, recent blog posts, articles, platform references, videos and code samples for each concept, each with a short summary and a word count or duration. The materials provided a lot of background information and context. They basically made me redundant!

How did we execute the plan?

First up: A classroom session!

Despite our plan to move to small-scale sessions, the kick-off session was with the whole group. We spent two hours discussing introductory accessibility concepts, Australian laws and standards, and personas and characteristics.

Second session: VoiceOver demo and Q & A with Scott Nixon

The second session was another two hours with the whole group. I invited Scott Nixon to come to Coles and show them how he uses their app with VoiceOver on his iPhone. The photo of Scott on the slide is from our 2019 Australian accessibility conference, A11y Camp, where his talk was called “Digital Ableism: Real, uncontrolled and easy to fix”. During the session at Coles, Scott performed a few tasks, similar to a traditional usability testing session, the important difference being that the observers were in the room with him. I encouraged the trainees to ask him about the features he preferred and the techniques he used. He shared his experiences of assistive technology and digital content as a blind guy. After the program, the trainees unanimously said this session was their favourite. It enabled them to appreciate the difference that “doing things properly” could make.

Solo / 2-person sessions

After the two group sessions, it was time to switch to the 1 on 1 sessions. I trained the two testers together; they asked for it, and it provided an interesting counterpoint to the solo sessions. Most of the trainees went through the following sessions:

  • Assistive technologies
  • Site-level principles
  • Page-level principles
  • Component-level principles
  • Tools
  • Ask me anything, wrap up, retro

Statistics

  • The program took Ricky and me around three months to deliver
  • I created 18 unique lesson plans
  • I delivered 40 two-hour sessions
  • Ricky created 200+ pages of reference materials

Outcomes & reflections

Business analyst feedback

Two full days of training, but spaced out in a timeline that allows me to actually absorb it – that was great.

iOS developer feedback

I don’t fall asleep even though it’s an afternoon session so it must be engaging! I haven’t been bored, that’s where one-on-one helps.

Android developer feedback

Scott’s session was so helpful, was very interesting to see the impact when we don’t do our job well.

Tester feedback

Prefer training in pairs, hear different voices, sometimes I get a break, he can answer when I don’t know the answer. Not more than 4-5 people, would otherwise lose the opportunity to ask questions and get personal space.

Coles post-program review feedback

Participants have a passion for accessibility, which is an ongoing shift in attitude and behaviours. This is also changing their team culture.

No one used the reference materials

This was not surprising, but certainly the most disappointing outcome to me. When I asked the trainees at the end of the program where they would go to refresh their knowledge in future, they all said they’d Google the topics that we’d discussed. This is concerning because Google can often point to outdated or invalid advice.

Not enough activities

Since it was a pilot program, most of my time was dedicated to the lesson plans and session delivery. If I had my time over, I’d like to add more interactive activities to the 1 on 1 sessions. They were very chat-heavy and felt a little abstract at times. Including more hands-on activities would give trainees the chance to practise and understand concepts in a more realistic context.

My time and their money

My typical classroom training is billed as five days of work: three days of preparation and customisation, and two days of delivery. This mentoring style of training delivery is a lot more expensive, both in terms of time spent and cost to the client. Given it was a pilot, there was a lot more upfront cost that would not be repeated if we ran the training again.

What next?

On the whole, Coles was very satisfied with the training outcomes. Trainee attitudes were excellent and knowledge retention improved. They are interested in running the training again for other internal teams. Another thing to consider for this training model is scalability: can it scale? I think it’s limited. While I would be comfortable scaling to small groups, perhaps five trainees in a room, going much further means we’re back in the classroom scenario that we were trying to avoid. Alternatively, the trainer ends up running many duplicate sessions, which could lead to burnout.

Thanks for reading!

I am @RavenAlly on Twitter

Contact @Intopia on Twitter

Visit Intopia on LinkedIn

Our website is intopia.digital

Email Intopia at hello@intopia.digital