The process
No more spilt milk
An example project helping new baristas learn how to correctly clean the steam wand on an espresso machine.
I had a lot of fun creating this.
Try the experience for yourself or read on to explore the ADDIE process I used to create it.
Analysis
-
The owner of Karma Coffee has noticed an increase in negative online reviews over the past six months.
To understand this problem, I read the online reviews, conducted interviews with baristas and observed on-the-job behavior. Through this investigation, I noticed a common theme: uncleanliness, particularly in relation to the espresso machine.
When I suggested this to the owner, she acknowledged that some new baristas had been onboarded quickly and may not have had sufficient training.
-
I proposed creating a series of scenario-based eLearning modules so staff could learn when and how to clean the espresso machine. I noted the benefits of eLearning in this context: flexibility, asynchronous learning, cost savings, and a low-risk environment (mistakes made by users become learning opportunities rather than a poor reflection on the business).
However, eLearning is not an end in itself, and improvements must be seen in the real world.
As Karma Coffee already had business metrics in the form of online reviews, I proposed measuring the success (or otherwise) of the training by comparing pre-training reviews with reviews three months after the training’s roll-out. As a concrete goal, I suggested we aim to reduce the proportion of negative reviews from 25% (pre-training) to <5%. We defined ‘negative reviews’ as those of two stars or less.
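The success metric above can be sketched as a small calculation. This is a minimal illustration only; the star ratings below are hypothetical example data, not Karma Coffee’s actual reviews.

```python
def negative_review_rate(star_ratings, threshold=2):
    """Return the fraction of reviews at or below `threshold` stars."""
    if not star_ratings:
        return 0.0
    negative = sum(1 for stars in star_ratings if stars <= threshold)
    return negative / len(star_ratings)

# Pre-training: 1 negative review in 4 (25%) -- hypothetical data.
pre_training = [5, 4, 2, 5]
print(negative_review_rate(pre_training))  # 0.25

# Target: fewer than 5% negative three months after roll-out.
post_training = [5, 4, 5, 5, 3, 5, 4, 5, 5, 5,
                 4, 5, 5, 4, 5, 5, 5, 4, 5, 5, 1]
print(negative_review_rate(post_training) < 0.05)  # True (1/21 is about 4.8%)
```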
I also suggested including reviews in the eLearning as a feedback mechanism so users would be able to see the real-world impact of their work behavior.
The client accepted my proposal.
To understand the cleaning requirements, I contacted an SME at the espresso machine company.
During our conversations, I came to understand that cleaning tasks could be neatly separated into three categories: per use, per day and per week. We decided that the training could be delivered in three separate modules dealing with these three separate cleaning stages.
No more spilt milk is Module 1, dealing with per-use cleaning.
Design
-
I used Miro to create an action map, which allowed me to center real-world actions (in orange) that baristas would need to take to reach the overall goal (in yellow).
The elements of the action map are detailed below.
Goal (in yellow): reduce the number of negative reviews (two stars or less) from 25% to <5% three months after the training is rolled out.
Real-world actions (in orange): Clean when appropriate (after each use); clean as appropriate (wipe the steam wand and purge the remaining milk).
In-module activities (in green): Through the course of a first shift (consisting of four customers), the barista must decide (on three of four occasions) when to clean the steam wand and must clean the steam wand correctly (on three of four occasions).
Information (in teal): The user has an in-module mentor (Jess) who works at Karma Coffee. After the first customer, Jess shows the barista the equipment needed to clean, when to clean, and how to clean. This information is then available to the learner (on an opt-in basis) at relevant times throughout the rest of the module.
The action map was a useful way to visually communicate the design of Module 1 and was approved by the SME and client.
-
Once the action map was approved, I started work on a storyboard.
The scenario for this module is a new barista working their first shift at Karma Coffee. During the shift, the barista must serve customers and complete required cleaning tasks.
The user’s in-module mentor is Jess, who appears three times in the scenario: at the start (to welcome the new barista), after the first customer (to show the equipment and how and when to clean the steam wand), and at the end (to review the shift and farewell the barista). Jess also appears on occasion to provide feedback to the user.
The information that Jess provides is available on an opt-in basis throughout the module via a red icon.
To progress through the module, the user delivers the coffee (customers 1-4); decides whether cleaning is required (customers 2-4); cleans the steam wand (customers 1, 2, 4); and reads the customers’ reviews (customers 1-4).
I created a branching scenario after Customer 2 in which the user is able to make the wrong choice (to not clean the steam wand when it is required). The user later learns the consequence of this choice: a negative (two-star) review from this specific customer and a 25% negative review rate at the completion of the module.
I shared the storyboard with the client and SME, who approved it.
-
For visual design, I turned to an image repository and sourced vector images from an illustrator whose work suited the scenario.
This illustrator has a large body of work, which allowed me to source several different poses for the main character, a variety of characters to serve as customers, and imagery to visually expand the café space.
This illustrator had also formatted their illustrations in a way that made them easily editable. Using one illustrator for characters and background allowed me to maintain visual consistency throughout the scenario.
Of course, the images came with their own color scheme, which I changed to align with Karma Coffee branding: bright oranges and blues paired with brown and tan, which, together with the illustrations, gives the project a pop-art feel.
Similarly, I matched the display font in the module to the font used by Karma Coffee. For body text in the eLearning, I chose Arial, which is simple and readable for users.
Although I sourced most images from one illustrator, there were some I needed to find elsewhere; namely, the types of coffee, the damp cloth, and the steam effect. To align the style of these images with the main images, I needed to ‘flatten’ them and apply the scenario’s color scheme.
To edit the images, I used Adobe Illustrator, which is an easy and intuitive tool.
The visual elements of the module are a powerful communicative tool. As one example, the simple inclusion or exclusion of the barista’s apron functionally communicates his state of working and not working.
I also used Illustrator to edit the background scenes—including creating the illusion of a larger café at the start and end of the module (when Jess welcomes and farewells the barista). I also cropped characters’ images to use for reviews, the help icon, and the ‘thinking’ slides.
I also realized that I would need to significantly edit the espresso machine. Originally, there were three group heads, and the steam wand extended out from the machine. Unfortunately, this setup meant that there was very little contrast between the dirty (white) steam wand and the (white) wall background in the original image.
To ensure that the steam wand would contrast adequately against its background in both its clean and dirty states, I removed one group head entirely and moved the steam wand and steam control into its place. In the new position, the steam wand contrasts well against the back of the espresso machine in both its clean (black) and dirty (white) states.
The client approved the color theme, fonts, scenes and characters.
Development
-
I moved on to development of No more spilt milk using Articulate Storyline 360.
Storyline is a fun and easy tool to work with, and the issues I encountered were mostly minor and/or easy to fix with a bit of lateral thinking.
I learnt Storyline using online videos, prioritizing videos made by creators (rather than Articulate) as creators have greater freedom to be upfront about problems in the tool, while Articulate may be more inclined to obscure or downplay problems.
The first advice I was given was to avoid using Articulate’s built-in states—which proved to be invaluable advice later on when I was troubleshooting issues resulting from my (inadvertent) combination of custom and built-in states.
Following this advice, I custom-built throughout the module, the only exception being a built-in ‘pick one’ interaction in which the barista identifies types of coffee. However, this interaction severely malfunctioned during browser testing and had to be entirely replaced with a custom-built interaction.
Custom building is straightforward; the main things to keep in mind are design and consistency.
As I custom built interactions, I also had to custom build feedback for answers and completed actions. Feedback occurs in several ways throughout the module: via dialog, text, and also just through progression through the module.
However, during user testing I discovered that one user was not able to clearly identify when they had completed the act of giving coffee to a customer. From this, I realized that some feedback needed to be supplemented. To do this, I added a ‘ding’ sound to accompany the giving of the coffee and later added this sound in several other places in the module.
-
There were a few navigation issues to troubleshoot.
One involved the repetition of animations when users returned to a slide after accessing Jess’ help button. This can be jarring for users, can appear unprofessional, and wastes time.
To fix this, I added another layer to the same slide (which users were directed to after accessing the help button) that showed the same content but had no animation effects.
With the Go back button, my aim was for the module to be navigable from start to finish and back again (excluding the final slide).
The issue with this aim was that some slides require interaction prior to buttons appearing (e.g., ‘Move the coffee to the customer’), while others progress with triggers other than the Continue button (e.g., ‘Read the customer’s review’).
I didn’t want to change this, though, as the former is important for user interaction while the latter adds a nice element of variety. Instead, I decided that it was not too burdensome for users to (re)perform a simple action before being able to select Go back, and I chose to skip backwards past slides that did not feature a Continue or Go back button. If users want to return to a specific slide, they only have to wait one extra slide to do so.
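The backwards-skip rule can be sketched roughly as follows. Storyline itself configures this through triggers rather than code, and the slide names and data structure here are hypothetical, used only to illustrate the logic.

```python
def previous_stop(slides, current_index):
    """Walking backwards from `current_index`, return the index of the
    nearest earlier slide that offers a Continue or Go back button;
    slides without either button are skipped over."""
    for i in range(current_index - 1, -1, -1):
        if slides[i]["has_nav_buttons"]:
            return i
    return 0  # fall back to the first slide

# Hypothetical slide sequence: the review slide advances via its own
# trigger, so it has no Continue/Go back buttons and is skipped.
slides = [
    {"name": "Welcome",        "has_nav_buttons": True},
    {"name": "Read review",    "has_nav_buttons": False},
    {"name": "Clean the wand", "has_nav_buttons": True},
]
print(previous_stop(slides, 2))  # 0 -- skips past "Read review"
```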
The last issue was slide numbers. As I duplicated, deleted and moved slides during development, it became apparent that the functional order of slides did not always align with the slide numbering. For example, the Continue button on Slide 1.46 progresses users to Slide 1.33. This is correct in function, but looks incorrect in terms of slide numbering.
I realized that this would not be a problem but decided, nonetheless, to make the correct navigation explicit throughout the module. In practice, this meant replacing default ‘jump to next/previous slide’ triggers with, for example, ‘jump to [specific slide number]’, even in cases where ‘jump to next/previous slide’ was correct.
Implementation
-
I engaged people ranging in age from nine to 75 to do user acceptance testing for this module. I gained a great deal of valuable information from this and changed several aspects of the project as a result.
Adults (75+ years): I realized that I had unreasonably assumed knowledge of the equipment, specifically the identification and naming of the steam wand and steam control. From this insight, I added five slides to the project that showed, and then checked, the user’s knowledge of this equipment. I also added this, as a graphic, to the information that Jess provides to users.
I noticed some hesitation when following instructions, which I resolved by aligning instructions more to the device-based actions users needed to take. For example, I changed ‘Give the cappuccino to the customer’ to ‘Move the cappuccino to the customer’. (The instructions evolved to ‘Give the cappuccino to the customer’ as the user moved through the module and became more familiar with the tasks required).
I was told that the text on the menu board was too small, even when enlarged, so I doubled the font size.
Lastly, one user wasn’t able to easily find Jess’ help button. To make this clearer, I made the icon stay in position even after the user selected it. I also added more explicit (and visual) instructions to orient users when the icon is first introduced.
Adults (44 and 47 years old): I noticed that one user was unclear as to when an action (giving coffee) was complete. To show the completion of this action, I added a ‘ding’ sound to coincide with it. I later added this sound effect in other places where I thought extra feedback was necessary.
As previously mentioned, I noticed again some hesitation when following instructions, so made some instructions more aligned with the device-based actions the users had to perform. In this case, I changed ‘Look at the menu board’ to ‘Select the menu board’.
Lastly, one user told me it was not always clear what was (and was not) a button because a few different elements had the same colors: brown lettering and beige fill in a rounded rectangle shape. Although these colors were reversed on the hover state, to see this the user had to hover first.
To resolve this, I reversed the colors on buttons throughout the module so the ‘normal’ state for buttons was brown fill and beige lettering; while the ‘hover’ state was beige fill and brown lettering. This made it much easier for users to identify buttons.
Children (nine and 13 years old): As native technology users, both approached No more spilt milk with confidence and had little trouble understanding what to do, or figuring it out even when it wasn’t explicit.
They enjoyed the interactivity of cleaning the steam wand and were more enthusiastic than the adults to deliberately make the ‘wrong’ choices.
One user tried to select the (non-active) Change button in the user instructions on the first slide, so I reduced its size and put a black outline around it to make it clear that it was not a button.
In general: There is a wealth of information to be gained in user acceptance testing. It was extremely helpful and interesting to watch these users move through the module, and I will certainly take advantage of similar opportunities moving forward.
Of course, user acceptance testing is not always possible to do in person, so I’m excited to explore some of the more technical ways to observe user behavior as people work through eLearning modules.
-
I tested No more spilt milk on Chrome, Edge, Safari and Firefox web browsers.
The only issue that came up during testing was the severe malfunctioning of Articulate’s built-in question interaction (during which users identified different coffee types).
To ‘fix’ this, I removed the built-in interaction and custom built it again. After this replacement, the module worked as designed across all browsers.
Evaluation
-
Three months after the roll-out of the training (Modules 1, 2 and 3), the client and I revisited our overall goal of reducing negative online reviews from 25% to <5%.
Because we had defined this goal at the outset, evaluating whether it had been achieved (and, thus, whether the training had succeeded) was straightforward: we calculated the percentage of reviews written in the previous three months with two stars or less and found that it had dropped below 5%.
The client was satisfied with this outcome and we discussed working together on future training projects.
-
I learnt an incredible amount over the course of developing this module, both in terms of technical skills and the application of instructional design theory.
If I had had more time to work on this project, I would have loved to incorporate more branching scenarios.