Friday, November 22, 2024

What Makes A Successful Remote Usability Test?

By Chris Capuano, Softrams Lead HCD Researcher

The simplest way to know if a product or service is working is to put it in front of the people who use it the most. Sitting down with end-users to test products is vital for designing intuitive user experiences. We’ll walk through a quick example of how we designed and executed a usability study for the Centers for Medicare & Medicaid Services’ (CMS) Accountable Care Organization Management System (ACO-MS).

The Softrams team is a partner in building and maintaining CMS’s ACO-MS. The ACO program seeks to provide high-quality care to patients, reduce duplicative medical treatments, and pass the savings gained through efficiency on to its members. The Softrams team was asked to redesign the ACO-MS Knowledge Library and Help Desk pages. To understand the challenges with the current platform, we had several in-depth conversations with the ACO-MS sponsors at CMS, who know the ACO program inside and out.

This leads us to our first tip: Understand the difference between your end-users and your project sponsors. End-users are the people who use the product day in and day out, while sponsors are the supporting cast that makes a product or service come to fruition. Our CMS sponsors were interested in creating better Help Desk and Knowledge Library products for their ACO coordinators in order to reduce demand on CMS personnel. Our end-users were motivated by the desire to find resources faster and to reduce the number of times they needed to reach out to their coordinator.

Once we understood the motivations of our end-users and sponsors, the design team generated personas to capture the basic details of our end-users. Using personas, we can frame the conversation with the design, development, and project management teams around the user while also reducing internal biases. From there, the team began exploring a first round of prototypes for the Knowledge Library and Help Desk pages. We demoed the prototypes to our CMS sponsors, who encouraged the design team to begin testing with end-users. This brings us to our second tip: Test as early as possible in the design process and work with your sponsors to identify test participants who represent your end-users.

Once you have a prototype ready to test, it’s time to plan how you want to test it with your end-users. I am a fan of balancing a test script with open conversation during a usability test. For example, plan to test the navigation area, but don’t completely prescribe the questions you will ask. Instead, bring the user’s attention to it and ask an open-ended question about how they would complete an action. Continue to probe as necessary to uncover a design’s strengths and flaws.
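To make that balance concrete, here is a minimal sketch of how a semi-structured discussion guide could be captured, written in Python purely for illustration. The areas, prompts, and probes below are hypothetical placeholders, not the actual ACO-MS study materials.

    # Illustrative sketch of a semi-structured discussion guide.
    # Area names, prompts, and probes are hypothetical examples,
    # not the actual ACO-MS study materials.
    DISCUSSION_GUIDE = [
        {
            "area": "Navigation",
            "prompt": "How would you go about finding guidance on quality reporting?",
            "probes": [
                "What did you expect to happen when you clicked that?",
                "Is there anything on this page you would change?",
            ],
        },
        {
            "area": "Help Desk",
            "prompt": "Show me how you would ask for help with a login issue.",
            "probes": [
                "How does this compare to how you get help today?",
            ],
        },
    ]

    def print_facilitator_checklist(guide):
        """Print the guide as a quick checklist for the session lead."""
        for item in guide:
            print(f"[{item['area']}] {item['prompt']}")
            for probe in item["probes"]:
                print(f"  - probe: {probe}")

    if __name__ == "__main__":
        print_facilitator_checklist(DISCUSSION_GUIDE)

The point of the structure is that each area has one open-ended prompt and a short list of optional probes, which leaves room for conversation rather than locking the facilitator into a rigid script.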

Once you have a general plan for what you want to test and how you plan to flow through it, it’s time to reach out to your users. For this redesign, our CMS sponsors provided us with the names and contact information of end-users. It is best practice, especially on government projects, to work with your sponsors to identify end-users to test with. Once we received the names of test participants, we drafted a message that introduced ourselves, provided some context for the project, laid out expectations for testing, and offered a call to action to sign up for a testing timeslot. With a plan in place and test participants identified, your team should meet for an internal dry run of the usability test to work out the kinks and ensure that you are testing everything you need to before test day.

When test day arrives, talk with your team beforehand and identify who will lead the test and who will take notes. Make sure that you join the meeting several minutes before the scheduled start time. Once everyone has joined the call, go around the room and have everyone introduce themselves. Reiterate the context of the testing and what it is that you will be testing. Mention that there are no wrong answers or bad questions, and encourage participants to think out loud as they use the product. It is also incredibly valuable to record these sessions, so make sure that you ask your participants whether they are okay with this.

During each of our testing sessions, we started off with a quick set of demographic questions. These included the user’s name, company/organization, role, location, and years of experience in their role. We also asked them questions about the current system – what works well and where there are opportunities for improvement.

From there, we began testing our new design. We sent participants a link to a clickable Figma prototype and had them open it and share their screen. Following our discussion guide, we started by drawing their attention to certain elements that we wanted to test. We asked them to complete several tasks, including finding resources in the Knowledge Library and submitting a help desk ticket. As we observed them completing the tasks, we probed when things seemed difficult, asking questions such as “Was that how you expected it to work?” and “How can we improve this particular experience?” We always do our best to avoid leading questions and instead ask questions like, “How would you expect filtering to work on this feature?” Don’t lead or bias participants toward an answer. Leave it as open as possible for them to provide their unbiased input.
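If it helps, here is one lightweight way to keep structured notes against each task during a session. This is a hypothetical sketch; the field names and example values are assumptions for illustration, not our actual note template.

    # Hypothetical sketch of per-task observation logging during a session.
    # Field names and example values are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Observation:
        task: str        # e.g., "Find a resource in the Knowledge Library"
        outcome: str     # "completed", "struggled", or "abandoned"
        quote: str = ""  # verbatim participant quote, if any
        notes: str = ""  # observer's description of what happened

    @dataclass
    class SessionLog:
        participant_id: str
        observations: list = field(default_factory=list)

        def record(self, obs: Observation) -> None:
            self.observations.append(obs)

    # Example usage during a session:
    log = SessionLog(participant_id="P01")
    log.record(Observation(
        task="Submit a help desk ticket",
        outcome="struggled",
        quote="I expected the submit button to be at the top.",
        notes="Scrolled past the form twice before finding the button.",
    ))

Tying each observation back to the task it came from makes it easier to see later which tasks caused the most friction.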

The final tip I would like to share is to debrief with your team immediately after each usability call. Recap the biggest takeaways and findings, and organize your notes and key information to make the synthesis process easier once user testing is complete. I have found that these quick touchpoints allow teammates to share information while it is still fresh.
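Continuing the illustrative note structure above, a debrief roll-up can be as simple as counting, per task, how many participants ran into trouble. The task names and outcomes here are again hypothetical examples, not study data.

    # Illustrative sketch: roll up debrief notes by task so recurring
    # issues are easy to spot during synthesis. Task names and outcomes
    # are hypothetical examples.
    from collections import defaultdict

    def summarize(debrief_notes):
        """Count how many participants hit an issue on each task."""
        issues = defaultdict(int)
        for note in debrief_notes:
            if note["outcome"] != "completed":
                issues[note["task"]] += 1
        return dict(issues)

    notes = [
        {"task": "Find a resource in the Knowledge Library", "outcome": "completed"},
        {"task": "Submit a help desk ticket", "outcome": "struggled"},
        {"task": "Submit a help desk ticket", "outcome": "abandoned"},
    ]
    print(summarize(notes))  # {'Submit a help desk ticket': 2}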

Ultimately, usability testing is a conversation with your end-user. During a session, stay relaxed and engaged; the goal is to improve the product and the experience for the person you are testing with. Know the motivations of your key stakeholders, determine the product or feature you want to test, identify end-users to test with, and design a discussion guide that helps you uncover the strengths and flaws of your product or service.
