
Usability in Complexity

Winter 2021
University of Washington HCDE Certificate Program
Seattle, WA

TraxSolutions: A participant tracking system that helps case workers in health & human services maintain their client records. TraxSolutions has several functions; our project focused on the Case Management feature.

design statement

How do TraxSolutions' terminology and workflows impact new users' ability to find and modify Cases?

jobs to be done

Case Managers & Social Workers expect to be able to use TraxSolutions -- and, specifically, the Case Management feature -- to easily:

  • Find existing case records for their clients

  • Edit a client's case record(s) to track their progress

  • Close a case record when it is no longer needed

Our study was designed to identify the most severe usability issues hindering successful completion of these key Case Management tasks.

stakeholders

TraxSolutions is used by case managers and social workers in health & human services. They use it to keep records and track goals/progress for the people they serve. 


This study was conducted by me and two classmates as part of our Usability Studies course at the University of Washington.


My team was connected with the creators of TraxSolutions through our course, and we spent the term working with them to plan & conduct an in-depth study of their system. We presented the results of the study to the creators of TraxSolutions and to our professor and peers.

[Before/after animations: Jump_Before_FINAL.gif, Jump_After_FINAL.gif]

process

  1. Discover: We began by meeting the creators of TraxSolutions. They gave us an overview of their system, including the major questions and problem areas they wanted our study to address.

  2. Explore: Next we spent one week planning & conducting a Heuristic Evaluation of the Case Management feature: our team of three tried as many tasks as possible using the feature and recorded any potential usability issues we found in the process.

  3. Scope: Next, we used the list of usability issues from our Heuristic Evaluation to identify the workflows we anticipated would be most problematic for TraxSolutions users. We worked with the TraxSolutions team to scope these down to six workflows that we would focus on for the remainder of the study.

  4. Planning: We planned a study using a blend of quantitative and qualitative methods. Our primary research method was moderated, remote usability testing. Planning required our team to create an interview script to run test sessions uniformly, a corresponding data log to record test results and analyze them across sessions, an issue severity scale to rate issues consistently during & after sessions, and a set of criteria for recruiting interview participants.

  5. Recruitment: Our chosen platform for this study was UserTesting.com. We used its recruitment tools, along with our participant criteria, to select eight study participants.

  6. Testing: At last we ran our test sessions (again, using UserTesting.com). Here we met a few of the messy realities of running a study with real users: two of our recruits were no-shows, and two more had passed our recruitment criteria but, once we spoke with them, turned out to be outliers to our target group -- a pharmacy student and a life coach, whose needs differ somewhat from those of social workers and case workers. Still, we were able to run successful sessions that produced a rich dataset of quantitative and qualitative results.

  7. Analysis: After running the test sessions, we spent a week analyzing our results. Working through our data log revealed a set of findings, which we extracted, prioritized, and prepared for presentation.

  8. Presenting our Recommendations: Finally, we presented our results, outlining key data and findings from the sessions we conducted. We also framed these results as recommended design changes -- changes which, based on our study, would improve the overall usability of TraxSolutions and the satisfaction of its users.

The TraxSolutions team was grateful for the depth of our study and found our insights enlightening. In particular, they appreciated that we identified high-impact opportunities for improvement that were specific and easily achievable for their design team. For example, instead of suggesting broad changes to the information architecture of their system, we suggested ways of adjusting wording or adding visual emphasis to help users discover difficult-to-find items.

key learnings

What went well?

  • Defining a clear scope for our research, including specific tasks & workflows within the Case Management feature (TraxSolutions is a complex system, so scoping was challenging but important!)

  • Creating a detailed, uniform data template for recording session results -- this made it easy to record findings during sessions and to analyze them afterward.

What would our team have changed if we were to run this study again?

  • We would define more specific recruitment criteria (“counselor” was more subjective than expected!) and/or add a step to our recruitment process to confirm participant fit. The two outlier participants provided useful insights on basic usability tasks (e.g. finding a button on-screen) but less reliable impressions of complex tasks (e.g. feedback on domain-specific language that would make sense in their work).

