Leadership, Crisis Management, Design Sprint, Moderated Research, Client Management, High-Fidelity Mockups, User Testing
Web, Tablet
Testing Script, Quantitative + Qualitative Test Results, High-Fidelity Mockups, Style Guides, Pitch Deck, Rolling Implementation
View the design process and deliverables below.
For privacy, we will call the client FoodCorp. FoodCorp was a large client during my time at TalentGuard, a Human Resources SaaS platform. FoodCorp had a large, distributed user base of part-time and seasonal workers who staffed restaurants and service-industry jobs. Additionally, a small percentage of FoodCorp's workers were centralized in their Buffalo, NY, headquarters.
FoodCorp's headquarters was staffed with administrative and central operations management. These were the first-round implementation users; the much larger population of distributed service workers would come in later waves. For our design sprint and subsequent moderated user testing, our focus was on that larger array of service workers.
Our small Product Team at TalentGuard had started our last major Sprint Epic of 2019. After our initial problem-scoping and research-generated content, we received word directly from organization leadership that one of our top clients was unhappy with how the product addressed usability and user-orientation. This threatened the current state of our relationship with FoodCorp.
Leadership's resulting directive was to implement a walkthrough experience for novice first-time users, one that would be mobile-friendly and scale across our service.
There was no precedent for Walkthroughs in our product, nor would a 3rd-party widget suffice given our tech stack requirements.
Our team was given a two-week deadline.
One week to pivot from our major epic and address FoodCorp's user needs in a Hi-Fi, test-ready design. One week to develop metrics to measure task-completion, informed navigation, and ease-of-use.
It was an incredibly exciting time, and I believe this two-week challenge is what solidified the teamwork among the Product Team, allowing me to demonstrate my leadership in problem-solving.
Once the challenge was handed down from leadership, I brought together my Product Manager, my Design Manager, and our Technical Writer. We blocked off the schedule for our War Room starting Monday, with whiteboard paint on both sides of the walls, and established the following:
1) A decision-making protocol for unanimous, actionable decisions.
2) Working backwards from our Friday deadline for a testable prototype to determine our individual commitments.
3) An understanding of the principles behind what a well-designed walkthrough provides a user: how the different functions in our product bring value, why users are in a specific location in our software, and how the experiences they have in one location relate to the experiences they have elsewhere in our software.
A messy whiteboard brought about a Research Plan. Our Client & CTO had suggested we solve this first-time user problem with a Walkthrough.
But what is a Walkthrough?
What makes a good Walkthrough?
What makes a Walkthrough scalable?
We needed to deliver a walkthrough experience to orient novice users to the most common areas of the product this client used.
We agreed on the Success Criteria for the user walkthroughs: they must cover 10 areas of the UI, Generalize to all current Clients, and have the functionality to scale to other product areas for maintenance and growth of the UI.
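To make the scalability criterion concrete, the sketch below shows one way a walkthrough can be modeled as data rather than hard-coded per page, so covering an eleventh UI area means adding an entry instead of writing new logic. This is a hypothetical illustration in TypeScript; every name in it is an assumption, not the actual TalentGuard implementation.

```typescript
// Hypothetical model (not the actual TalentGuard code): each
// walkthrough is data keyed by UI area, so scaling to a new area
// means adding an entry to the registry, not writing new logic.
interface WalkthroughStep {
  selector: string; // CSS selector of the UI element to highlight
  title: string;    // orients the user: "Where am I?"
  body: string;     // explains "Why am I here?" and how it relates
}

interface Walkthrough {
  areaId: string;
  steps: WalkthroughStep[];
}

// Registry covering the product's UI areas (illustrative content).
const walkthroughs: Record<string, Walkthrough> = {
  scheduling: {
    areaId: "scheduling",
    steps: [
      {
        selector: "#shift-calendar",
        title: "Your schedule",
        body: "This calendar lists the shifts assigned to you this week.",
      },
    ],
  },
  // ...entries for the remaining nine areas would follow
};

function getWalkthrough(areaId: string): Walkthrough | undefined {
  return walkthroughs[areaId];
}
```

Keeping each step as plain data also makes the content easy to hand off, for instance to a Technical Writer, without touching the walkthrough logic itself.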
To align my team, we proposed 3 primary values our Walkthrough must answer, defining what makes a "good" walkthrough:
Why am I here?
Where am I?
How does this experience relate to others I might have (in this software)?
Then, we ideated on which UI elements and content might satisfy each of the 3 Walkthrough Primary Values above, for each of the 10 areas of the UI we needed to walk our users through.
We Affinity Diagrammed our individual results to define common threads of thought, then identified the summative Category of each thread.
Finally, we Voted for Main Areas of Focus based upon those Categories.
Now we were ready to concept the Form of the Walkthrough.
From here, we knew why we were doing what we were doing, and where.
The experiences we hypothesized users would have were laid out.
All that was left was to map the information architecture of the content and the form of the experience.
How do we think of Hierarchy?
When a Tour Guide walks you through an experience, an exhibit, an individual painting, a city block: what does it feel like to be situated in your context?
When it is done well, don't the history, the references, and ultimately the meaning feel clear, though they were alien just moments ago? This was the analogy for our North Star regarding our Content Hierarchy structure.
Applying the hierarchy described in the image above resulted in whiteboard markers over every wall of the design War Room. This material became the skeleton that would support the visual design.
The form of the design didn’t come after our information architecture ideation sessions, but alongside them.
What walkthroughs do we like?
What exemplars are out there?
How might this user access this thing?
What would that feel like?
How does it feel to be greeted here?
After we documented our Content Architecture, we had enough information to create a testing script based on the primary goals we knew users would visit each page of the UI to accomplish.
Each of these goals included task-completion information, error tracking, and attitudinal information from 7-point Likert-scale reporting by the user. I produced these documents and moderated the sessions myself; errors and missed clicks were observed and recorded by my colleagues.
We wrote Testing Scripts that explored a variety of variables for evaluation.
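As a rough illustration of the kind of per-task variables those scripts tracked (a hypothetical sketch, not our actual analysis tooling):

```typescript
// Hypothetical sketch of per-task aggregation: completion rate,
// missed-click (error) counts, and mean 7-point Likert rating.
interface TaskResult {
  taskId: string;
  completed: boolean;
  missedClicks: number;
  easeOfUse: number; // 7-point Likert: 1 (hard) to 7 (easy)
}

function summarize(results: TaskResult[]) {
  if (results.length === 0) {
    throw new Error("no results to summarize");
  }
  const n = results.length;
  return {
    completionRate: results.filter((r) => r.completed).length / n,
    meanMissedClicks:
      results.reduce((sum, r) => sum + r.missedClicks, 0) / n,
    meanEaseOfUse:
      results.reduce((sum, r) => sum + r.easeOfUse, 0) / n,
  };
}
```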
After 4 Days of preparation, design work, research scripting, and communications with our client, we married our lo-fi sketches and Content Architecture into High Fidelity interactive prototypes. This was weekend work in anticipation of the flight out to our client on Monday.
All in all, we were prepared for testing.
Demonstrating that a design is usable and persuading stakeholders that it is are two different skills. There is theory (usability) and there is business (persuasion).
During our 4-day testing window, one challenge we encountered was that 13 of the vetted users were client stakeholders in the corporate structure, several of them leadership. These were not the users of the system or designs we were testing.
What became clear by Day 3 of testing was that our reputation as a client partner was in focus. After all, this product was what our client's users leveraged to help map out their careers as hourly-wage workers for a corporate staffing company.
The light at the end of the tunnel was the hope and reaction gained from the 3 "real" users these walkthroughs were meant to empower. Their feedback not only validated the design direction from a quantitative standpoint, but also gave shape to the need for an easy-to-understand tool to help them manage career growth in their hourly-based industry, one conventionally taken for granted as high-turnover with little guidance up the ranks.
By Day 3, our team was delivering strong daily reports to the client, laying out positive trends in our Usability Evaluation with transparent reporting and raw data at the end of each daily session.
However, there was a strong opposing voice from one of the client leadership stakeholders. There was a murmur of skepticism, something like:
"How do we know this is intuitive based on your measurements?"
As UX professionals, we know our work is judged against an outsider's understanding of "intuitive." We also know that "intuitive" is loosely defined and can mean many things.
So what do we do?
We code for intuitiveness through an evaluative method called Progressive Task Completion.
Progressive task completion looks something like the following.
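Here is a minimal sketch of one way to code it, assuming each task decomposes into ordered sub-steps and "intuitive" is operationalized as how far a user progresses before needing moderator help; the names and scoring below are illustrative assumptions, not our exact protocol:

```typescript
// Hypothetical coding of "intuitiveness" as progressive task
// completion: each task is an ordered list of sub-steps, and a
// user's score is how far they progress before the moderator
// must intervene. Scoring here is an illustrative assumption.
interface TaskAttempt {
  taskId: string;
  totalSteps: number;   // sub-steps the task decomposes into
  unaidedSteps: number; // completed before any moderator nudge
}

// 1.0 means every sub-step was completed without help; lower
// values show exactly where the walkthrough's guidance broke down.
function progressScore(attempt: TaskAttempt): number {
  return attempt.unaidedSteps / attempt.totalSteps;
}

function meanProgress(attempts: TaskAttempt[]): number {
  if (attempts.length === 0) return 0;
  const total = attempts.reduce((sum, a) => sum + progressScore(a), 0);
  return total / attempts.length;
}
```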
We recoded the Testing Script for one final day of on-site testing to measure intuitiveness through this method, and gained sign-off on the approach from the skeptical client stakeholder. Finally, we were able to analyze our results and implement accordingly.
The results covered user-reported ratings for each of the three walkthrough values (Why am I here? Where am I? How does this experience relate to others I might have in this software?), along with general Ease-of-Use, Task-Completion, and other notable averages. All user-reported scales were 7-point Likert.
Below is a looping video representation of a small portion of the final design. For proprietary reasons, the actual design is not presented.
The 10 separate walkthroughs were implemented and passed Quality Assurance 4 weeks after the Usability Evaluation concluded.
Notable core features include:
Understanding why a set of users may feel pain, a lack of guidance, or an inability to understand the way an interface mediates their actions towards their goals is one form of practice for UX professionals. Understanding why a client's corporate strategy is willing to invest technology into their own users' success is another. Working through this challenge was one of the most difficult of my career, not for its scope (it is contained to the well-worn concept of "walkthroughs"), but for the intra-corporate political strategy that brushes up against what is good for users, what is good for a company, and the ways companies wish to relate to one another.
The designs worked! The numbers showed overall success in guiding users toward the primary tasks in the interface, with almost no nudging from the moderator (myself in this instance). That is reason to be excited. After all, producing team alignment, a rigorous design process, and a thoughtful research plan in 4 days is never easy. Then, hopping on a plane and spending a week somewhere to represent the interests of users, while acting as a pseudo-ambassador for your company, presents its own set of challenges. I am very proud of our work, and I am glad it was ultimately implemented throughout the product suite.
With the onset of the COVID-19 pandemic, the large client we were working to persuade ultimately had to churn. Even worse, many of the employees they staffed (whom our designs were meant to serve) were subsequently cut. It is hard to design for the good of helping others when caught up in the entanglements of business dealings. But that just means a different frame of mind is called into the picture. I've learned about the differences between design and research practice, operations, and strategy. Where the money lies is typically where we must devote energy towards persuasion, if not ultimately for the good of the user.
I want to thank my teammates.
Sonia
Sherwin