
Research-wise, I played a leading role in designing the qualitative part of the study, including the semi-structured interview guide and the usability study scripts. Team-wise, I played the role of the optimist, cheerleader, and relationship manager, making sure the team dynamic was conducive to both the project and our friendships. [prototype link on Figma]
Check out our final product video first!

In 2020, the Institute for Health Metrics and Evaluation (IHME) played a leading role in advising policymakers throughout the pandemic, but its potential spans far beyond COVID projections. IHME now wants to expand its user base and reach more "educated yet uninvolved" users to consolidate its post-COVID presence.
An immediate task after being told the target audience was "educated yet uninvolved" was narrowing down the user group. Since public health is the topic of interest, we hypothesized three potential user groups: health researchers in academia, medical workers, and members of the general educated public who are not associated with the health field (like our team members).

Therefore, before we started the formal research process, I proposed that we have conversations with people who fit these groups to get a rough sense of who they are and what they (potentially) need public health data for. We constructed a demand chart based on the insights gathered from these initial conversations:


Now that we had narrowed the target user down to a specific group, I revisited and revised the research questions accordingly:

Because this study was truly generative in nature, with no specific design or product to improve upon, the goal of the research was to describe and uncover unmet needs that could lead to design opportunities.
I designed a 3-step, mixed-method research plan to describe medical professionals' current behaviors in searching for health data; to understand the intentions and the "whys" behind those behaviors; and to generate design requirements and principles for the design phase.

*Due to an NDA with the sponsor, I am only able to share high-level findings from user research, not specific quantitative data from the survey.


From what we learned about the target users, we decided that whatever design solution we came up with would need to follow the guidelines below, derived directly from research:


From these design principles, our team brainstormed a plethora of ideas, ranging from grounded and practical to innovative and futuristic.

After we mapped out the overall concept, we constructed a feature list from the user stories gathered during research and prioritized it accordingly. I led two rounds of usability testing, during the mid-fidelity and high-fidelity stages of the design, helping the team decide which features to include and how each feature should be presented.
[see testing protocol here]
Testing Goal:
The mid-fi usability test addressed the "what" question: how effective are the current feature offerings? The test therefore revolved around a hypothetical search scenario and the paths the user took. Probing questions included:
Procedure:

[see testing protocol here]
Testing Goal:
The goal of this round of testing was to see whether the design made sense to users in terms of positions, icons, word choice, and interactions. Instead of observing what actions users took under a general task, we gave users specific instructions and focused on whether they could perform the intended actions based on the design. A specific task was given for each corresponding design component. For example:
After each testing session, our team met and discussed key findings. If the results made it obvious that there were changes worth making before the next participant, we would make those changes right then and there. These changes were small in scale, such as the position or design of buttons.
"As a researcher who looks for articles online a lot, I can see how this product can really help with the process!" - product showcase attendee

Usability testing might seem to some like the "easiest" method in UX research, but this project taught me how intricate and difficult it actually is. When part of the research question is to evaluate the potential value of certain features, probing questions need to be constructed extremely artfully. For example, "do you like / would you use this feature?" will likely lead to surface-level answers such as "sure" or "it's cool." On the other hand, more pointed questions such as "does this feature provide any value to you?" and "what could be a downside of including this feature?" may yield much better results.
Our team worked closely with our project sponsor, the manager for digital experience at IHME, throughout the process. I learned that the art of effective communication rests on a foundation of understanding each other's costs and benefits. For example, our team always approached difficult conversations with our sponsor by first understanding what was important to him, and then designing our conversations from his point of view. This way, despite conflicts of interest at certain points, we were able to maintain a constructive and positive relationship throughout the project.