
Jobs To Be Done Framework

AI-Predictive Analytics Platform UX Research Case Study

#B2B product #healthcare #JTBD framework #user journey #end-to-end research | 2011

Reflection: This project taught us that the best AI isn’t the smartest—it’s the one that helps experts do their most meaningful work.

Background

The client is a healthcare-focused artificial intelligence (AI) platform company that specializes in optimizing healthcare delivery through predictive analytics. Its platform helps hospital networks and insurance carriers enhance individual patient care by delivering real-time predictions.

Healthcare AI Platform Interface

Challenges

The current system is in its early stages and requires significant hands-on support, making every engagement resource-intensive. This research aims to inform the next stage of product development, reducing consulting costs while improving system usability and the overall user experience.

Stakeholder Alignment

I met with the product manager, product designer, front-end developer, data scientists, and data engineers to understand their goals and concerns and to align on the research objectives.

Research Objectives

  1. Uncover key user jobs and motivations in their current workflow of making predictions.
  2. Explore how key users work with premium predictions and what improvements would enhance their confidence and efficiency.
  3. Evaluate design and functionality improvements to ensure they effectively enhance usability and reduce friction.

Research Methods

  • Semi-structured interview
  • Observational studies
  • Usability testing (post design change)

Secondary Research and Existing Data

I conducted secondary research on AI and predictive analytics to gain a deeper understanding of the space. I also analyzed the current version of the product to identify resource-intensive steps and areas for optimization.

Research Framework

This research leveraged the Jobs To Be Done (JTBD) framework to uncover user behaviors and motivations, ensuring the analytics platform aligns with real user needs and expectations. It also considered users’ emotional needs, providing deeper insights to inform more user-centric design decisions.


Recruiting Criteria

Actuaries and data scientists were recruited as the two participant groups. Both groups took part in all three research activities: semi-structured interviews, observational studies, and usability testing (post design change).

Research Design (Outline)

  1. [Intro] Set expectations for the session.
  2. [Verbal] Invite participants to talk about the overall process of making predictions and the challenges along the way.
  3. [Observe] Ask participants to walk through the process of making predictions while verbalizing their thoughts along the way.
  4. [Verbal] Probe into the details observed: what they did, where they went, what they used, and the goal at each step.
  5. [Outro] Invite any other comments.

Data Analysis

Thematic Analysis - Identifying Patterns & Themes in Qualitative Data

Affinity Mapping - Clustering User Insights by Behavior & Motivation

User Journey Mapping - Tracking Steps in Completing a Job & Identifying Friction Points


Key Insights

1. Improve and automate data pipelines, with a dashboard reflecting data health.

Job - Trust the data before modeling

Motivation: Users need confidence that the data they work with is healthy, and they need to establish that confidence efficiently before modeling.

Pain point: Data scientists spend almost 70% of their time fixing mismatched data from EHRs and claims systems, often feeling anxious about data quality.
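
To make this recommendation concrete, here is a minimal, hypothetical sketch of the kind of automated data health check a dashboard could surface. The column names (patient_id, claim_amount) and the specific checks are illustrative assumptions, not details of the actual platform.

```python
import pandas as pd

def data_health_report(ehr: pd.DataFrame, claims: pd.DataFrame) -> dict:
    """Summarize basic data-health signals for a dashboard.

    Hypothetical sketch: column names and checks are assumptions,
    not the platform's actual pipeline.
    """
    ehr_ids = set(ehr["patient_id"])
    claims_ids = set(claims["patient_id"])

    return {
        # Share of claims whose patient_id has no matching EHR record
        "unmatched_claims_pct": 100 * len(claims_ids - ehr_ids) / max(len(claims_ids), 1),
        # Missing values per column in the claims extract
        "claims_missing_pct": (claims.isna().mean() * 100).round(1).to_dict(),
        # Duplicate patient records in the EHR extract
        "ehr_duplicate_rows": int(ehr.duplicated(subset="patient_id").sum()),
        # Claims with non-positive amounts, a common data-entry error
        "invalid_claim_amounts": int((claims["claim_amount"] <= 0).sum()),
    }
```

Running a check like this on every data refresh would let the dashboard show data health before modeling begins, instead of data scientists discovering problems by hand.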

2. Embed explanations into model outputs to increase interpretability and explainability, translating technical predictions into actionable business narratives.

Job - Explain the data

Motivation: Users need to persuade stakeholders (e.g., executives, regulators) that premiums are both fair and profitable.

Pain point: Reporting to stakeholders is an important part of actuaries' workflow, and unexplained models erode trust. Stakeholders demand transparency to ensure fairness, compliance, and accountability.
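
As a rough illustration of what embedding explanations could look like, the sketch below pairs a prediction with its largest feature contributions and a plain-language sentence. It assumes a simple linear premium model with made-up feature names and coefficients; the platform's actual models and vocabulary may differ.

```python
import numpy as np

# Hypothetical linear premium model: each feature's contribution is
# coefficient * value. Feature names and coefficients are made up.
FEATURES = ["age", "chronic_conditions", "prior_claims"]
COEFFICIENTS = np.array([12.0, 240.0, 85.0])
BASE_PREMIUM = 180.0

def explain_prediction(values: np.ndarray, top_k: int = 2) -> str:
    contributions = COEFFICIENTS * values
    premium = BASE_PREMIUM + contributions.sum()

    # Rank features by how much they moved the premium, largest first
    order = np.argsort(-np.abs(contributions))[:top_k]
    reasons = ", ".join(
        f"{FEATURES[i]} ({contributions[i]:+,.0f})" for i in order
    )
    return (
        f"Predicted annual premium: ${premium:,.0f}. "
        f"Largest drivers: {reasons}."
    )

print(explain_prediction(np.array([54, 2, 1])))
# Predicted annual premium: $1,393. Largest drivers: age (+648), chronic_conditions (+480).
```

Attaching a sentence like this to each prediction is one way to support the transparent, stakeholder-ready narrative this job calls for, without asking actuaries to reverse-engineer the model.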

3. Introduce collaboration hubs that enable seamless cross-functional work and feedback loops.

Job - Collaborate cross-functionally

Motivation: Users refine premium predictions with cross-functional teams; they need to share results, align assumptions, and track changes so they can make decisions faster, without errors from miscommunication or version conflicts.

Pain point: “We email Excel files back and forth and hope nothing breaks.”

Impact

  1. Strategic product roadmap prioritization - Insights revealed that target users prioritize data trust and explainability, shifting the roadmap from consulting-led support toward self-serve workflows.
  2. User-centric design foundation - Insights surfaced role-specific user needs, reducing guesswork and future rework.
  3. Foundation for human-centered AI design - Insights showed that AI should augment human expertise, not replace judgment, laying a solid foundation for future AI work: AI’s success isn’t measured by how smart it seems, but by how well it serves the humans who rely on it.

Reflection

The Jobs To Be Done (JTBD) framework worked well: prioritizing user jobs over feature lists kept the team aligned on high-impact solutions. As a follow-up, I would propose a quantitative study measuring users' perceived importance of each job to inform a phased rollout of new features and support progressive adoption.