
Case Study: eMentor

Designing an “Effortless” Mentorship Experience - UX Experience Architecture and Roadmap

Why this project mattered
Mentorship is one of those things that sounds simple… until you try to run it inside a real organization. Matching people takes time. Adoption is fragile. And if onboarding feels confusing, the program dies quietly.
The stakes are real: mentoring is strongly linked to retention and engagement in the workplace. Research summaries have repeatedly pointed to higher retention among mentors and mentees versus non-participants, and mentoring platforms keep growing because companies want programs that scale.
So the mission was clear: make mentorship feel easy — without dumbing it down.
 
How we saw things in the beginning 

  • My role: UX Architect (experience strategy + UX structure)

  • What I produced: UX roadmap, lifecycle + journey architecture, principles, navigation strategy, onboarding strategy, feature model

  • Core problem: Users were being asked to do “extra work” (setup, profile, matching, goals) without clear guidance or momentum

  • North star: Effortless = clear path, fewer steps, trust, and the right info at the right time

  • Tools/methods: Workshops, journey mapping, persona modeling, concept definition, prototype-ready requirements

 
The situation
eMentor is a mentoring platform with multiple user types, and they don’t all care about the same thing:

  • Mentees want growth without friction

  • Admins want program outcomes and less manual work

  • Stakeholders/procurement want proof, clarity, and confidence before buying

The platform was already functioning, but the experience was not “self-explanatory.” People needed too much help, and the journey didn’t guide users toward success.
 
The challenge for the design team
How do we make mentoring feel effortless — while still collecting the right info to match people well and track progress?
This wasn’t just UI polish. This was experience architecture:

  • What happens before login?

  • What’s the first moment of value?

  • How do we reduce drop-off?

  • How do we make matching feel trustworthy?

  • How do we scale this across programs without rebuilding everything?

 
Who we designed for
I built persona-driven requirements so the roadmap didn’t become opinion-driven.
Primary personas (simplified)

  • Mentee (Sarah): wants to get a mentor, grow skills, and stay motivated. What breaks trust / causes drop-off: too many steps, unclear next steps, awkward matching/rejection.

  • Admin (Sam): wants to create matches, monitor progress, and report success. What breaks trust / causes drop-off: manual workflows, weak signals, hard-to-measure outcomes.

  • Stakeholder (Ashley): wants to decide if this is worth it and get buy-in. What breaks trust / causes drop-off: long demos, value that isn’t clear fast, missing “must-have” features.
Ashley is the type of decision-maker who needs the platform to feel structured, guided, and measurable.
 
What we inherited (current journey pain points)
Here’s what was happening in the existing journey (in plain English):

  • No self-enrollment — users can’t even start unless invited

  • Forced matches sometimes happen during onboarding/training

  • Users don’t know what to do next once they hit the dashboard

  • Core tasks require bouncing across multiple areas (3+ places to complete one goal)

  • Matching wasn’t “delightful” and didn’t build momentum

  • Support became reactive (people ask for help instead of the product guiding them)

This is the exact kind of friction that causes programs to fail, because mentoring only works when people keep showing up. 
 
Let’s lower the user’s effort: we can make this effortless
I defined “effortless” in a way that design, product, and stakeholders could all align around.
Effortless means:

  • Show you care: users should feel like their goals matter

  • Make my path clear: guide me to the right next step

  • Make my tasks simple: remove extra effort and decision fatigue

  • Guide my journey: give me the right info when I need it

  • Be my advocate: proactive support, not just help docs

  • Personalize: make it feel tailored, not generic

This aligns with established service design thinking: reduce steps, remove dead ends, communicate clearly, and treat user data with care. 
 
Our approach (how we handled product management for the first time)
I used a roadmap flow that moves teams from “opinions” to “decisions”:

  1. Align on the current state

  2. Explore the ideal experience

  3. Ideate concepts + testability

  4. Specify the moments that drive satisfaction

  5. Align all workflows, KPIs, and stakeholder expectations, recording and tracking them daily, weekly, and quarterly

  6. Optimize our analytic systems to record and track data points and scenarios of interest

  7. Coordinate with the dev team and stakeholders from phase to phase

 
 
The solution
1) A clearer user lifecycle (so the product “knows” where you are)
Here is a snapshot of our lifecycle:

  • Aware 

  • Explore  

  • Establish relationship 

  • Configure  

  • Purchase  

  • Partner  

  • Service  

  • Guide through events

That lifecycle became the structure for:

  • onboarding steps

  • what the dashboard shows

  • what notifications say

  • what “success” looks like at each stage
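To make this concrete, here is a minimal sketch of how a lifecycle like this could be modeled so the product always knows a user’s stage and what to surface next. The stage names come from the roadmap above; every “next step” string and the function name `next_step` are hypothetical illustrations, not actual product copy.

```python
from enum import Enum

class Stage(Enum):
    """The eight lifecycle stages from the roadmap above."""
    AWARE = "aware"
    EXPLORE = "explore"
    ESTABLISH_RELATIONSHIP = "establish_relationship"
    CONFIGURE = "configure"
    PURCHASE = "purchase"
    PARTNER = "partner"
    SERVICE = "service"
    GUIDE_THROUGH_EVENTS = "guide_through_events"

# Hypothetical copy: what the dashboard/notifications might surface at each
# stage. None of these strings come from the actual product.
NEXT_STEP = {
    Stage.AWARE: "Show the program overview and value story",
    Stage.EXPLORE: "Offer a guided tour and sample match cards",
    Stage.ESTABLISH_RELATIONSHIP: "Introduce the team and success examples",
    Stage.CONFIGURE: "Prompt profile setup: topics, goals, availability",
    Stage.PURCHASE: "Summarize plan, scope, and must-have features",
    Stage.PARTNER: "Kick off onboarding and the first-match checklist",
    Stage.SERVICE: "Surface proactive help and program resources",
    Stage.GUIDE_THROUGH_EVENTS: "Nudge toward upcoming Xchanges and surveys",
}

def next_step(stage: Stage) -> str:
    """What the product should say or show right now, given the stage."""
    return NEXT_STEP[stage]
```

The value of a model like this is that onboarding, dashboard content, and notifications all read from one source of truth instead of each guessing where the user is.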

 
2) Navigation that supports repeated use
Mentorship is not a one-time task. People come back weekly. So navigation needed to support repeat behavior without confusion.
I proposed a navigation model centered on:

  • Matches (the core relationship hub)

  • My Journey (profile + enrollment info + progress)

  • Xchanges (structured interactions, topics, goals)

  • Resources (program objective content)

  • Surveys/Journal/Analytics (reflection + outcomes)

This lines up with guidance used in complex services: navigation should support multi-task, repeat users — but the journey should still be simplified first.
3) Onboarding that teaches without overwhelming
This was a big one. If onboarding fails, nothing else matters.
I designed onboarding as a system, not a screen.
Three onboarding styles (used intentionally)

  • Annotated (tooltips / hotspots)

  • Embedded (modals, checklists, banners at the right moment)

  • Dedicated (distraction-free setup flow for higher-effort tasks)

This matches modern best practices in onboarding and feature discovery: teach in small moments, control timing, and avoid blasting users with everything at once. Apple’s own developer guidance on writing and feature discovery reinforces this idea: clear language and well-timed education make products easier to adopt.
What first-time onboarding covered (example)

  • Confirm role (mentor vs mentee vs admin)

  • Select topics + goals (so matching has meaning)

  • Profile setup (photo/avatar, summary, preferences)

  • First match preview + “what to do next” checklist
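As a rough sketch of how the three onboarding styles could be assigned, here is an illustrative heuristic keyed on task effort. The thresholds and the function `onboarding_style` are my own hypothetical simplification; in the actual roadmap the style was chosen per feature by design judgment, not by formula.

```python
def onboarding_style(steps: int, in_context_hint: bool) -> str:
    """Illustrative rule of thumb for picking an onboarding style.

    steps: how many actions the task asks of the user.
    in_context_hint: whether a lightweight pointer at the UI itself
    would be enough to teach the moment.
    """
    if steps >= 5:
        return "dedicated"  # distraction-free setup flow for high-effort tasks
    if in_context_hint:
        return "annotated"  # tooltip/hotspot layered onto the existing screen
    return "embedded"       # modal, checklist, or banner at the right moment
```

Used this way, profile setup lands in a dedicated flow, while a new feature callout stays as a tooltip instead of interrupting the user.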

 
4) Matching that feels trustworthy and scalable
Matching is the heartbeat of the product. So I structured features around reducing “match anxiety” and increasing confidence.
Matching system components 

  • Match setup wizard (guided criteria instead of guessing)

  • Match cards with clear summaries and confidence signals

  • Personality assessment (optional, but structured)

  • Recommendations (complementary areas, suggested topics)

  • Controls (limits for Xchanges, availability toggles)

  • Human-friendly rejection/denial language (so it doesn’t feel harsh)

Where we needed to introduce complexity, we used a “reveal it when needed” approach (progressive disclosure) so users don’t see the whole machine at once.
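To illustrate the kind of “confidence signal” a match card could show, here is a minimal overlap-based scoring sketch. The weights, thresholds, and labels are invented for illustration; the real matching framework weighs far more (availability, limits, personality assessment) and none of this code reflects eMentor’s actual algorithm.

```python
def match_score(mentee_topics, mentor_topics, mentee_goals, mentor_strengths):
    """Illustrative score in [0, 1] from topic and goal overlap only."""
    topic_overlap = set(mentee_topics) & set(mentor_topics)
    goal_overlap = set(mentee_goals) & set(mentor_strengths)
    topic_part = len(topic_overlap) / max(len(set(mentee_topics)), 1)
    goal_part = len(goal_overlap) / max(len(set(mentee_goals)), 1)
    # Weights are arbitrary placeholders, not tuned values.
    return round(0.6 * topic_part + 0.4 * goal_part, 2)

def confidence_label(score: float) -> str:
    """Human-friendly label for a match card (thresholds are illustrative)."""
    if score >= 0.75:
        return "Strong match"
    if score >= 0.4:
        return "Good match"
    return "Exploratory match"
```

Even a toy model like this shows the UX point: surfacing a plain-language label ("Good match") plus the shared topics behind it builds more trust than an unexplained pairing.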
5) Progress tracking that leaders can actually report
Admins and stakeholders need outcomes. So the roadmap builds in measurement thinking, not just UI.
Examples of trackable KPIs 

  • hours spent in mentoring activity

  • contacts made

  • meetings scheduled

  • profile completion rate

  • participation streaks

  • survey completion

  • goal progress

We also connected success back to real program goals (retention, promotion readiness, engagement). Mentoring is often funded based on these outcomes, so the product UX must support them.
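The KPIs above can be instrumented with a very small event model. The sketch below, including the class name `KpiTracker` and all event names, is a hypothetical illustration of the measurement model, not eMentor’s analytics code.

```python
from collections import defaultdict

class KpiTracker:
    """Toy event store: one list of (event, value) pairs per user."""

    def __init__(self):
        self.events = defaultdict(list)

    def record(self, user_id: str, event: str, value: float = 1):
        """Log one event, e.g. hours mentored or a completed survey."""
        self.events[user_id].append((event, value))

    def total(self, event: str) -> float:
        """Program-wide total for one event type (e.g. meetings scheduled)."""
        return sum(v for evs in self.events.values()
                   for e, v in evs if e == event)

    def profile_completion_rate(self, all_users) -> float:
        """Share of users who fired a 'profile_complete' event."""
        done = sum(1 for u in all_users
                   if any(e == "profile_complete" for e, _ in self.events[u]))
        return done / len(all_users) if all_users else 0.0
```

The design point is that every KPI on the admin dashboard reduces to counting or summing named events, which is exactly what makes outcomes reportable instead of anecdotal.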
 
6) Motivation without gimmicks
Mentoring platforms die when they feel like homework.
So I included lightweight motivation mechanics:

  • badges (earned through meaningful actions)

  • recognition moments

  • “quick actions” to reduce effort

  • simple nudges to increase usage over time

The key was incentives tied to real progress, not random gamification.
 
7) Integrations to reduce extra effort
To reduce “one more place I have to go,” the roadmap included:

  • Zoom / Slack integrations

  • SSO / profile imports (where appropriate)

  • communication center support

This supports the “less effort” principle that good services aim for: users should do as few things as possible, and the system should feel familiar. 
 
I didn’t just design screens; I designed the system:

  • lifecycle model

  • experience principles

  • navigation logic

  • onboarding strategy

  • matching framework

  • measurement model

  • scalability across programs

I designed for trust / clarity:

  • matching transparency

  • respectful language

  • guided steps

  • no dead ends

That mindset is directly aligned with how high-performing product teams define “good”: clear purpose, few steps, consistent experience, and respectful handling of user information. 
 
Outcomes (what this roadmap enabled)
I’m keeping this honest: this deliverable was an experience roadmap + architecture plan, so the “results” are what it unlocked:

  • A shared definition of “effortless” that teams could design against

  • A journey structure that reduced random feature building

  • A clear onboarding strategy (annotated + embedded + dedicated)

  • A matching framework that could scale across programs

  • A measurable model for admin/stakeholder reporting

  • A roadmap-ready feature set organized by user value (not by internal opinion)

 
What I’d do next (if I owned the next phase)
If I were driving this into build and iteration, I’d run:

  1. Concept testing on onboarding + matching (quick prototypes)

  2. Instrumentation plan for:

    • activation rate (invite → profile complete)

    • time to first match

    • match acceptance rate

    • time to first meeting

    • 30/60/90-day engagement

  3. Message testing for the most sensitive moments:

    • denial/rejection language

    • “why we recommend this match” explanations

  4. A “success dashboard” for admins:

    • what’s working, what’s stuck, who needs a nudge
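Two of the funnel metrics in the instrumentation plan can be sketched directly. The function names and sample data below are hypothetical; this only shows how the metrics would be computed from event timestamps once the product is instrumented.

```python
from datetime import datetime

def activation_rate(invited: set, completed_profile: set) -> float:
    """Invite -> profile complete: share of invited users who finished setup."""
    return len(invited & completed_profile) / len(invited) if invited else 0.0

def median_days_to_first_match(invite_at: dict, first_match_at: dict):
    """Median days from invite to first accepted match, over users with both events."""
    deltas = sorted((first_match_at[u] - invite_at[u]).days
                    for u in first_match_at if u in invite_at)
    if not deltas:
        return None
    mid = len(deltas) // 2
    if len(deltas) % 2:
        return deltas[mid]
    return (deltas[mid - 1] + deltas[mid]) / 2
```

A median (rather than a mean) keeps one stalled match from masking how fast the typical user reaches value.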

 
What I learned
Mentorship platforms don’t fail because they lack features.
They fail because users don’t feel guided.
So the work wasn’t “add more.”
It was: make it clear, make it simple, make it personal, and make it measurable.
