This was a four-month project where we built a system to help our team keep track of their students' progress in the Pathrise program. The team consisted of myself, a senior designer, and our lead engineer. This case study focuses on user research, ideation, design, testing, and iteration.


TLDR

  • Defined the problem by conducting user interviews, doing competitive analysis, and creating user personas.

  • Built system requirements, wireframes, and high-fidelity prototypes.

  • Conducted usability testing and user interviews to gather feedback.

  • Iterated on the designs based on feedback, through to implementation.

  • The feature improved platform engagement and reduced stress for mentors and students.

My initial audit revealed a system that wasn't serving our students or mentors

Our existing task assignment system lacked functionality, and tasks were difficult to track across the team.


During each session, career mentors would jot down “to-dos” for their students. Some typed them by hand, some used personal snippets, and others copy-pasted from process documents. Some mentors assigned one goal per week, some ten. There was zero consistency.


Our team also experienced many process changes over the previous six months, making goal assignment more complex.



This was our previous goal assignment tool. It was built into the notes section for each student, and goals were manually added during each coaching session.



Initial explorations & research

We explored and researched different directions for displaying student progress. I started by interviewing 6 career mentors to understand how they track student progress and growth.


My research focus was initially on career mentors, but I also did spot-check interviews with students. We asked questions like: How do you assign tasks and goals? How do you assess student progress? These are some of our high-level findings:

  • Overwhelm. Career mentors felt overwhelmed by the countless process documents they had to gather tasks from.

  • Engagement. Some mentors were more engaged with setting goals and using the internal platform to log goals than others (this became a fun side quest!)

  • Lack of time. Mentors forgot to log tasks because our sessions were only 20 minutes long.


User interviews were key to discovering multiple failure points


Ben was a high performer and kept most of his to-dos in his calendar. Here, he is showing me how he organizes his workflow.

Most importantly, I learned that mentors were working with their students in very different ways. Some were much more connection- and relationship-focused, using their sessions to build students up and tracking emotional progress. Others were much more data- and goals-focused, using each session to track quantifiable data points and progress.


This difference between heart-led and data-led approaches was a major insight for our team.



A deeper dive into additional user interviews showed a need for custom tasks, prioritization, and automated tasks.



Persona development allowed me to correlate mentor performance and level of system engagement

I created career mentor personas to help guide my research, creating an intersection of performance and engagement with the current system. I was able to collect data from Metabase as well as interview career team managers to confirm my hypothesis that power users of the platform were typically higher performers.


These personas were used to find balance in the design so as not to overwhelm low performers and to most effectively replace our high performers’ current task management systems. 
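To pressure-test the persona hypothesis, I leaned on the exported Metabase data. As a rough illustration of that analysis, here is a minimal TypeScript sketch that correlates platform engagement with performance; the field names (weeklyLogins, placementRate) are assumptions for this example, not our actual schema.

```typescript
// Illustrative only: field names are assumed, not the real Metabase schema.
interface MentorStats {
  mentorId: string;
  weeklyLogins: number;   // proxy for platform engagement
  placementRate: number;  // proxy for mentor performance
}

// Pearson correlation between two equal-length series.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const mean = (v: number[]) => v.reduce((a, b) => a + b, 0) / n;
  const mx = mean(xs);
  const my = mean(ys);
  let cov = 0, vx = 0, vy = 0;
  for (let i = 0; i < n; i++) {
    cov += (xs[i] - mx) * (ys[i] - my);
    vx += (xs[i] - mx) ** 2;
    vy += (ys[i] - my) ** 2;
  }
  return cov / Math.sqrt(vx * vy);
}

// Made-up rows standing in for a Metabase CSV export.
const rows: MentorStats[] = [
  { mentorId: "m1", weeklyLogins: 12, placementRate: 0.80 },
  { mentorId: "m2", weeklyLogins: 3,  placementRate: 0.45 },
  { mentorId: "m3", weeklyLogins: 9,  placementRate: 0.70 },
];

// A strongly positive value is consistent with "power users perform better".
console.log(pearson(rows.map(r => r.weeklyLogins), rows.map(r => r.placementRate)));
```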



Managers noted the ability to take feedback, a positive attitude, and a passion for coaching as indicators of high performance. Pretty cool!


I built system requirements that painted a picture of ideal student/mentor health in the program

I developed an 8-week task list for both mentors and students. Each of these task sets had specific SLA requirements, and many required tool integrations, like scheduling in Calendly or uploading docs to Google Drive (a rough data model follows the list below).


The three main needs were:

  • Rating their students' skills

  • Completing resume, LinkedIn, and cold email reviews

  • Scheduling meetings
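As promised above, here is a rough sketch of how one of these weekly task sets could be modeled. The field names (owner, slaDays, integration) are illustrative assumptions, not the production schema:

```typescript
// Hypothetical data model for the 8-week task list; names are assumed.
type Owner = "mentor" | "student";
type Integration = "calendly" | "google_drive" | "none";

interface ProgramTask {
  week: number;             // 1 through 8
  owner: Owner;             // who is responsible for the task
  title: string;
  slaDays: number;          // days allowed to complete once assigned
  integration: Integration; // external tool the task links out to
}

// Example entries mirroring the three main needs listed above.
const week1Tasks: ProgramTask[] = [
  { week: 1, owner: "mentor",  title: "Rate student skills",      slaDays: 7, integration: "none" },
  { week: 1, owner: "mentor",  title: "Complete resume review",   slaDays: 5, integration: "google_drive" },
  { week: 1, owner: "student", title: "Schedule kickoff session", slaDays: 3, integration: "calendly" },
];
```

Carrying the SLA and integration on each task is one way to support the embedded links and due-date highlighting that came up later in testing.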


I started drawing up requirements and different possibilities to visualize the internal tool. I wanted to come up with a few different versions to test with users. First, I gathered feedback on my ideas from my senior designer and engineer before creating wireframes.




Exploring initial ideations through user interviews pointed to options for high-fidelity designs

I created two separate wireframes: one showcasing a single-column design, and the other a two-column design that distinguished between mentor and student tasks. I completed quick feedback reviews with a few career mentors, managers, and my senior product designer.



My one-column design provided a more linear view of tasks for the mentor and student.



My two-column design placed career mentor tasks on the left and student tasks on the right.

I created three high-fidelity design variations and tested them with multiple users

I conducted usability testing with 6 career mentors across 3 different designs. I asked 8 questions and timed their responses to identify areas of opportunity in each design (a sketch of the timing analysis follows the list below).


The three designs included:

  1. Single column

  2. Double column

  3. Collapsible segments (I wanted to test whether a collapsible design helped reduce mentors' cognitive load).
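As noted above, the timing analysis itself was simple: average time-on-task per design. A minimal sketch with invented numbers:

```typescript
// Sketch: average response time per design variant; data is made up.
type Design = "single" | "double" | "collapsible";

interface Trial {
  design: Design;
  seconds: number; // time to answer one question
}

const trials: Trial[] = [
  { design: "single", seconds: 14 },
  { design: "double", seconds: 22 },
  { design: "collapsible", seconds: 16 },
  // ...one entry per question per mentor
];

// Group timings by design, then report the mean for each.
const byDesign = new Map<Design, number[]>();
for (const t of trials) {
  byDesign.set(t.design, [...(byDesign.get(t.design) ?? []), t.seconds]);
}
for (const [design, times] of byDesign) {
  const avg = times.reduce((a, b) => a + b, 0) / times.length;
  console.log(`${design}: ${avg.toFixed(1)}s average`);
}
```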



Usability testing was challenging due to first-design bias. I had to rotate questions and designs from user to user in order to get more balanced responses, as sketched below.
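Concretely, the rotation amounted to shifting the order designs were shown, Latin-square style, so each design led an equal share of sessions. A small sketch of that idea:

```typescript
// Rotate the design order per participant to counter first-design bias.
const designs = ["single", "double", "collapsible"] as const;

function orderFor(participant: number): string[] {
  const shift = participant % designs.length;
  return [...designs.slice(shift), ...designs.slice(0, shift)];
}

// Participant 0 sees single first, participant 1 sees double first, and so on.
for (let p = 0; p < 6; p++) {
  console.log(`participant ${p}:`, orderFor(p).join(" → "));
}
```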


Users preferred the single-column and collapsible designs. However, I still wanted to test highlighting for tasks due next, because one of the main issues I found when interviewing fellows and career mentors was not knowing what to do "next" when facing a list of tasks to complete.
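Mechanically, a "due next" highlight can fall out of a simple rule: surface the earliest-due incomplete task. A hedged sketch, with assumed field names rather than our actual data model:

```typescript
// Assumed shape for illustration; not the real task model.
interface Task {
  title: string;
  dueDate: Date;
  completed: boolean;
}

// Return the task the UI should highlight as "due next",
// or undefined when everything is complete.
function nextDue(tasks: Task[]): Task | undefined {
  return tasks
    .filter(t => !t.completed)
    .sort((a, b) => a.dueDate.getTime() - b.dueDate.getTime())[0];
}
```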

I tested highlighting variations due to overwhelming feedback around task prioritization

I conducted additional usability tests for highlighting during those same interviews with users. While we tested several highlighting options, I'll show the top two contenders. I also provided a "control" design with no color highlighting to establish a baseline for success.



I wanted to try our brand colors for highlighting, hence the purple. I also tried a dark grey neutral highlight color.



The next design iteration included all feedback from usability tests and user interviews

Usability tests led to several changes to the final design: a lighter highlighting system that still maintained brand colors, embedded links to ease completion of specific tasks, and custom tasks, which I advocated for with our engineering team.


Below, you can see the final design with all improvements made.



Custom tasks were something I fought for on behalf of our users, in order to maintain morale and improve initial engagement.


Tool adoption, new student engagement, and task completion all improved upon launch of the tool

We executed a rollout plan that involved training from management and walkthroughs of how to use the system led by me and engineering.


  • Adoption of the internal tool reached 70% within the first three months of launch

  • New students had 70% engagement, while those in the program for over 3 months had 40%

  • Career mentors saw a 30% improvement in task completion

  • Managers interviewed post-launch said it was easier to identify individual team members' pain points with task completion



My favorite success for this project was hearing from career mentors that their daily workflow was more structured and less chaotic because of this feature. <3


The path forward will include immediate syncing of new processes and adding new teams for an ecosystem effect

We want to continue exploring ways to make the task management tool even more robust. Ideas included incorporating more processes into the tool, as well as bringing other teams, like admissions and industry career mentors, into the system so it would feel more like an "all hands on deck" tool offering full cross-functional visibility into each student.