In most math courses, concepts are taught to an average student profile: instruction is not personalized, and everyone receives the same content regardless of their level of comprehension. We aimed to measure college-level remedial math students' proficiency against course objectives and develop personalized learning plans and assessments.
Code-name Project RIO was a developmental math product designed to help 2-year college students learn vital math skills using adaptive technology. From a design process perspective, the main focus was the working triad between Product, Engineering, and Design. This project was the first time we'd worked closely as partners in early agile discovery phases.
This learning experience was a new product for developmental math students, designed to deliver deeply personalized content. Students controlled the pace at which they learned based on their answers to specific questions. Ultimately, students felt more confident because they focused only on topics they didn't yet know, and they worked through the material with less stress.
Developed diagnostic testing software that was well received by instructors; they expressed their intention to incorporate it into their teaching practices. This was evident from the high TAM score of 4.55 across all tests. Positive feedback from pilot programs indicated a similar level of acceptance, and improved agile processes enabled rapid validation.
Teammates
Educators interviewed
User research studies
Avg. TAM Rating
Learners experienced a personalized instruction journey, starting with a pre-assessment. Course content was tailored to what each student needed to work on to master a concept or skill. Student performance drove the adaptive practice within each module, providing targeted support for each individual.
Code-name Project RIO introduced advanced performance reporting and provided at-risk learner insights so instructors could support students before it was too late. Instructors could set a distinct 'Proficiency Threshold.'
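As a rough illustration only, flagging at-risk learners against an instructor-set proficiency threshold could look something like the sketch below. The type names, field shapes, and default values are assumptions for the example, not Project RIO's actual implementation.

```typescript
// Illustrative sketch only: hypothetical types and thresholds, not the product's actual code.
interface StudentProgress {
  studentId: string;
  name: string;
  proficiency: number;     // 0–1 mastery estimate from the adaptive system
  lastActiveDays: number;  // days since the student last worked in the tool
}

interface AtRiskAlert {
  studentId: string;
  name: string;
  reasons: string[];
}

// Instructor-controlled settings; the defaults here are assumptions for the example.
function findAtRiskStudents(
  students: StudentProgress[],
  proficiencyThreshold = 0.7,
  inactivityDays = 7,
): AtRiskAlert[] {
  return students
    .map((s) => {
      const reasons: string[] = [];
      if (s.proficiency < proficiencyThreshold) {
        reasons.push(`Proficiency ${Math.round(s.proficiency * 100)}% is below the threshold`);
      }
      if (s.lastActiveDays > inactivityDays) {
        reasons.push(`No activity in ${s.lastActiveDays} days`);
      }
      return { studentId: s.studentId, name: s.name, reasons };
    })
    .filter((alert) => alert.reasons.length > 0);
}
```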
After interacting with our performance dashboard, an overwhelming majority of instructors said they wanted the adaptive developmental math experience implemented in their classrooms as soon as possible.
The learning experience delivered personalized content, recognizing where students excelled and where they struggled so it could remediate at the moment, and in the precise way, each student needed, thereby accelerating proficiency.
Innovative product and systems-thinking approaches helped advance personalized and adaptive learning systems to improve learning outcomes. Within the initial innovation cycle, we enhanced the learning experience by refining instructors' interactions with data visualizations and simplifying alerts, leading to measurable improvements in usability metrics.
Micro-interactions that took place as a student worked through a problem ensured students got the right help at the right time, to foster motivation and success.
The course pre-assessment gave insight into precisely which topics students had already mastered and which they had yet to learn. Based on the pre-assessment, content was then tailored to each student.
As students reviewed streamlined instruction for each objective they needed to learn, practice adapted to each student's level of understanding, providing scaffolding for those who needed it and fading learning aids so students could demonstrate true understanding across learning objectives.
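To make the fading idea concrete, here is a minimal sketch of how scaffolding might step down as a student's recent accuracy improves. It assumes a simple rolling-accuracy heuristic; the level names and cutoffs are hypothetical, not the adaptive model the product used.

```typescript
// Minimal sketch of scaffold fading under a rolling-accuracy assumption.
type ScaffoldLevel = "worked-example" | "hints-on" | "hints-off";

function nextScaffoldLevel(recentAnswers: boolean[]): ScaffoldLevel {
  // Look at the last five responses for this learning objective.
  const window = recentAnswers.slice(-5);
  const correct = window.filter(Boolean).length;
  const accuracy = window.length ? correct / window.length : 0;

  if (accuracy < 0.4) return "worked-example"; // struggling: show full scaffolding
  if (accuracy < 0.8) return "hints-on";       // progressing: keep learning aids available
  return "hints-off";                          // ready to demonstrate true understanding
}
```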
The developmental math courseware aimed to help students progress through developmental math content efficiently and successfully build upon vital concepts so they could move on to credit-bearing math courses. The tool identified where students struggled based on their answers and inputs, then delivered a personalized learning path with adaptive practice for each student.
The learning tool provided streamlined, personalized instruction and practice through lab-led software. The tool encouraged acceleration and fit into any course format where instructors wanted students learning at their own pace.
Product Differentiators:
The adaptive pre-assessment focused on reporting results at student and class levels. RIO aimed to help faculty improve developmental math retention and completion rates. Sixty percent of community college students were referred to developmental math, yet only a third of those students went on to finish.
We kicked off the project with a co-design session with community college developmental math instructors. We sat down to better understand what impeded the teaching experience from a class-setup standpoint.
We also aimed to understand, from instructors' perspectives, how students learn best in the modern classroom. The sessions were split between back-and-forth interview questions and collaborative design tasks meant to address and solve big issues.
- Start with a temperature check and set the benchmark
- Understand what to spend time on and when in class
- Identify low performers so they can intervene earlier
- Group high and low performers so they can help each other
- Review with students and create individualized study plans
The research sessions opened with a 40-minute interview that informed the co-design exercise, where I built on screen what instructors wanted to see in a performance dashboard. This helped us better understand what truly mattered when assessing the class quickly. We kept the back-and-forth fluid to create an open space for rapid validation, including corrections from the instructor and on-the-fly iterations based on what they were seeing. Co-design findings:
We presented instructors with details about students who struggled with specific content to cover in class, or students who hadn't completed designated work. The goal was to tell instructors what was going on and surface information they could act on.
These navigation and data-point presentation options were good examples of collaborating with users before bringing multiple options, at varying scales, to partners across product and engineering to confirm feasibility, including possible API contracts for delivering data points from the adaptive system.
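For illustration, a data-point delivery contract from the adaptive system might be shaped roughly like the sketch below. Every field name, the endpoint path, and the payload structure are assumptions made for this example; the actual contract was negotiated with engineering and data science.

```typescript
// Hypothetical payload shape for performance data consumed by the dashboard.
interface ObjectiveResult {
  objectiveId: string;
  title: string;
  masteryEstimate: number;   // 0–1
  attempts: number;
  lastAttemptAt: string;     // ISO 8601 timestamp
}

interface StudentPerformancePayload {
  studentId: string;
  courseId: string;
  overallProficiency: number;  // 0–1, compared against the instructor's threshold
  objectives: ObjectiveResult[];
}

// Example endpoint the dashboard might call (path is an assumption).
async function fetchClassPerformance(courseId: string): Promise<StudentPerformancePayload[]> {
  const res = await fetch(`/api/courses/${courseId}/performance`);
  if (!res.ok) throw new Error(`Failed to load performance data: ${res.status}`);
  return res.json();
}
```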
It was important that instructors found the information and data clear and simple to use when assessing students' content needs. We worked with product, engineering, and data science leads to understand timelines for a simple MVP option, which we validated with instructors, and we left room to iterate and improve on the initial release.
Every open-ended feedback session was interesting because it reinforced how pressed for time these instructors truly were. The majority of instructors wanted some level of control but didn't have time to configure an endless page of settings they didn't understand. They also wanted insights and actions spelled out for them specifically, with the ability to dive deeper for more background information.
73
9
4.55
We further researched initial designs with instructors through one-click tests and task-completion flows, along with design-element preference split tests at the end of sessions.
These sessions were unmoderated, and it was always interesting to hear direct feedback from instructors once they opened up and felt safe talking through their thought process out loud.
It was essential to be thoughtful about interfacing with collaborators at the right moments in the design process. This visual was shared with stakeholders at sprint planning to build a shared understanding ahead of upcoming usability testing sessions.
Through iterative testing, it became clear that multiple entry points to specific information were important. We didn't get this right out of the gate; during task-completion testing, instructors sometimes wanted to dig into detailed information from many different navigation points in the experience.
Diving into the details, I found through rapid prototyping that instructors needed to understand the class-wide basics:
- 'What student is this?' & 'How did they do?'
- Let me see a summary of the questions
- Let me see if they got it right or wrong
- Let me navigate to their answer and see the error type
"Overall, it's really nicely displayed. Easy to click around and drill into it the specific chapters to plan lessons."
We developed diagnostic testing software that was well received by instructors; they expressed their intention to incorporate it into their teaching practices. This was evident from the high Technology Acceptance Model (TAM) score of 4.55 across all tests. Positive feedback from the pilot indicated a similar level of acceptance. Our fast agile workflow required flexible, lean testing protocols and emphasized quantitative patterns.
Task success (Navigation, Discovery, etc.) was 100%
These data points were clear wins by any task-success benchmark. Instructors rated how usable and useful the product was; the usability TAM score on the last end-to-end test was 4.50, well above the industry-accepted benchmark of 3.00 that predicts adoption. At the end of the cycle, we had designed dynamic learning journeys that equipped instructors with the tools they needed to teach better.
TAM Score Average across all tests was 4.55
"It's very easy to see who needs help and who I would have a conversation with. It's also helpful to see the areas where I could shave off instruction time for high performing chapters."
Many instructors told us they spent substantial time trying to identify students who weren't proficient in fundamental skills like basic arithmetic, skills "they should have learned in second grade." Students brought these deficiencies all the way to college. More broadly, instructors wanted to assess skills that pre-dated, and were prerequisites for, the current course, whereas the diagnostic largely assessed proficiency in the content that would be covered in the developmental math class.
It was important to move from useful, usable results to innovative, workflow-changing insights. We had been successful at surfacing and displaying the metrics instructors relied on and found useful, but we came up with better, more innovative ways to meet the need behind them: instead of instructors sifting through last logins to see who had been working, we pushed actionable alerts.
"The information displayed w/in the tool is helpful and direct. The design is very visually appealing and it's easy to select a chapter and quickly assess where I should focus my time."
We ran an end-of-year retrospective to identify possible areas of improvement moving into future iterations and evolutions.
Clear themes emerged across our retro cards, focused on improving overall planning for effectiveness and efficiency through transparent communication. That said, there were also threads about design maturity.
We all acknowledged that collaboration within our working relationship, and in our interactions with dev and product, had been stronger than on past projects, while also identifying additional tweaks to our design practice and dev process to deliver useful, enjoyable experiences for instructors and to increase scores.
Looking back over the cycle, we accelerated the learning and validation of ideas through rapid iteration, resulting in significant time and cost savings. At the same time, we ensured customer needs were met by delivering incremental value to the market, and we shipped evidence-based, ROI-driven feature enhancements and user-focused satisfaction improvements discovered in testing sessions, iterating on them over time to meet user feedback and needs.
I started each day with a creative outlet exercise: working our team name into an iconic album cover. I'd then post it to Slack with the plan of the day before standup.
It was a great way to get into a creative flow to start the day and engage the team with something new every single morning.