The aim was to improve Revel, Pearson's 'read a little and do a little' learning tool, by designing an enhanced instructor experience with flexible assignment management. We also created performance analytics that helped instructors make data-informed decisions, such as how to prepare for class and which students needed extra help.
The discovery phase of this project taught us that instructors were overwhelmed by the rapidly evolving technology and systems of the modern classroom. It quickly became clear that the solution was to distill information into actionable insights that helped instructors monitor grades while understanding how to engage students effectively.
Revel now offers an updated instructor experience grounded in classroom research and direct feedback from instructors throughout the design process. The platform provides advanced personalization and adaptive learning systems that improve educational outcomes in real time and help instructors identify opportunities for early intervention.
Revel has received positive feedback and improved outcome scores since our major update to the platform. Users have responded well to the updated features and redesign: Revel improved students' course grades and exam scores by 5% or more across various disciplines, and research studies have shown that students are more prepared.
Our team delivered an enhanced Revel experience, reimagined to promote better learning outcomes. We built it around a framework of welcoming visual elements and intuitive information architecture that provided structure.
Personalized teaching management experience shaped for instructors:
To streamline our process and directly meet instructors' needs, we used the Lean UX framework. For the first time, we implemented Lean methods and sprint-cycle ceremonies wherever possible within Pearson's organizational and cross-functional team dynamics.
We fostered iterative conversations and propelled progress with an emphasis on collaboration over documentation and deliverables. We focused on our customers' needs, discovered ways to meet them, and delivered against the product and engineering roadmap to scale the solution realistically.
At the beginning of each sprint, our team converged on a distinct hypothesis to test and validate, then prototyped at varying levels of design fidelity with instructors during the second week of sprint work. The hypothesis gave the whole team a precise focus for that sprint. For example:
"We believe that instructors will discern actionable insights from class performance with progressive views focused on recency and relevance."
On the last Friday of a sprint, we gathered with a broad group of stakeholders and partners to discuss outcomes and lessons learned. We reviewed the designs we had tested and whether the hypothesis had been validated. The feedback from our broader team on demo day, combined with what we learned from instructors, guided the focus of the next sprint.
This simple dialogue helped our team iterate on the top-priority work and develop greater confidence and a shared understanding of the solutions.
We relied heavily on a collaborative approach to the design decisions made throughout every phase of development. Each sprint, we worked hard to ensure remote teammates felt welcome in our collaboration sessions.
Once we fine-tuned these design studio sessions, ideas converged quickly, making it easy to gather feedback from instructors through user inquiry every single sprint.
Every two-week sprint, we accomplished the following:
When prototyping, we started at the lowest fidelity to test ideas and learn quickly. Paper kept the initial solutions adaptable and invited a broader spectrum of responses, helping us thoroughly understand the mindsets and mental models of the customers walking into the classroom.
I enjoyed starting with sketches on paper to clarify solutions and capture the essential context for discussions with users. This also provided the opportunity to iterate on many divergent ideas before converging with my teammates.
We allocated most of the first week to exploring numerous divergent paths and spent little time perfecting sketches. The two-week sprint cycles carried their own stressors: we were accountable for determining what to solve, how to solve it, and how best to build a testable prototype flow for eagle-eyed professors in less than two weeks. Time was a driving motivator.
The remainder of the first week was committed to outlining ideal test flows and developing a shared understanding of the prototype we aimed to test.
The team contributed idea outlines during timed design studio activities using rough sketches.
I'd evaluate the possibilities, then iterate with more in-depth details blocked out as paper prototypes with markers.
This is where we uncovered the flow that helped instructors work most efficiently.
Monday and Tuesday of the second week were devoted to polishing the test prototypes and capturing all of our assumptions in a script that our team used to guide the feedback conversations in user interviews with instructors.
The entire team listened in on the remotely moderated tests held on Wednesday and Thursday. By the end of the second day, we all had a shared understanding of validated assumptions along with modifications needed.
Sample review of significant insights uncovered with paper prototypes:
- Tell a clear story with the "Temperature Check" as a fundamental element of the flow
- Clarify and simplify data displayed, progressively reveal details as needed
- Highlight data most suitable to where instructors are in the course lifecycle
- View results and completion data in lists organized by student and by assignment
The designs emerged iteratively from a continuous loop of user feedback, which helped us deliver a personalized teaching experience grounded in realistic outcomes. Throughout the year, we interviewed instructors across many different disciplines, in both in-person and remote sessions, at many stages of agile discovery, UX research, and design delivery.
262 educators interviewed
30 user research studies
2 patents filed
It was important that instructors were presented with the details they most needed to act on. The grades view clearly flagged vital information about students' performance and content to review in upcoming class sessions.
The assessment detail pages highlighted class-wide performance and knowledge gaps, including questions that students found challenging based on 'correct on first-try' calculations. These deep-dive details concentrated on the overall score and assignment progress, alongside scannable learning outcomes.
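To make the 'correct on first-try' idea concrete, here is a minimal sketch of how such a rollup could work. This is a hypothetical illustration, not Revel's actual implementation: the Attempt record shape, the flagChallengingQuestions name, and the 0.6 threshold are all assumptions for the example.

```typescript
// Hypothetical attempt record; Revel's real data model is not public.
interface Attempt {
  studentId: string;
  questionId: string;
  attemptNumber: number; // 1 = first try
  correct: boolean;
}

// Computes, per question, the fraction of students who answered
// correctly on their first attempt, and flags questions that fall
// below the threshold as challenging content to review in class.
function flagChallengingQuestions(
  attempts: Attempt[],
  threshold = 0.6 // assumed cutoff, purely illustrative
): string[] {
  const firstTries = new Map<string, { correct: number; total: number }>();

  for (const a of attempts) {
    if (a.attemptNumber !== 1) continue; // only first attempts count
    const stats = firstTries.get(a.questionId) ?? { correct: 0, total: 0 };
    stats.total += 1;
    if (a.correct) stats.correct += 1;
    firstTries.set(a.questionId, stats);
  }

  return [...firstTries.entries()]
    .filter(([, s]) => s.correct / s.total < threshold)
    .map(([questionId]) => questionId);
}
```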
Throughout the experience, a direct messaging system allowed the instructor to reach out to particular students about their grades and submissions, including batch emails to struggling cohorts.
The reimagined gradebook surfaced essential grade statistics for each assignment. The improved flagging system highlighted student submissions that needed grading, along with challenging content to evaluate and possibly incorporate into a lesson during an upcoming class.
Visual design elements implemented throughout the process:
From start to finish, the inclusive and collaborative design methods enabled our team to study the problem inside and out, examine various ideas, fail quickly, and iterate to serve customers and the always-evolving classroom.
As cross-functional partners, we worked quickly and collaboratively to get something, anything, in front of the user for validation as early as possible.
This is how we strengthened and delivered innovations to help users identify:
Answering the question: "Who is struggling in class?"
Quickly identify which students are having a hard time with specific material, or others who aren't completing homework
We eliminated deliverable waste and steered toward the most basic MVP for testing, which fueled continuous discovery. Our rapid learning cycles equipped the team with the information to adapt and respond to change and deliver true value.
"It's good feedback if I want to ask a student 'what's up?' We want to contact students that are falling behind, so this would be extremely useful."
"Feels like a modern piece of technology. Not only is it data-driven, but it's user friendly. The interface is clearly direct, so it's not intimidating."
Throughout the design process, our team went by 'HMS Belafonte,' and many collaboration sessions were themed around the intrepid spirit Steve Zissou exuded throughout 'The Life Aquatic.' Below is a sampling of sprint and design studio decks.