The aim was to improve Revel, the 'read a little and do a little' Pearson learning tool, by providing an enhanced instructor experience with flexible assignment management. We also created performance analytics to help instructors make data-informed decisions, such as how to prepare for the next class and which students need extra instructional support.
The discovery phase of this project taught us that instructors are overwhelmed by rapidly evolving technology and systems in the modern classroom. It quickly became clear that the solution was to distill information into smaller pieces that would help instructors monitor grades while understanding how to interact with students clearly and effectively.
Revel now offers an updated instructor experience, based on research in the classroom and feedback directly from instructors throughout the design process. The platform provides advanced personalization and adaptive learning systems that improve educational outcomes in real time and allow instructors to identify opportunities for early intervention.
Our team delivered an enhanced Revel experience that is refreshed and intuitive to promote better learning outcomes. We built around a framework of welcoming visual design elements, along with a simple information architecture, to provide structure.
Personalized teaching management experience for instructors:
To streamline our process and meet instructors' needs directly, we used the Lean UX framework. For the first time, we applied Lean methods and ceremonies where possible within the Pearson org and our team dynamics.
Collaboratively, we made iterative conversations and forward progress a priority over documentation and deliverables. We concentrated on the needs of our customers and found ways to meet them, all while defining and delivering on the product and engineering roadmap to realistically build the solution.
Fully grasping the in-classroom problems by spending time with instructors
At the beginning of each sprint, our team would converge on a distinct hypothesis to test and validate. Then we would prototype at varying levels of fidelity and test with instructors in the second week of the sprint.
The hypothesis afforded a precise focus for the particular sprint. For example: "We believe that instructors will discern actionable insights into student performance with progressive views focused on timeliness and relevance."
We relied heavily on a collaborative approach to the design decisions made throughout every phase of development. Each sprint, we worked hard to ensure remote teammates felt welcome in our collaboration sessions.
Once we fine-tuned these design studio sessions, it became clear that converging on ideas together checked a lot of boxes right out of the gate and made it easy to get feedback from instructors quickly through user inquiry every sprint.
Every two weeks, we accomplished the following:
When prototyping, we employ the lowest fidelity first to test our ideas and learn fast. Paper keeps the initial solutions adaptable and invites a broader range of responses, helping us thoroughly understand the mindset and mental models our customers bring into the classroom.
I enjoy starting with sketches on paper to clarify solutions and get the essential context for discussions with users. This also provides the opportunity to iterate on many divergent ideas before converging with teammates.
We spend the majority of the first week considering numerous divergent paths and little time making sketches perfect. "Each design is a proposed business solution. Your goal is to validate the proposed solution as efficiently as possible by using customer feedback," wrote Jeff Gothelf in Lean UX.
The remainder of the first week is committed to outlining ideal test flows and building a basic shared understanding of the prototype we're aiming to produce.
Our team as a whole provides many ideas in timed design studio activities using low-fidelity sketches.
I'd evaluate the possibilities, then iterate with more in-depth details on the whiteboard before blocking out paper prototypes with markers and coffee.
This is where we find the flow that works best for instructors
Monday and Tuesday of the second week are devoted to polishing the test prototypes and capturing all of our assumptions in a script that our researcher will use to guide the conversations in user interviews with instructors.
The entire team listens in on the remotely moderated tests held on Wednesday and Thursday. By the end of the second day, we all have a shared understanding of which assumptions were validated and what modifications are needed.
Sample review of significant insights uncovered with paper prototypes:
- Tell a clear story w/ "Temperature Check" as a fundamental element of the flow
- Clarify and simplify the data displayed, progressively revealing details as needed
- Highlight the data most relevant to where instructors are in the course lifecycle
- View results and completion data in lists organized by student and by assignment
Approaching design decisions as a team to keep moving toward solutions to real user issues
On the last Friday of a sprint, we're energized to discuss outcomes and lessons learned with a broad group of stakeholders and partners. We review the designs tested and their outcomes before agreeing on next steps.
The feedback from our broader team on demo day, taken in connection with what we learned from instructors, guides the focus of the next sprint. This simple dialogue helps our team iterate on the top priority work and develop greater confidence and shared understanding in our design solutions.
Creating together involves carving out space and freedom to collaborate routinely. Team members are seldom sitting at their desks, heads down, sweating through a problem alone. Instead, we establish a war room to centralize our focus on solving the small and large challenges that arise.
The designs emerged iteratively from a continuous loop of user feedback, helping deliver a personalized teaching experience grounded in realistic outcomes. Throughout the year, we interviewed instructors across many different disciplines, both in person and remotely, gathering feedback at many different stages of discovery research and design delivery.
262 Educators interviewed
30 User research studies
2 Patents pending
The advantage of pushing ourselves in a collaborative gathering to get a pulse from our users rapidly is an easy trade-off compared to the taxing stress of dreaming up killer ideas in a vacuum, with no measure of usability, and handing off waterfall designs to development. Maintaining a constant feedback loop with instructors as a team actually takes the weight off our shoulders. It frees our collective mind to pursue better solutions to REAL problems.
The sprint cycles come with their own stressors. We're accountable for determining what to solve, how to solve it, and building out a testable prototype flow for eagle-eyed college professors in a week. It can be fast-paced, but we're tackling problems head-on, and the impact is tangible. It's vital to strike the delicate balance between creative openness that encourages wild ideas and accountability that gets the team to put something on the page.
- Ensured direct validation from our partners and stakeholders
- Incorporated research during iterative validation stages
- Revealed workflows that work best for time-strapped instructors
- Addressed details that might not have surfaced until initial release
- Instilled confidence that we're meeting customers' future needs
It was imperative to present instructors with the details they need most. The grades view clearly flags vital information about student performance and content to possibly review in the next classroom session with students.
The assessment detail pages highlight class-wide performance and knowledge gaps, including questions that the students found challenging based on 'correct on first-try' calculations. These deep dive detail pages concentrate on overall score and assignment progress along with scannable learning outcomes.
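To make the 'correct on first-try' idea concrete, here is a minimal sketch in Python of how such a per-question rate might be derived from raw attempt records. This is an illustrative assumption, not Revel's actual implementation, and the field names are hypothetical.

```python
# Hypothetical sketch: compute the share of students who answered each
# question correctly on their first attempt. Low rates would surface as
# the class-wide knowledge gaps highlighted on the assessment detail pages.
from collections import defaultdict

def first_try_correct_rates(attempts):
    """attempts: list of dicts with 'student', 'question',
    'attempt_number', and 'correct' keys (assumed schema)."""
    # Keep only each student's first attempt per question.
    first_tries = {}
    for a in attempts:
        if a["attempt_number"] == 1:
            first_tries[(a["student"], a["question"])] = a["correct"]

    totals = defaultdict(int)
    correct = defaultdict(int)
    for (student, question), was_correct in first_tries.items():
        totals[question] += 1
        if was_correct:
            correct[question] += 1

    # Fraction of first attempts that were correct, per question.
    return {q: correct[q] / totals[q] for q in totals}
```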
We presented instructors with details about students who are struggling with specific content to cover in class or students who aren't completing designated work. These details are progressively offered at three different levels, including varied timeframe filters.
Throughout the experience, a direct messaging system allows the instructor to reach out to particular students about their grades and submissions - including batch emails to struggling cohorts.
The reimagined gradebook surfaces essential grade statistics associated with each assignment. The improved flagging system highlights student submissions that need to be graded along with challenging content to be evaluated and possibly incorporated into a lesson during an upcoming class.
Visual design implemented throughout the process:
From start to finish, the inclusive and collaborative design methods enabled our team to study the problem inside and out, examine various ideas, fail quickly, and iterate to serve the customer and the always-evolving classroom.
As cross-functional partners, we worked quickly and collaboratively to get something, anything, in front of the user for validation as early as possible.
This is how we strengthened and delivered innovations to help users identify:
Answering the question: "Who is struggling in class?"
Quickly identify which students are having a hard time with specific material or others who aren't completing homework
We eliminated deliverable waste and steered toward the most basic MVP for testing to fuel continuous discovery. Our rapid learning cycle equipped the team with the information to adapt and respond to change and provide true value.
"It's good feedback if I want to ask a student 'what's up?' We want to contact students that are falling behind, so this would be extremely useful."
"Feels like a modern piece of technology. Not only is it data-driven, but it's user friendly. The interface is clearly direct, so it's not intimidating."
Throughout the design process, our team was named 'HMS Belafonte,' and many collab sessions were themed around the intrepid spirit Steve Zissou exuded throughout 'The Life Aquatic.' Below is a sampling of our collab session decks.