Simplify our learning platform by giving teachers what they need, when they need it. We needed to put analytics and insights at the fore and show teachers the why behind the how.
Frustrated by usability issues, teachers frequently told us they were using our resources, but on a different platform, like Google Classroom. That meant lost user retention and, potentially, lost future sales.
User testing and feedback showed an overwhelming preference for our new approach, with some teachers calling it a game-changing experience.
The session organizer is an LMS tool that helps teachers plan their day. It gives them their core lesson material along with helpful resources to assemble everything they need for a typical classroom session. Unless, that is, they have no idea how to use it. That was the problem with v1 of the session organizer. It gave teachers too much material without clear guidance on how to navigate it, often leaving the act of organizing to the user rather than doing it for them.
This project started with the first in-person workshop I had attended since before the pandemic. We gathered in our Montreal office and started looking at the problem from scratch. We thoroughly analyzed the problem space, getting into our teachers' heads by walking through their day to day, including after-hours planning and assessment, and how it all fit into our entire product ecosystem. We wireframed, we affinitized feature ideas, we storyboarded. We didn't have an end product by the end of the week, but we did have a focused list of requirements and a better understanding of the key features, which allowed us to get started.
Our breakthrough was realizing that teachers didn't just want a bunch of content that they had to make heads or tails of themselves. Many, especially new teachers, wanted a list of actions that guided them through that content, step by step, like a checklist. But others, usually experienced teachers, still demanded that flexibility to break out of the prescribed narrative and customize their lesson plan. We needed a solution, much like the guided approach we developed for Into Math, that delivered on both while increasing engagement and usability.
The first thing we did was clean up the interface. We wanted to move from a utilitarian design that sometimes felt overwhelming in its austerity to a friendlier, app-like feel. So we devoted the majority of the space to content cards. Friendlier than clunky boxes, these also let us include contextual visuals and at-a-glance metrics on student progression and engagement. Teachers would navigate our content structure by "zooming" in and out of these cards, giving them the flexibility to move through the content as they pleased, something that was difficult to do before.
Another key component was giving teachers obvious visual indicators of where they were within the content and how much of it they had completed. A learning product is a complex system of modules, units, lessons, sessions, resources, etc. It is easy to get lost in the weeds. By providing "Last Taught" and completion-status indicators on both the cards and a useful (and collapsible, for focus) side navigation, we kept users grounded in their own progression through the content. This let our experienced users chart their own path, while the visual reinforcement made it just as easy for them to jump right back to where they left off.
As previously stated, one major improvement in this version was putting clear actions up front for the user. The first way we did this was by associating a checklist of actions with every piece of content. This spelled out the specific actions we felt users should take when interacting with that content, including extra resources that are sometimes buried elsewhere in the platform. It also indicated when in the course of a lesson each action was most applicable (often either before or after teaching a session). This was great for our experienced teachers, but new teachers needed an extra helping hand. That's where our command center came in. It provided a collated, sequential list of steps, based on the user's progression, that pulled directly from these content-related actions. If they wanted, teachers could teach their entire class, day after day, week after week, entirely from this Next Step dynamic action list. Tasks were crossed off and archived as completed, prioritizing each new step while indicating what actions lay ahead. Even better, it put helpful progression-tracking widgets right alongside the queue so teachers could see which students and which material needed extra attention.
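The logic behind that dynamic queue can be sketched roughly like this. This is a minimal illustration only, not our actual implementation; the names `Action`, `Session`, and `next_steps` are hypothetical. Each session carries its own checklist, and the queue collates the next unfinished actions in sequence, dropping anything completed or belonging to a skipped session:

```python
from dataclasses import dataclass, field


@dataclass
class Action:
    label: str
    timing: str  # when it applies: "before" or "after" teaching the session
    done: bool = False  # completed actions are archived out of the queue


@dataclass
class Session:
    title: str
    actions: list  # the checklist attached to this piece of content
    skipped: bool = False  # skipping a session removes its actions


def next_steps(sessions, limit=3):
    """Collate the next actionable steps across sessions, in sequence.

    Completed actions are filtered out, and a skipped session contributes
    nothing, so the queue always reflects the user's current progression.
    """
    queue = []
    for session in sessions:
        if session.skipped:
            continue
        for action in session.actions:
            if not action.done:
                queue.append((session.title, action.label))
            if len(queue) == limit:
                return queue
    return queue
```

Marking an action done or skipping a session and then recomputing `next_steps` mirrors how the command center queue, and the progression indicators fed by it, updated in the prototype.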
To take all of this from storyboards and wireframes into a testable artifact, we decided to try out a new development tool that has since taken over the industry: AI. Figma prototyping can be fairly tedious and clumsy, particularly when conditionals are involved and you want to highlight complex interactions. Prototyping with AI allowed us, with a bit of trial and error, to develop a more robust, feature-rich testing artifact. With it, teachers were able to test complex interactions between navigation and task lists, such as skipping sessions (and thus skipping actions), which updated their dynamic command center queue, which in turn updated their progression indicators. If anything, the prototype was a bit too robust, which sometimes had the unintended effect of making it seem like a finished product.
Testing with actual users yielded overwhelmingly positive feedback. Testers appreciated having everything in one place with a clean, updated design. They commented that they found navigation intuitive and much less overwhelming than the original experience. One tester went so far as to say, "This is so visually genius. It's so appealing and I love how it's just all right there on one page. I love it."