amplify

Redesigning how early literacy is taught in classrooms across America.

In 2019 I was hired to help redesign Amplify mCLASS, an early literacy assessment and instruction platform. mCLASS is used in classrooms across the United States to help identify students' reading levels and provide targeted instruction.


5Months.png

5 Months

location.png

Completed at Amplify

Responsive.png

Desktop browser experience


 

MY ROLE

 

During my 5-month engagement I worked with the director of product management, 3 senior product managers, and a UX designer.

Primary deliverables for which I was responsible included: 

  • User research

  • Experience mapping  

  • User stories  

  • Wireframes

  • UI Design (Design system)  

  • Prototypes

  • User testing

 

Background

Launched in 2006 as a literacy teaching assistant, Amplify mCLASS is designed around the University of Oregon's early literacy skills assessment system, DIBELS.

The app is used by educators to assess a student's reading ability and identify areas of weakness or strength. Teachers perform a series of one-on-one tests with their students, with each response analyzed so that students can be grouped by proficiency level. Amplify's platform then delivers targeted instruction that is integrated into each student's learning plan.

The platform exists as both a tablet application (student testing) and a desktop browser site (result calculation, student insights, and instruction). Although updated several times since launch, the platform had a steep learning curve, and many teachers struggled to locate and use its tools.

While I was involved in early tablet design concepts, the majority of my time was dedicated to the redesign of the desktop browser experience.

Untitled-3.png
 

Challenge

Our challenge was to evolve the existing platform in a way that simplified the highly esoteric nature of literacy education and reduced the administrative effort needed to assess students.

Organizationally, we also needed to shift the culture of Amplify's product team from data-focused solutions toward a more user-centered approach.

 

Approach

Design of the app was planned around engineering sprint cycles and sales team milestones, which led to several big decisions being made prior to the delivery of our discovery findings. This approach would later prove problematic, as several solutions preferred by test groups could not be implemented during the first phase of launch.

Design was broken into 4 phases:

 
 
process slides_2 [Recovered].png
 

We kicked off the discovery phase with a heuristic analysis of the desktop app followed by school visits where we observed educators administering assessments and analyzing data.

Insights from these exercises allowed us to create preliminary user personas and an experience map that segmented the school year into 5 phases. These artifacts were used as guardrails during all concept workshops and kept teams aligned with the teacher behaviors we were designing for.

DISCOVERY

 
Old designs.png
 
Flow.jpg
 
User flow.png
 

DESIGN FRAMEWORK

 

After completing the initial discovery phase I presented our findings at a stakeholder workshop. To demonstrate the need to move away from the existing design approach, I highlighted how a change in perspective would be needed to address the problems in the current system.

1 - data driven.png

Design of the platform had been dictated by the availability of data rather than its context or use. We needed to create purposeful designs that aligned with existing in-class behaviors.

 
2 - self guided.png

Users at all levels of experience were struggling with the platform due to an over-reliance on acronyms, esoteric language, and unclear actions. While educators would learn the platform over time, we needed to ensure its effectiveness from day one.

 
3 - hidden tools.png

Many educators were unaware of the platform's full capabilities. As a result, many teachers undervalued mCLASS and saw it as a requirement rather than an advantage.

 
4 - False affordances.png

Many actions were unpredictable and inconsistent, resulting in teachers playing a game of mCLASS Minesweeper. Predictable functionality would reduce frustration and allow teachers to concentrate on their analysis.

 

The experience

The new mCLASS experience was designed as an educator-focused literacy assistant that delivers timely and contextually relevant data throughout each phase of the school year.

Educators now enter the platform through The Classroom, an activity hub that spotlights recent and upcoming actions so teachers can orient themselves and plan accordingly.

 
1_Homepage_.png
 
 

The new task-oriented approach continues throughout the designs with an omnipresent notification tab.

 
2 alert system.png
 

Data highlight panels were added to the student results page so that educators could perform the same kind of analysis available on the class results page.

3 data analysis.png

New table filters allow educators to isolate students by risk level, progress status, skills assessed, and group.

4 Sorting.png
 

An assessment module template allows teachers to review test responses while conducting classroom or student analysis.  

Animation_3.gif

 
 

A design system was created in parallel with the discovery phase in an effort to streamline design production. By starting this process early we were able to move quickly into high-fidelity prototypes that were used during testing.

Results + reflection

The introduction of the new Classroom homepage was met with almost unanimous approval. During user tests, teachers at all levels of experience were able to self-direct faster and more consistently. Other features, including the student data highlight panel, were also well received; however, we quickly learned that their real value would not be fully understood until a complete year cycle had passed.

While I was happy with the overall aesthetic and functionality of the delivered designs, I believe the project would have benefitted from a deeper discovery phase. This need became evident while speaking to expert users, who frequently asked for a more consolidated view of test results (not broken up by time of year or assessment type).

During the discovery phase I was also able to identify multiple areas for future enhancement and growth. These included:

  1. Notes - A common behavior we observed was manual note-taking. Most teachers we spoke to kept a diary of how their students were performing, areas they felt needed to be addressed, and personal backgrounds. These notes were compared with other grade teachers and administrators several times a year. We also learned that these notes were often not carried over when a student changed grades or schools.

  2. Task highlights - One of the biggest issues I had on delivery of this project was the lack of a conventional jumping-off point. While we had clearly demonstrated the need for some kind of homepage during discovery (and even held a design workshop for one), it was decided, due to timeline restrictions, to approach this at a later stage. The urgency of this need was highlighted during user testing, when inexperienced educators struggled to know which step they should take (rather than could take) during specific times of year.

  3. Alerts - Following on from the above, an alert system was found to be beneficial to users at all levels. While the homepage was the best place to highlight updates and needs, an alert system was shown to be helpful on all other pages.

  4. User guides - A first-time user experience would help users when training is unavailable or forgotten. This was highlighted by the culture of coaching we discovered during our interviews.

  5. Moments of celebration - Educators often told us that they were so caught up in conducting tests or creating new groups that they forgot to provide positive reinforcement for their students. Simple celebration screens that could be shared with students would make the experience more fun and remind teachers to connect with students.