The software gets a failing grade: creating a better tool for teachers.

National Heritage Academy operates over 80 charter schools across five states. Two years ago, after realizing that their existing software wasn't meeting their needs, they decided to build a custom tool in-house to track student progress and drive reporting to parents. Unfortunately, budget and time constraints led to a rushed project built primarily by short-term contractors, with no real user research. The result was a system that was overly complex, under-utilized, and plagued by performance issues and usability challenges. We were asked to help them design not only a better system, but also a better process for building software.

The approach: educating a team on the necessity of user research

After some initial workshops with the team to identify assumptions and set project goals, we planned a discovery process that would get us into the schools and classrooms and use design as a tool to build understanding and explore possibilities.

The bulk of our user research took the form of contextual inquiry. We visited teachers and deans, and they talked to us about how they teach, how they track progress, and how they customize their instruction based on student needs. As we learned more about our users, we employed collaborative design activities as a tool to build and share understanding. We met with our project stakeholders on a weekly basis to share insights and designs.

After six weeks of research and design, even the business owners closest to this product were starting to understand the problems and challenges in a completely new way. Our research revealed that the previous system was not built for real-world classroom scenarios. Teachers were working around and outside of the system, which meant the business was losing out on a tremendous amount of information that they could be using to improve the classroom experience.

The team decided that they had to change not just the design of the tool, but how they built and delivered it. In the past, they would throw all their resources at a single project at a time, release a full feature set, and then move on to the next project; but that clearly wasn't working. At the same time, there are some things you can't learn until you put a working product in front of users. Collectively, we decided to use the development process as a tool not just to build software, but to continue learning.

The results: building to learn

We decided to take a true agile approach: focusing on delivering value with a small feature set as early as possible, then determining next steps based on what we learned from our users. To minimize risk and tighten the feedback loop, we decided to release our MVP to a small set of five pilot schools, which will allow us to develop close relationships with users and advocates at each school. The initial release to the pilot schools will include the most essential features identified during our discovery, and we'll continue to enhance, revise, and add features based on user feedback.


Key insights

The system assumes all students are performing their work at the grade level in which they are enrolled (a best-case, ideal-world scenario). In reality, most teachers have to accommodate a classroom with a wide range of competency levels, with student needs changing from day to day, subject to subject, and standard to standard.

To meet each student where they are, teachers are working around and outside of the system, because there is no way for them to work inside it. As a result, a tremendous amount of information about each student's work is never recorded in the system.

Finally, even when grades are recorded, they are not necessarily a good indication of a student's performance. Many teachers do not have a true understanding of the scoring system in use at their school, and a number of scoring variables and weighting rules for grade calculations are completely opaque to teachers. Often, the grade reported for a student differs from the grade the student actually earned.