In the summer of 2016, a community of educators, research design partners, and over 150 teens and young adults in Colorado engaged in a bold experiment to rethink how quality and impact might be measured in a modernized system of learning. Though this prototype was relatively small and short-cycle, the lessons we learned were significant, and they are informing a set of expanded pilots taking place in 2017. We also hope that the early ideas generated through our summer prototype will spark policymakers to get creative and reimagine quality assurance at a system level as society shifts to more learner-centered systems of education. Here’s our summer pilot story:
ReSchool Colorado (a multi-year initiative of the Donnell-Kay Foundation to design and launch a new education system) has been testing emerging concepts with willing partners in safe spaces and at safe times, such as the summer, when learning occurs outside the regular school day.
Why focus outside of formal learning time for our design work? Because creating breakthrough, systemic innovations in education is challenging for many reasons. High among them is the justified concern that testing new ideas could have negative repercussions on students in the United States’ fully developed education system. As a result, finding safe spaces outside of the mainstream system to try out new educational concepts is critical to advancing bold new ideas.
Never were the advantages of this approach more apparent than this past summer when ReSchool partnered with Entangled Solutions and educational providers serving teens and young adults. Our collective goal was to begin to rethink quality assurance in an educational system with an array of learning providers.
The Testing Backdrop
Our current approach to summative assessment has few fans. It is burdensome and takes big chunks of time away from learning. At conferences on education policy and innovation we have heard the system referred to as an “albatross” and the “tail that wags the dog.” But many are loath to make a wholesale departure from that system because it has illuminated important inequities and shortcomings. As such, we need to have an alternative ready that people will value and that ensures our most vulnerable students are not left behind.
A growing number of states and local communities are examining how to shift to student-centered, competency-based learning systems. In these emergent systems, the desire is to report absolute individual growth broken out by subgroups (as opposed to norm-referenced growth or the current measure of average proficiency) via on-demand assessments. This should theoretically yield greater accountability and transparency than the existing system and allow us to move beyond the emphasis on summative, year-end testing, because we could see in much more granular detail, and in real time, where individual learners are, both in absolute terms of mastery and in how much they are growing each year. Yet, despite promising shifts to these types of new models, we are still waiting for someone to demonstrate the effectiveness of such a system with validity, reliability, usability, and scale.
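To make that contrast concrete, here is a minimal sketch in Python using hypothetical assessment records and an illustrative proficiency cut score (none of these figures come from the pilot). It shows how the same data can be summarized two ways: the current measure of average proficiency versus absolute individual growth broken out by subgroup.

```python
from collections import defaultdict

# Hypothetical records: one mastery score per learner per assessment window
# (0-100 scale), tagged with a reporting subgroup. Illustrative values only.
records = [
    {"learner": "A", "subgroup": "ELL",     "fall": 42, "spring": 61},
    {"learner": "B", "subgroup": "ELL",     "fall": 55, "spring": 70},
    {"learner": "C", "subgroup": "non-ELL", "fall": 78, "spring": 82},
]

PROFICIENT = 70  # illustrative cut score, not an actual state benchmark

# Current-style summary: the share of learners at or above the proficiency cut.
pct_proficient = sum(r["spring"] >= PROFICIENT for r in records) / len(records)
print(f"Percent proficient (spring): {pct_proficient:.0%}")

# Learner-centered summary: absolute growth per learner, broken out by subgroup.
growth_by_subgroup = defaultdict(list)
for r in records:
    growth_by_subgroup[r["subgroup"]].append(r["spring"] - r["fall"])

for subgroup, gains in growth_by_subgroup.items():
    print(f"{subgroup}: mean absolute growth = {sum(gains) / len(gains):.1f} points")
```

In this toy example, the proficiency summary hides the fact that the ELL subgroup grew far more than the non-ELL subgroup; a growth-based report surfaces exactly that kind of detail.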
Similarly, the current accountability system measures success in narrow terms and ignores the fact that providers may want to build offerings that serve learners with different goals and needs. How to represent outcomes in a system with so much variation is uncharted territory.
The Every Student Succeeds Act begins to create room for states to use a more expansive set of learning providers and to report student success in a variety of ways. This could lead to big changes in the underlying accountability system. Although some localities have been given permission to innovate by departing from the annual summative assessment system, they are limited in how far they can go because of the risks involved in experimenting with students.
Testing Quality Assurance In The Summer
ReSchool’s Summer Learner Partners prototype provided a safe space to pilot new approaches to quality assurance in a system. We had a diverse mix of learning providers with differing educational goals and an array of assessments to measure impact.
We used ReSchool’s Framework for the Future of Learning with these nontraditional education providers to help students become better self-managers of their own learning, one of the four domains of the Framework. The other three domains are: academically prepared, socially intelligent, and solution seeker.
The Framework focuses on success in the educational, economic, social and civic tasks of adulthood. Its core purpose is to ensure all learners have access to a multitude of rich developmental experiences that lead to agency, a clear sense of self and a core set of transferable competencies that are of value to the learner and others.
By the end of the pilot, we learned that there are some key factors worth further examination when designing a scalable quality assurance system with multiple learning providers.
- First, providers may participate in the system only after a careful screening process that examines the validity of the assessments they use and the robustness of their data and reporting infrastructure.
- Second, learner advocates, already the linchpin and biggest innovation in the emerging ReSchool system, emerged as key yet again. Learner advocates have a distinctly different role than what we see in our traditional school systems. Most importantly, the advocates work directly in partnership with learners and their families, not for the system. They are not attached to any particular institution or provider. Instead, their job is to cultivate relationships with the learner and family, use learner profile tools to assess strengths, needs, and aspirations, and then help connect learners to the best mix of learning experiences to advance them in their progression through the learner framework.
In the summer pilot, we realized that the advocates are vital for ensuring equity and access and for collecting good data that informs the creation of provider scorecards. Learner advocates must collect data from learners to create and update their profiles; collect assessment data; administer end-of-experience surveys; and meet one-on-one with learners afterward to gather qualitative information confirming that the experiences were valuable. (A rough sketch of these data artifacts follows this list.)
- Finally, we recognize that a system such as the one we are designing requires a robust mix of technology, information management, and communication tools so that learners and those supporting them can do their jobs. Thus far we have used fairly simple, off-the-shelf tools for our prototypes. As the work scales, however, we anticipate we will need more complex technology tools that we will either build or find in the marketplace.
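To ground the list above, here is a minimal sketch, in Python, of the kind of data the learner advocates handle: a learner profile, a per-experience record combining assessment results, survey responses, and qualitative notes, and a simple roll-up into a provider scorecard. The field names and the scorecard calculation are our own assumptions for illustration, not the pilot’s actual tooling.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class LearnerProfile:
    """Strengths, needs, and aspirations gathered by a learner advocate."""
    learner_id: str
    strengths: list[str] = field(default_factory=list)
    needs: list[str] = field(default_factory=list)
    aspirations: list[str] = field(default_factory=list)

@dataclass
class ExperienceRecord:
    """One learner's outcome data from one provider's learning experience."""
    provider: str
    learner_id: str
    assessment_score: float  # provider-reported mastery evidence, 0-100
    survey_rating: int       # end-of-experience survey, 1-5
    advocate_notes: str = ""  # qualitative follow-up from the one-on-one

def provider_scorecard(records: list[ExperienceRecord]) -> dict[str, dict[str, float]]:
    """Roll individual experience records up into a simple per-provider scorecard."""
    scorecard: dict[str, dict[str, float]] = {}
    for provider in {r.provider for r in records}:
        rows = [r for r in records if r.provider == provider]
        scorecard[provider] = {
            "mean_assessment": mean(r.assessment_score for r in rows),
            "mean_survey": mean(r.survey_rating for r in rows),
            "learners_served": len({r.learner_id for r in rows}),
        }
    return scorecard
```

Even this simplified version makes the dependency clear: the scorecards are only as good as the profile, assessment, and survey data the advocates collect along the way.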
The next step for ReSchool is to further test what we learned in this initial prototype with an expanded base of learner, educator, family, and community partners. The ultimate goal is to use public policy to create the system we are building.
Shifting from a purely top-down accountability system isn’t easy work. But we foresee a day when quality assurance encourages learning providers to improve their experiences and helps learners and advocates better select the right programs, rather than just deliver autopsies of learning and drive punitive actions. That’s a critical transformation for our education system, and a transformation that screams loudly for willing risk-takers and designated spaces to innovate.