I’m writing this blog during a 17-hour road trip that was supposed to be a 2-hour flight. My husband and I wanted to take our two boys skiing for spring break and have spent the last several months meticulously planning the perfect trip. I booked all of the fun activities, and he dusted off our equipment and bargain-shopped for winter clothing. We found the perfect nonstop flight that would minimize the frustrations of traveling with two young kids. And then Mother Nature brought the fourth-largest snowstorm on record to Denver, Colorado. Thousands of flights were canceled, including ours. Lucky for us, my husband’s superpower is that he can quickly pivot to accommodate change. Without hesitation, he loaded our already-packed suitcases into the car, strapped the skis to the roof, and here we are, well on our way to the vacation we hoped for.
For more than a century, standardized testing data have been used to measure the success of students, teachers, and schools, and even to mark our global competitiveness or lack thereof. These data have driven significant education policy and funding models at the national and state levels, and school districts devote up to 15 percent of their instructional days each school year to student assessments alone, costing an estimated $1.7 billion each year. The political and financial commitment to standardized testing was born out of good intentions. The incredibly high stakes for students, teachers, and schools that were tied to these data were intended to hold us accountable for educating all children. But the return on these investments is debatable at best. We know now that standardized testing data, when viewed in isolation, represent a limited view of student success and can even mislead us into making discriminatory decisions because of their inherent biases. We know the policies we’ve enforced and the decisions we’ve made based on these data have failed to close persistent achievement disparities across income levels and between white students and students of color, even after 50 years of trying.
Subscribe to the blog to get your free copy of our Personalized Learning Playbook, a playbook that will help you make the case for personalized learning and reflect on the important elements to take into consideration.
On January 28, 1986, the space program experienced one of its most catastrophic events to date when the Space Shuttle Challenger broke apart just over a minute after launch. All seven crew members died, including Christa McAuliffe, a schoolteacher who would have been the first teacher in space. If you’re familiar with the event at all, you know the accident was caused by a failed O-ring seal in the solid rocket booster. What’s less widely known is that, according to the recently released Netflix documentary, Challenger: The Final Flight, NASA and the company that manufactured those O-rings had information available to them that day that could have led to a different outcome. For example:

- The O-rings were a known problem. In many of the successful launches using solid rocket boosters prior to the Challenger, there was evidence of damage to the O-rings during launch.
- The temperature the day of the launch was much colder (by at least 20 degrees) than on typical launch days.
- More than one expert at the O-ring manufacturer voiced concern that the part had not been tested at that temperature and could fail.

NASA made choices about the data they used that day. They went into their decision-making process with a bias (they were motivated to launch after a series of delays), and they failed to see how that bias motivated their choices and in turn influenced their behavior. In education, we make choices about our data, too.
When I was studying research methods as part of my doctoral degree, the running joke among our professors was that they would answer every question with “It depends.” My favorite professor would answer an either/or question with “Yes.” Should I use a survey to answer these research questions? Or would interviews be better? “Yes.” So when school district leaders ask me if their data should drive their strategy, or if they should define their strategy (goals, priorities, actions) and then make decisions about data based on their strategy, my favorite answer is “Yes.” Because honestly, it’s both. Here’s what I mean.
Through most of the spring and summer, we at Education Elements have intensely focused on helping school districts prepare for returning to school. As we’ve gotten closer to the start of school, and school leaders return to prepare their campuses, one of the most common questions we get is how to think about instructional staff assignments when some students will be learning remotely and some will be onsite. To explore this topic further, we convened a group of school and district leaders in Texas to participate in a design sprint. Here’s what we learned:
“The best way to find out if you can trust someone is to trust them.” - Ernest Hemingway

February 10th was my first day as Managing Partner at Education Elements. On March 11th, 30 days into my new job, I was on the phone with our CEO making critical decisions about our response to the exploding Coronavirus crisis.