Education Sector Report Adds Valuable Perspective on Colorado Growth Model

The first time I heard of the Colorado Growth Model, I thought maybe it would be a scientific system to help determine how tall I would grow up to be in our high-altitude environment. No, we’re talking about our state’s system for measuring student progress toward proficiency in math, reading and writing, sorted by district and school. So I was more than just a bit off. You could sue me, but it wouldn’t get you very far.

Anyway, the reason I bring up the topic is a brand-new Education Sector report titled Growth Models and Accountability: A Recipe for Remaking ESEA. The report’s hook and chief case study is Denver’s Bruce Randolph School, and a significant chunk of the report is focused entirely on (you guessed it) the Colorado Growth Model. That’s why my Education Policy Center friends gave it such close attention. Co-author Kevin Carey was kind enough to spend a few minutes on the phone with Ben DeGrow to explain a few things and answer some questions.

It’s safe to say the authors of the Education Sector report are high on the Colorado Growth Model as an example for other states to follow. As the report notes, a consortium of 14 states has inexpensively done just that, thanks to Colorado’s use of open-source software to display the data for public consumption. Carey and co-author Robert Manwaring gave our state’s growth model lofty praise for user-friendliness and accessibility:

Colorado stands out for the ease with which policymakers, principals, school board members, parents, and other stakeholders can access the information.

Interestingly, while more accessible than it was in its first iteration, SchoolView.org (which hosts the Colorado Growth Model data) still has some progress to make in this area. My Education Policy Center friends haven’t been as keen on the site’s user-friendliness, which made for a good conversation.

Carey observed the “dilemma” of wanting to keep the academic performance information simple, but noted that seeking to make it too easy can lead to a “simplistic” model that may be lacking in fairness and accuracy. He also acknowledged that, largely because of different concerns and questions, the “best way to present data for parents is not necessarily the same as for policy makers.” That may be a big part of the rub.

I’m still hoping for a way to identify school performance that presents the information in a manner that a broad swath of parents can understand while maintaining the Growth Model’s integrity. You know. Something that sort of combines the wonderful School Choice for Kids website’s yin with SchoolView.org’s yang. I’m young. I can dream big.

In the course of the conversation, Carey made a salient point: “None of this information is good unless you use it to make schools better.” Getting good, valid information about school performance is crucial, but how do we go about actually making schools better? Keep reading the blog, as we go on striving to figure that out together.