Saturday, July 22, 2006

Blogging got overtaken by proposal writing

Our value-added research group has been working on a number of proposals tied to various U.S. Department of Education deadlines and a couple of private foundation deadlines. It just so happens that they all fall in the latter half of July. Several are out the door and several more are due this coming week. We are proposing more work that links inputs (financial and human resources, professional development, curricular materials, etc.) with outputs (classroom practices, student tests, grades, attendance, etc.) in one urban district. This work would tie together ongoing school- and grade-level value-added analysis of test data with other information systems that have traditionally been kept separate.

On the national front, we are bidding with one of the applicants for the technical assistance center that will support the U.S. Department of Education's Teacher Incentive Fund. It's a bit of a pig in a poke since we don't know who will be awarded the TIF grants, so there's no telling how challenging the work will be. We do know from years of experience that getting the models right is brutally hard. For example, if testing is done in late fall or early spring, growth cannot simply be assigned to a single grade or school. The credit for growth has to be apportioned proportionally between the two grades in which it occurred. Other issues, such as how to handle mobile students and teachers and changes in test forms, also complicate the models. Students retained in grade present a particular challenge for growth modelers. If a student is retained in fourth grade and takes the fourth grade test again, what growth would one expect from fourth grade to fourth grade?
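
To make the apportionment idea concrete, here is a minimal sketch in Python. The scores and month counts are hypothetical, and it makes the simplifying assumption that growth splits in proportion to instructional months spent in each grade:

```python
def apportion_growth(prior_score, current_score,
                     months_in_prior_grade, months_in_current_grade):
    """Split observed growth between two grades in proportion to the
    instructional months spent in each (a simplifying assumption)."""
    total_growth = current_score - prior_score
    total_months = months_in_prior_grade + months_in_current_grade
    prior_share = total_growth * months_in_prior_grade / total_months
    current_share = total_growth * months_in_current_grade / total_months
    return prior_share, current_share

# Fall-to-fall testing: roughly 7 instructional months remain in grade 4
# after the fall test, and about 2 months of grade 5 pass before the next
# fall test. A 27-point gain is then credited 21 points to grade 4 and
# 6 points to grade 5.
grade4_credit, grade5_credit = apportion_growth(410, 437, 7, 2)
```

Real apportionment models are far messier, of course: summer learning loss, mobile students, and changes in test forms all break the simple proportional split.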

If we win even half of the work proposed, we will have new windows into aspects of schooling and management that will advance our understanding of how best to support decision making at all levels of the educational system.


Tuesday, July 18, 2006

Indiana wants test scores measured over time

Like many other states, Indiana wants to move to some sort of growth or value-added modeling to keep more of its schools off the failing list. We've been looking at this at the Wisconsin Center for Education Research. We recently evaluated several of the growth models proposed by states in a series of brown bag sessions this summer, using longitudinal research data we have in house to impose the various state models on the same data set and get an apples-to-apples comparison. One factor common to most of the models is that they decrease the percentage of schools identified as failing by 10-12%. This is only a year-one effect, however. By two to three years out, if the high growth rates predicted are not met (and that's not too surprising), all states are back to high percentages of schools failing. So the current flurry of activity may only buy two or three years of coverage.
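
A toy calculation shows why the relief is temporary. This sketch (hypothetical numbers, not any state's actual model) counts a school as "on track" if an assumed annual gain, sustained over the remaining years, would reach the target:

```python
def on_track(current_pct, target_pct, years_left, assumed_annual_gain):
    """A school counts as 'on track' if the assumed annual gain,
    sustained over the remaining years, would reach the target."""
    return current_pct + assumed_annual_gain * years_left >= target_pct

# Year one: a school at 50% proficient, five years from a 100% target,
# is counted on track if the model assumes 10-point annual gains.
year_one = on_track(50, 100, 5, 10)

# Two years later, actual gains have averaged only 4 points per year, so
# the school sits at 58% with three years left -- no longer on track.
year_three = on_track(58, 100, 3, 4)
```

The year-one credit depends entirely on the optimistic assumed gain; once actual gains replace it, schools fall back onto the failing list.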


Saturday, July 15, 2006

Ed Trust takes a look at teacher quality

Just in case you wondered how much teacher quality matters, particularly versus exposure to an advanced curriculum, Ed Trust has some sobering findings. In a report published on July 8th, the authors (Heather G. Peske and Kati Haycock) provide some scary numbers that reflect things we've heard before, like: students in schools with high percentages of poor and minority students are twice as likely to have novice teachers. They are also more likely to have teachers teaching outside their primary subject area or area of certification. This was a particularly scary quote:
[T]here were stunning differences in levels of readiness according to the quality of teachers in a school. In schools with just average teacher quality, for example, students who completed Algebra II were more prepared for college than their peers in schools with the lowest teacher quality who had completed calculus.
These numbers are the sort of thing that would make me want to go hide under the bed.


Thursday, July 13, 2006

Just in case you were wondering why realtors support school report card sites

Here we have a study that shows just how much house values increase with a 20% increase in students meeting proficiency: 7%. It seems, though, that the authors have a funny take on value-added reporting when they study its impact on house values. Just looking at the change in proficiency rates between 4th grade and 9th grade cohorts isn't value added, since you don't know whether the same students are involved, or anything about the qualities of the instrument.
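
The distinction is easy to show in a few lines of Python (illustrative only, with hypothetical scores keyed by student ID):

```python
def cohort_change(pct_proficient_9th, pct_proficient_4th):
    """What the study computes: a difference between two different
    groups of students. This is a cohort comparison, not value added."""
    return pct_proficient_9th - pct_proficient_4th

def matched_growth(scores_then, scores_now):
    """Average gain over students present in BOTH years, matched by ID --
    the minimum requirement for any genuine growth claim."""
    common = scores_then.keys() & scores_now.keys()
    if not common:
        return None
    return sum(scores_now[s] - scores_then[s] for s in common) / len(common)

# Only students 'b' and 'c' appear in both years; 'a' left and 'd' arrived.
then = {"a": 400, "b": 420, "c": 390}
now = {"b": 450, "c": 410, "d": 500}
avg_gain = matched_growth(then, now)  # averages the gains of 30 and 20
```

With mobility as high as it is in many districts, the two calculations can tell very different stories about the same school.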

Time to go back to school.


Tuesday, July 11, 2006

Missouri struggles to show that new tests meet NCLB quality standards

Missouri administered new tests in grades 3-8 this spring and must now provide technical data to show that the tests meet the test quality requirements set forth in NCLB. Maine and Nebraska are the only states that have had tests disallowed; in Maine it was the use of the SAT as the high stakes test for high school students. Illinois is likely to encounter similar feedback as it attempts to use the ACT for high school assessment. NCLB requires that assessments be aligned with state standards, and it would be hard to argue that national college entrance exams reflect local standards.


Monday, July 10, 2006

Discrepancies in graduation rates in the news

A recent Editorial Projects in Education Research Center report published by Ed Week on graduation rates points out discrepancies in graduation rate calculations by state and shows the status of efforts to implement a nationally accepted definition of graduation rates proposed by the National Governors' Association. What the report (and the article linked to this post) reveal is how hard it is to get agreement on, define, and implement what sounds - on the face of it - to be a fairly simple concept. Imagine the difficulties surrounding dropouts and who gets the "credit" in a high stakes system for that dropout. The last school? What if the student was only there a week? Who gets the credit for a graduation if the student spent most of his or her time at one school and then switched to another, less effective school in the last semester?

These difficulties are all over high stakes data analysis.


Friday, July 07, 2006

Not a simple story, but evaluators have to love this headline

"State hurting education by not funding data collection"

Hewlett Foundation Education Program Director Marshall (Mike) Smith and Hewlett Program Manager Kristi Kimball have some fairly strong words for the legislature's failure to provide adequate funding for the collection of high quality district data. This is one of the problems with which every state in the nation needs to grapple. The siloed organizational structures of school districts and state educational agencies are a manifestation of the compartmentalization of funding and accountability from both state and federal agencies. Decades of developing stovepipe reporting capacities have left systems fundamentally inadequate for the task of addressing questions about "what works" in education. The effectiveness of complex social phenomena, such as effective educational practices for particular communities, is difficult to measure under the best of circumstances. In education we have data systems designed for different purposes and staff who have always been rewarded for hoarding data and reporting up - not for using the data.

The inability of state systems to address "bang for the buck" questions continues to stymie legislatures. After years of building state and district capacity, California seems to have snatched defeat from the jaws of victory.


Wednesday, July 05, 2006

Michigan jumps into the Pay for Performance fray

Michigan seems poised to join Minnesota and Colorado in using pay increases tied to school-level performance as an incentive for teachers. As the reporter notes, this comes in the wake of an announcement by the Bush Administration that it will provide $500 million in supplemental funds to support pay-for-performance plans. It looks like value-added evaluation capacity might become a highly sought-after set of skills.


Monday, July 03, 2006

Florida sums up its options under NCLB and 2007 testing

Florida, like a number of other states, is looking at a substantial number of schools (535) that will have failed to make AYP targets for five years running next year. If it happens again, they go into automatic restructuring. However, as in many other states, a number of schools on this list are successful in most areas and are missing AYP in only one area. There are no exceptions, though: schools missing AYP in any area are subject to restructuring.

This dynamic is one of the reasons that Florida (and most of the other applicants) applied for the U.S. Department of Education's growth model experiments. Under a growth model that predicts eventual proficiency, students who are not currently proficient, but who can be predicted to achieve proficiency in the future, can be counted as proficient.
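
The logic can be sketched as a simple linear projection. This is an illustration only, not the Department's actual specification; the scores, cut score, and projection window are all hypothetical:

```python
def counts_as_proficient(score_history, proficiency_cut, years_to_target):
    """Project a student's average annual gain forward; if the projection
    reaches the cut score by the target year, count the student as
    proficient now."""
    if score_history[-1] >= proficiency_cut:
        return True   # already proficient
    if len(score_history) < 2:
        return False  # no trend to project
    annual_gain = (score_history[-1] - score_history[0]) / (len(score_history) - 1)
    return score_history[-1] + annual_gain * years_to_target >= proficiency_cut

# Gaining 20 points a year from a score of 400, this student projects to
# reach a 500 cut score within three years, so the model counts him or
# her as proficient today.
on_track_student = counts_as_proficient([400, 420, 440], 500, 3)
```

Everything turns on the projection: a student gaining 5 points a year from the same starting point would not be counted, and the school's AYP status shifts accordingly.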

It is no wonder that states with the capacity will want to take this path. It does not obviate the need to achieve 100% proficiency, but it puts the extreme sanctions into the more distant future.


Saturday, July 01, 2006

A Policymaker's Guide to the Value of Longitudinal Student Data

This posting comes from 2002, but these recommendations remain solid policy recommendations. The disturbing thing is how far many states and districts still are from making this a reality.