Thursday, March 30, 2006

Schools 'forced to behave like supermarkets'

The UK has moved to value-added reporting of secondary school outcomes. What the author seems to mean by the title of the piece is that the pressure to "play the game" will encourage secondary schools to push students to take more challenging exams and to provide less support to their feeder primary schools. If the primary schools do relatively worse, the secondary school will appear to be adding even more value.
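
To make the mechanics concrete, here is a minimal sketch of a value-added calculation (a toy example of my own, not the model the UK actually uses; the school names, scores, and regression form are all invented). Value added is essentially the gap between a pupil's actual outcome and the outcome predicted from prior attainment, which is why anything that depresses the intake baseline flatters the secondary school:

```python
# Minimal sketch of a value-added calculation (illustrative only; real
# value-added models are far more elaborate).  A school's "value added" is
# the average gap between pupils' actual outcomes and the outcomes
# predicted from their prior attainment at intake.

from statistics import mean

# Hypothetical pupils: (school, prior_attainment_score, final_exam_score)
pupils = [
    ("Northside", 4.2, 42.0), ("Northside", 3.8, 40.0),
    ("Southside", 4.2, 38.0), ("Southside", 3.8, 35.0),
]

# Fit a simple least-squares line: predicted_final = a + b * prior
xs = [p[1] for p in pupils]
ys = [p[2] for p in pupils]
x_bar, y_bar = mean(xs), mean(ys)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum((x - x_bar) ** 2 for x in xs)
a = y_bar - b * x_bar

# Value added per school = mean residual (actual - predicted).
# The perverse incentive: anything that lowers the prior-attainment
# baseline at intake mechanically inflates this residual.
schools = {}
for school, prior, final in pupils:
    schools.setdefault(school, []).append(final - (a + b * prior))

for school, residuals in schools.items():
    print(school, round(mean(residuals), 2))
```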

This is exactly the sort of perverse incentive structure that opponents of value-added measures fear.

Chris

Tuesday, March 28, 2006

Diagnostic assessment and learning

One of the issues we are tracking as part of a "value-added" approach to educational improvement is the role of diagnostic assessments. There is surprisingly little written about the role of interim/formative/diagnostic assessments other than as useful markers for identifying gaps in student knowledge that need to be addressed in order to improve performance on high-stakes assessments. However, diagnostic assessments also provide feedback to teachers about how well they are implementing their curriculum, effectively serving as ongoing professional development.
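
A hypothetical sketch of what that teacher-facing feedback could look like (the standards, names, and response data below are invented): rolling item results up by standard and teacher turns the same diagnostic data into a curriculum-implementation signal rather than just a list of student deficits.

```python
# Sketch: aggregating diagnostic item responses by teacher and standard
# so the results speak to curriculum implementation, not only to student
# gaps.  All names and data are invented for illustration.

from collections import defaultdict

# (teacher, student, standard, answered_correctly)
responses = [
    ("Ms. A", "s1", "fractions", True),  ("Ms. A", "s1", "ratios", False),
    ("Ms. A", "s2", "fractions", True),  ("Ms. A", "s2", "ratios", False),
    ("Mr. B", "s3", "fractions", False), ("Mr. B", "s3", "ratios", True),
]

# Percent correct by (teacher, standard)
totals = defaultdict(lambda: [0, 0])   # (teacher, standard) -> [correct, attempted]
for teacher, _student, standard, correct in responses:
    totals[(teacher, standard)][1] += 1
    if correct:
        totals[(teacher, standard)][0] += 1

for (teacher, standard), (right, attempted) in sorted(totals.items()):
    print(f"{teacher:6s} {standard:10s} {100 * right / attempted:5.1f}% correct")
```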

Chris

Saturday, March 25, 2006

NCLB backlash

This piece does hit most of the points that opponents of NCLB list as its faults. It also points to the pressure states are putting on the U.S. Department of Education and their respective congressional representatives. State-level politicians will not be able to support a system that labels schools their constituents see as broadly successful as failing.

Chris

Thursday, March 23, 2006

NGA Center for Best Practices

On February 2nd and 3rd, 2006, the NGA Center for Best Practices co-hosted a conference entitled "By the Numbers: A National Education Data Summit". The other co-sponsors were the U.S. Department of Education and the Florida Department of Education. Other partners included the Data Quality Campaign, the Bill & Melinda Gates Foundation, the Lumina Foundation for Education, the Alliance for Excellent Education, and the Florida Channel, WFSU.

The stated purpose of the meeting "was to develop a shared vision for effective, comprehensive K-16 data systems and how policymakers can use data from these systems to develop policies to improve educational outcomes." Speakers were drawn from the partners as well as national labs, universities, and state educational agencies. The topics included a wide range of issues - from mapping data linkages between systems to exploring why similar-looking schools perform differently.

The overwhelming sense I get from this fairly comprehensive list is that folks are skirting the "e" word - evaluation. Questions of "what works?" or "what is the most effective strategy?" are not naturally addressed by the operational data collected by most school systems. One of the difficulties we are encountering is the lack of appreciation for - or even an understanding of the requirements of - evaluation.

Chris

Wednesday, March 22, 2006

Commentary on Illinois resetting proficiency cut scores on ISAT

Illinois, like many other states, regularly resets the cut scores that establish proficiency levels on state tests. Wisconsin's Department of Public Instruction provides a resource page that describes the procedure used in Wisconsin to set cut points (done in 1997 and again in 2003). It also provides references to many other states that use the same procedure. There is little doubt that stock high-stakes tests don't map well onto learning standards set by individual states.

There is a tension between an appropriate "scope and sequence" and what the test measures. Some states may require local history to be taught in grade 6, while the externally purchased social studies exam focuses more on U.S. history. The proficiency cut scores would need to be set in a way that provides a fair report on students' opportunity to learn the content on the test.
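
For concreteness, here is a minimal sketch of how cut scores work (the cut points and labels below are invented, not Illinois's or Wisconsin's): a continuous scale score is binned into proficiency levels, so moving a cut point even a few points changes how many students are reported proficient without any change in what students know.

```python
# Minimal sketch of mapping a continuous scale score onto proficiency
# levels via cut scores.  Cut points and labels are hypothetical.
import bisect

# Lower bound of each proficiency level for one grade/subject
cuts = [0, 180, 200, 230]
labels = ["minimal", "basic", "proficient", "advanced"]

def proficiency(scale_score: int) -> str:
    # bisect_right counts how many lower bounds the score meets or exceeds
    return labels[bisect.bisect_right(cuts, scale_score) - 1]

scores = [175, 195, 205, 240]
print([proficiency(s) for s in scores])
# -> ['minimal', 'basic', 'proficient', 'advanced']
```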

This article points to the other tension in this setting. There is a natural incentive for the group setting cut points to make NCLB requirements "a little more achievable" by lowering expectations. Even if that is not the intent, the threat of high stakes can make outsiders question the motivation of committees charged with this work.

This is part of the price we pay for federalism. Local control at the state and local level means that local constituencies have more control over what gets taught - even as they have less control over what gets measured.

Chris

Monday, March 20, 2006

Colorado's Education Commissioner takes stock of where we are with high stakes testing

Colorado's Education Commissioner William Moloney declared that standardized testing and longitudinal analysis have at last established a beachhead in Colorado. He does a good job showing how education reform based on rigorous testing was actually introduced by several Democratic governors back in the 1990s. Bush, in his words, "trumped them all" with NCLB.

The point I take from all of this is that this stuff is really hard. We are nearly a decade past the "call to action" he cites, and most states still cannot tell what courses individual students have taken or which teachers taught which kids. In most states and districts, high-stakes tests currently hold only students accountable. Getting to the place where we can see what works to close gaps and provide predictive support for change will take considerably more work.
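
As a sketch of what "which teachers taught which kids" actually requires, here is the kind of student-course-teacher linkage most systems still lack (the table and column names are my own invention, not any state's actual schema):

```python
# Sketch of a student-course-teacher linkage.  Tables and data are
# hypothetical; the point is that the question is easy to answer once
# the link records exist at all.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE students (student_id TEXT PRIMARY KEY, grade INTEGER);
CREATE TABLE teachers (teacher_id TEXT PRIMARY KEY, name TEXT);
CREATE TABLE course_sections (
    section_id TEXT PRIMARY KEY,
    course     TEXT,
    teacher_id TEXT REFERENCES teachers(teacher_id)
);
CREATE TABLE enrollments (
    student_id TEXT REFERENCES students(student_id),
    section_id TEXT REFERENCES course_sections(section_id)
);
INSERT INTO students VALUES ('S1', 8), ('S2', 8);
INSERT INTO teachers VALUES ('T1', 'Ms. Alvarez');
INSERT INTO course_sections VALUES ('SEC1', 'Algebra I', 'T1');
INSERT INTO enrollments VALUES ('S1', 'SEC1'), ('S2', 'SEC1');
""")

# "Which teachers taught which kids, in which courses?"
rows = db.execute("""
    SELECT s.student_id, cs.course, t.name
    FROM enrollments e
    JOIN students s         ON s.student_id = e.student_id
    JOIN course_sections cs ON cs.section_id = e.section_id
    JOIN teachers t         ON t.teacher_id = cs.teacher_id
""").fetchall()
print(rows)
```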

Chris

Friday, March 17, 2006

WCER Value-Added Research Center lays out its work plan

The Tri-State Longitudinal Data System project had its formal kick-off on February 21st and is working with the program office at the NCES and the project managers in the three states to carve out the best opportunities for collective action and sharing.

It has become clear that one of the things we (WCER staff) need to do is write. One of the important deliverables for this project is to disseminate what we know - what works, what doesn't work, what we cannot do by ourselves, etc. This working paper is one example. It was drafted for a UNESCO-sponsored International Federation for Information Processing (IFIP) working group meeting. Chris Thorn, LDS co-Principal Investigator, is on the executive committee of the Working Group on IT in Educational Management. This group of scholars and practitioners from around the globe meets every other year to present their latest work, network with researchers doing parallel work in other settings, and explore opportunities for international collaboration.

The working paper provides a quick overview of the LDS work and a quick assessment of where knowledge management technologies might play a role in improving educational decision making.

Chris

Wednesday, March 15, 2006

Enhancing Predictive Analytics in the Enterprise Through Location Intelligence

This is a little blue sky, but given good data one could imagine doing predictive analysis for siting a new school (or for redistricting). Given good value-added models for classrooms and students, it would be possible to play out scenarios that model school performance.
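
A blue-sky sketch of what such a scenario run might look like (the value-added estimates, student records, and naive projection rule below are all invented):

```python
# Sketch: given per-school value-added estimates and student prior scores,
# project outcomes under a proposed reassignment of students.  Everything
# here is hypothetical and deliberately oversimplified.

# Estimated value added (points above/below expectation) per school
value_added = {"Lincoln": 2.5, "Grant": -1.0, "New School": 1.0}

# (student_id, prior_score, current_school, proposed_school)
scenario = [
    ("S1", 40, "Grant",   "New School"),
    ("S2", 55, "Grant",   "Grant"),
    ("S3", 60, "Lincoln", "Lincoln"),
]

def projected(prior: float, school: str) -> float:
    # Naive projection: prior score plus the school's value-added estimate.
    # A real model would condition on far more than this.
    return prior + value_added[school]

current  = sum(projected(p, cur)  for _sid, p, cur, _prop in scenario)
proposed = sum(projected(p, prop) for _sid, p, _cur, prop in scenario)
print(f"current plan total: {current:.1f}  proposed plan total: {proposed:.1f}")
```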

Chris

Monday, March 13, 2006

Trade-offs in security, performance, and ease of use in centrally hosted student information systems

This piece lays out a number of issues but most clearly hits the trade-offs between locally and centrally hosted systems. The impact of differing security policies also arises. Differing remote access and password policies robbed the system of much of its functionality, forcing one district to move to local hosting of the data system. This mismatch in security policies eliminated one of the most important aspects of the system - remote access.

Alignment of policies and a clear understanding of the payoffs and costs at each level of the organization need to be at the forefront for all of the players throughout the project. This clash of policies and needs should not have been a surprise.

Chris

Saturday, March 11, 2006

LPA/NCREL contributions on value-added analysis

I cited RAND's work on value added analysis last month. Another group that is looking at both the technical and policy implications of doing rigorous evaluation of student learning is NCREL/Learning Point Associates (aka Castor & Pollux of education services in the Midwest).

Also see volume 16 of their policy issues publication for more on state-level educational data systems. This is particularly interesting because it outlines the information needs of different groups across the educational system - from state policy makers to parents and community members.

Chris

Thursday, March 09, 2006

Total Cost of Ownership in a Data Warehouse

Even though it is a couple of years old, Ralph Kimball's piece on the total cost of ownership of a data warehouse is still worth reading for its focus on the end user.

Kimball's top 3 points are enough to delay or derail any data warehouse project:
  1. Data needed for decisions is unavailable
  2. Lack of partnership between IT and end users
  3. Lack of explicit end-user-focused cognitive and conceptual models
Number one is the big stumbling block state education agency folks face. The barriers between silos make it difficult to know what data is already available (or if available, the right to use the data may not be clear). Even in those areas in which data is available at the individual level (student assessment, special education, vocational programs, etc.), there may be no way to tie the data to other programs of interest since data for many programs is only collected at the aggregate level.
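
A small sketch of that silo problem (invented data): student-level files can be joined on a student identifier, but a program that reports only aggregates leaves nothing to join on.

```python
# Sketch of the silo problem: student-level records join directly, but a
# program that only reports aggregates cannot be tied back to individual
# students in the assessment file.  All data are invented.

assessment = {          # student-level: student_id -> reading score
    "S1": 480, "S2": 510, "S3": 450,
}
special_ed = {"S3"}     # student-level: student_ids with IEPs

# Student-level join works: compare scores for students with and without IEPs
iep_scores   = [s for sid, s in assessment.items() if sid in special_ed]
other_scores = [s for sid, s in assessment.items() if sid not in special_ed]
print(sum(iep_scores) / len(iep_scores), sum(other_scores) / len(other_scores))

# Aggregate-only program data: counts per school, no student identifiers.
# There is no key to join on, so "how did participants in this program
# score?" simply cannot be answered from these tables.
after_school_program = {"School A": 37, "School B": 12}   # participants per school
```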

The second barrier is one that is being overcome by the demands of NCLB. The requirements for testing all students, reporting on subgroups, etc., are pushing programs to share data and ask hard questions about impact and professional development payoffs. This sense of urgency to figure out what works may be the most long-lasting impact of NCLB on educational systems.

The third barrier is one of sensemaking. How do those responsible for program decisions actually make tough calls? In the past, there has been very little dialogue between state-level program managers and regional service providers or local district staff. One of the specific goals of the Longitudinal Data System grants is to do needs assessment and requirements gathering across all levels of the education enterprise.

Chris

Tuesday, March 07, 2006

Proactive Data Quality - Deming revisited

Ken Karacsony, writing in the February 24, 2006 issue of the DM Direct Newsletter, reminds us that what Deming told us about auditing for defects - that it is not an efficient use of resources - remains true when the product is information/data. Quality has to be the goal of everyone on the job. Detecting data anomalies and sending them back to the unit (in our case a school or district) for "cleaning" is not an efficient approach. It does not address the root cause of the quality problem.

Inspection does not improve data quality; it only tells you that there is a problem. Cleansing the data after the fact does not remedy the problem - it only masks the problem. Companies are spending millions of dollars on initiatives to detect and cleanse data rather than applying the resources to actually improve the quality of their information. The best way to improve data quality is to produce quality data.
Data quality has to be part of individual accountability. It has to be sold as an efficiency issue. It has to be sold as part of doing a quality job.

These principles can and should be applied to school data.
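
As a sketch of what "producing quality data" rather than cleansing it could look like in a school data collection (the field names and validation rules below are invented), the check runs at the point of entry and hands the person who created the record a specific, fixable reason for rejection, instead of a downstream audit finding weeks later:

```python
# Sketch: validate an enrollment record at the point of entry so the
# person who can fix it gets immediate, specific feedback.  Field names
# and rules are hypothetical.

from datetime import date

VALID_GRADES = set(range(0, 13))   # K (coded 0) through 12

def validate_enrollment(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is accepted."""
    problems = []
    if not record.get("student_id"):
        problems.append("missing student_id")
    if record.get("grade") not in VALID_GRADES:
        problems.append(f"grade {record.get('grade')!r} is not K-12")
    try:
        if date.fromisoformat(record.get("entry_date", "")) > date.today():
            problems.append("entry_date is in the future")
    except ValueError:
        problems.append("entry_date is not a valid ISO date")
    return problems

record = {"student_id": "S1", "grade": 14, "entry_date": "2006-13-01"}
for problem in validate_enrollment(record):
    print("reject:", problem)   # specific, actionable feedback at the source
```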

Chris

Monday, March 06, 2006

School CIO suggests doing a data makeover (registration)

This seems a little crazy to me. School CIOs should be the last people one needs to convince that data quality is the most vital component of school decision support. If this is the level of a magazine pitched at "CIOs," the systems being fielded must be pretty scary.

Chris

Friday, March 03, 2006

Why don't we have buyers' guides for higher education?

USA Today reports on U.S. Secretary of Education Spellings' concerns about picking a college for a child. There's loads of information on the amenities and social life, but almost nothing on how successful the school is at helping students learn, graduate, and get good jobs or graduate school placements.

This concern suggests that parents are going to expect value-added assessment from colleges and universities at some point. Indeed, there are some suggestions that Congress may step in and require it.

Chris

Thursday, March 02, 2006

What a great guy - and smart too.

Shameless plug: I've been involved in the International Federation for Information Processing Working Group on IT in Educational Management for the past 5-6 years. This is my second piece to come out in their edited volume. It addresses the role of web-based collaboration tools in supporting the work between distributed partners in systemic education reform.

Chris