Tuesday, February 28, 2006

Working with growth data can be scary

The Pennsylvania Value-Added Assessment System is delivering assessment results to the approximately 100 districts participating in the optional program. Scranton's special projects director responded to requests for data this way: "I'm not really comfortable quoting statistics," he said. "We haven't really learned how to work with the data." This comment suggests that the difficulty of training teachers and administrators to use more sophisticated analyses is likely to be substantial. It comes from a professional who has dealt with annual test results before; as the district's project director, he finds the new results overwhelming. Just imagine how classroom teachers or parents must feel.

Chris

Monday, February 27, 2006

Social Network Analysis in VARC Design

One of the problems we run into when working in complex silos is that connections between people, data resources, and analytical needs are not always obvious. It's also the case that solutions to important policy problems often run at odds with the bureaucratic structure. Social Network Analysis can provide a window into the non-obvious paths within (and between) organizations.

The following questions are typical of a network survey done as a diagnostic:
  • To whom do you typically turn for help in thinking through a new or challenging problem at work?
  • To whom are you likely to turn to discuss a new or innovative idea?
  • To whom do you typically give work-related information?
  • To whom do you turn for input prior to making an important decision?
  • Who do you feel has contributed to your professional growth and development?
  • Who do you trust to keep your best interests in mind?
The answers can be used to build a network understanding of the organization that gives a very different picture of how people get things done outside traditional pathways. It can also reveal barriers that are not apparent from within silos.
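To illustrate how such survey responses become a network, here is a minimal sketch in Python using the networkx package. The names and responses are hypothetical; the idea is simply that each answer to a "To whom do you turn..." question becomes a directed edge, and standard centrality measures then surface the people others rely on and the brokers who connect otherwise separate groups.

```python
# Minimal sketch with hypothetical survey responses: each tuple is
# (respondent, person they named) for one of the questions above.
import networkx as nx

responses = [
    ("Alice", "Carol"), ("Bob", "Carol"), ("Carol", "Dana"),
    ("Dana", "Carol"), ("Evan", "Bob"), ("Evan", "Carol"),
]

G = nx.DiGraph()
G.add_edges_from(responses)

# In-degree: how often a person is named as a source of help or advice.
print("Named most often:", sorted(G.in_degree(), key=lambda x: -x[1])[:3])

# Betweenness centrality: a rough proxy for brokers who sit on the paths
# between people who would not otherwise be connected.
print("Likely brokers:", sorted(nx.betweenness_centrality(G).items(),
                                key=lambda x: -x[1])[:3])
```

In a real diagnostic, each edge would also carry the question it came from, so the advice, trust, and information networks can be mapped and compared separately.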

Chris

Saturday, February 25, 2006

State educators drop IBM contract

This is an interesting note of caution for firms developing transactional and longitudinal systems with state partners. States have the capacity to simply walk away from deals, and sometimes that is a political necessity. Given the costs associated with large external contracts, it can be politically expedient to bring large projects back in house. I think we will start to see many states and larger districts drop expensive data warehouse and decision support tools in favor of roll-your-own solutions built with Oracle or Microsoft tools. The labor costs for out-year maintenance and updates will be far lower. More importantly, state agencies are figuring out that it is critically important to have the human capital to design and build these tools in house. Those skills are a crucial part of organizational improvement efforts, and outsourcing the work can create a critical skill gap.

Chris

Thursday, February 23, 2006

Selection of Growth Model proposal reviewers announced

The U.S. Department of Education has announced the names of the peer review group who will make recommendations to the secretary on which states should be included in the growth model demonstration group.

Chris

Wednesday, February 22, 2006

Balanced Scorecard in Educational Reform

This is an interesting presentation from the CPRE conference in November 2004. The authors present a useful overview of the incentive structures provided to teachers and building leaders. These performance incentives are tied to metrics clearly spelled out in the district's balanced scorecard. There are also recruitment and retention incentives.

This may be the kind of environment in which value-added analysis could flourish. It could also be an environment that leads to the abuse of such analysis if it were applied inappropriately to individual professionals, violating the assumptions of the underlying models.

Chris

Monday, February 20, 2006

Data warehouse benchmarks

A recent article from DM Review reports on a survey of 454 firms engaged in different forms of business intelligence and data warehouse implementation. In particular, the authors focused on the success characteristics of different implementation models. They surveyed 20 DW experts to arrive at a set of metrics. The characteristics fall into the following categories:

Product Measures
  • Information quality: The data warehouse should provide accurate, complete and consistent information.
  • System quality: The data warehouse should be flexible, scalable and able to integrate data.
  • Individual impacts: Users should be able to quickly and easily access data; think about, ask questions, and explore issues in new ways; and improve their decision-making because of the data warehouse and BI.
  • Organizational impacts: The data warehouse and BI should meet the business requirements; facilitate the use of BI; support the accomplishment of strategic business objectives; enable improvements in business processes; lead to high, quantifiable ROI; and improve communication and cooperation across organizational units.
Development Measures
  • Development cost: The cost of developing and maintaining the data warehouse should be appropriate.
  • Development time: The time to develop the initial version of the data warehouse should be appropriate.
Respondents were then asked a series of detailed questions probing how successful their efforts had been on the measures described above. One interesting finding was that three of the five architectures included in the study scored almost identically. The five architectures were:
  1. Independent data marts,
  2. Bus architecture with conformed dimensions (bus architecture),
  3. Hub and spoke (i.e., Corporate Information Factory),
  4. Centralized (i.e., no dependent data marts), and
  5. Federated.
Respondents identified independent data marts as the least successful strategy, followed by federated models. The remaining three architectures scored equally well across the success measures. This goes right to the heart of the data warehouse architecture wars: the data do not seem to support arguments that any of the extreme positions on proper DW architecture are well founded.

Chris

Saturday, February 18, 2006

Higher Ed and lifting student achievement

The Association of American Colleges and Universities journal Peer Review published a piece by Carol Geary in 2002 that outlines what would be required to implement value-added analysis for measuring growth in student knowledge across a four-year education. The primary requirements would be a consistent understanding of the core curriculum required for a bachelor's degree and a move to performance testing that would show growth in knowledge.

One of the ironies of undergraduate education is that colleges use placement tests across the student body to test for basic skills or for advanced placement early in the process, and then not at all after that. There is no capstone test.

We have a similar situation in high schools in many states. There are tests in 9th or 10th grade that measure whether students are performing at the required level, but no effort to measure what students know when they graduate. The value added by high school goes mostly unmeasured.

Chris

Friday, February 17, 2006

Columbus, Ohio Public Schools Considers Value-Added Teacher Compensation

The school district and the teachers' union have a draft memorandum of understanding that would place all newly hired teachers under the new pay plan. It would also allow any existing teacher to opt in. There is a notion of career steps in the value-added compensation model, but it is not the lockstep of the current system.

Chris

Wednesday, February 15, 2006

Can test vendors keep up with NCLB demands?

It does seem likely that NCLB demands on test vendors have outstripped their capacity. Creating high-quality tests is expensive and difficult, and it is especially hard to make tests that stretch high-achieving students. Known as the ceiling effect, the inability to measure student learning at the high end can artificially create the appearance that the gap between two groups of students is closing. Since the upper end of the score distribution is bounded, the mean for a higher-performing group cannot move up as far or as easily as that of a lower-performing group.
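A small simulation makes the arithmetic concrete. This is a minimal sketch with made-up numbers, not data from any real test: two groups make the same true gain, but because observed scores are capped at the top of the scale, the higher-performing group's observed mean cannot rise as far.

```python
import random

random.seed(0)
CEILING = 100  # hypothetical maximum scale score

def observed_mean(true_mean, gain, n=50000, sd=15):
    """Mean of observed scores when true scores are normal but capped at the ceiling."""
    return sum(min(random.gauss(true_mean + gain, sd), CEILING) for _ in range(n)) / n

for label, true_mean in [("lower-performing group", 70), ("higher-performing group", 85)]:
    gain = observed_mean(true_mean, 10) - observed_mean(true_mean, 0)
    print(f"{label}: observed gain from a true 10-point gain = {gain:.1f}")

# The lower group shows nearly the full 10 points; the higher group shows less,
# so the measured gap narrows even though both groups learned the same amount.
```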

Chris

Tuesday, February 14, 2006

Tutor Program Offered by Law Is Going Unused - New York Times

Students in schools identified as failing under NCLB are eligible for free supplemental services. Across the country many districts and states have very low numbers enrolled, despite fairly aggressive advertising. Lack of facilities, qualified instructors, and many other reasons are cited as possible barriers.

One other important aspect of the support services effort is mentioned only briefly: there is very little evidence about what is working, and no serious evaluation is going on. Researchers at the Value-Added Research Center at UW-Madison are working with Milwaukee Public Schools to design and implement an evaluation of these services to help districts and parents choose the most effective mix of supports.

Chris

Sunday, February 12, 2006

Value-Added Assessment in Higher Ed

Richard H. Hersh, Senior Fellow at RAND, delivered a paper at the AAHE National Assessment Conference in Denver on June 15, 2004, entitled Assessment And Accountability: Unveiling Value Added Assessment In Higher Education. Hersh lays out the rationale for rigorous assessment in higher education and anticipates and addresses many of the concerns colleges and universities are likely to have with such a proposal. The paper is organized around four phases:

  • Phase I: Experimentation, Incentives, and Rewards
    • Faculty need a chance to approach assessment as a scholarly effort. Pick a program to evaluate and use the process to help faculty work through the entire cycle of setting expectations, implementing the curriculum as intended, and measuring outcomes consistently.
  • Phase II: Development and Diffusion
    • Transparency in programs and outcomes could be particularly important to state-funded universities. In states with tight budgets and declining support for higher education, clarity around the value added by university and college education could be a powerful force for supporting continued investment. (The same goes for K-12 education.)
  • Phase III: Comprehensive Assessment System Development and Implementation
    • Assessment in general education and within majors collected within institutions. Samples of data could also be shared across institutions (within systems for example) to allow a school to evaluate how a particular program implemented elsewhere might affect outcomes locally.
  • Phase IV: Value Added Data Used to Inform Institutional and State Policy
    • Faculty reward structures (particularly in non-research institutions) could be adjusted to reward programs and practices that delivered higher student value-added outcomes. It would also be possible to evaluate different institutional forms (traditional liberal arts, more applied programs, virtual schools, etc.).

Chris

Thursday, February 09, 2006

Looking for Scholarly Support for Value-Added Assessment

There is still only a small body of work on the appropriateness of value-added assessment. AERA's policy series Research Points published a piece in the summer of 2004. The recommendations are in line with any reasonable methodologist's understanding: value-added assessment is likely to be fairer to all involved, but it is still difficult to make high-stakes decisions about individual teachers and kids given the "uncertainty inherent in measurement".

Chris

Tuesday, February 07, 2006

Data overloading and model development

Having worked with several large districts (and more recently several state programs), it is clear that data overloading (storing additional, non-standard values in an existing field) is a common problem. DM Review describes the problem this way:
If a database has not been defined with all knowledge workers' information requirements, and that database is not easily extendable, knowledge workers will often use an existing field for multiple purposes.
A common example is the Free/Reduced Lunch program participation field. A program administrator at the district or state level needs to report the total number of children in each category, and the legitimate values for the field may be "F" and "R". However, one office in the district is charged with adjudicating cases in which a family is close to qualifying or mistakenly enrolls when it is not eligible. That office is responsible for tracking those denials and enters the letter "D" in the Free/Reduced Lunch field so it can run reports at the end of the year. This use of the field was never anticipated in the new student management system, and the value is purged when the data are loaded into the student data warehouse, erasing data vital to the group who entered it.
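One defensive pattern during loads is to validate against the documented code set and route anything unexpected to a data steward instead of silently purging it. The sketch below is hypothetical (the field and record names are invented), but it shows the basic idea in Python.

```python
# Minimal sketch: expected codes come from the published data dictionary.
EXPECTED_LUNCH_CODES = {"F", "R"}

def split_lunch_records(rows):
    """Separate rows with documented codes from rows carrying overloaded values."""
    clean, flagged = [], []
    for row in rows:
        code = (row.get("lunch_status") or "").strip().upper()
        (clean if code in EXPECTED_LUNCH_CODES else flagged).append(row)
    return clean, flagged

clean, flagged = split_lunch_records([
    {"student_id": "1001", "lunch_status": "F"},
    {"student_id": "1002", "lunch_status": "D"},  # the denial office's overloaded value
])
print(len(clean), "rows loaded;", len(flagged), "rows flagged for review")
```

Flagged rows can then drive a conversation about adding a documented field for denials rather than losing the information at load time.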

Data overloading is one of several data quality issues that will have to be confronted as districts and states move from reporting annual and aggregate data to longitudinal analysis of individual-level data.

Chris

Sunday, February 05, 2006

Data Quality Campaign and Data System Goals

Fundamental issues in designing a longitudinal data system

It is clear that these elements are necessary but not sufficient for a robust longitudinal data system. Listed below are other fundamental issues to address when designing a longitudinal data system:
  • Privacy Protection: One of the critical concepts that should underscore the development of any longitudinal data system is preserving student privacy. An important distinction needs to be made between applying a "unique student identifier" and making "personally identifiable information" available. It is possible to share data that are unique to individual students but that do not allow for the identification of that student.
    • These practices are well understood outside of education, and this is probably the easiest barrier to overcome; a minimal sketch of one common approach appears after this list.
  • Data Architecture: Data architecture defines how data are coded, stored, managed, and used. Good data architecture is essential for an effective data system.
    • It would be tempting to simply adopt the dictionary standards being developed by the SIF group, NCES, or other standards bodies. What this misses is the unique accountability and political organization of each state. One size will not fit all states.
  • Data Warehousing: Policymakers and educators need a data system that not only links student records over time and across databases but also makes it easy for users to query those databases and produce standard or customized reports. A data warehouse is, at the least, a repository of data concerning students in the public education system; ideally, it also would include information about educational facilities and curriculum and staff involved in instructional activities, as well as district and school finances.
    • This is still a relatively new area of work in IT. Many of the turn-key "warehouse" systems are still driven by a compliance reporting mindset that only integrates data for reporting up - not for cross program analysis and improvement.
  • Interoperability: Data interoperability entails the ability of different software systems from different vendors to share information without the need for customized programming or data manipulation by the end user. Interoperability reduces reporting burden, redundancy of data collection, and staff time and resources.
    • SIF is the leading effort in this area. Again, the reduction of burden is very likely the best argument for overcoming bureaucratic resistance. The long-term payoff will be the ability to study program effectiveness: the gains in student learning, professional development alignment, and feedback to teacher education institutions will likely be far greater.
  • Portability: Data portability is the ability to exchange student transcript information electronically across districts and between PreK-12 and postsecondary institutions within a state and across states. Portability has at least three advantages: it makes valuable diagnostic information from the academic records of students who move to a new state available to their teachers in a timely manner; it reduces the time and cost of transferring students' high school course transcripts; and it increases the ability of states to distinguish students who transfer to a school in a new state from dropouts.
    • This is a long term goal. Most districts in Wisconsin still have no consistent transcript data online.
  • Professional Development around Data Processes and Use: Building a longitudinal data system requires not only the adoption of key elements but also the ongoing professional development of the people charged with collecting, storing, analyzing and using the data produced through the new data system.
    • This will be the largest area of cost. The resources needed to train SEA, school, and district staff dwarf the costs of developing the systems.
  • Researcher Access: Research using longitudinal student data can be an invaluable guide for improving schools and helping educators learn what works. These data are essential to determining the value-added of schools, programs and specific interventions.
    • We are only now entering an era in which research about what works in schools can be done at scale. This work has the potential to transform education and education research.
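On the privacy point above, here is a minimal sketch of one common de-identification approach: keyed hashing of the student identifier. The key name and ID format are hypothetical; the point is that each student keeps a stable, unique research identifier while the mapping back to personally identifiable information stays with the data steward who holds the key.

```python
import hashlib
import hmac

# Assumption: the secret key is managed by the data steward, outside the warehouse.
SECRET_KEY = b"held-by-the-state-data-steward"

def research_id(student_id: str) -> str:
    """Stable, opaque identifier: unique per student, not reversible without the key."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

print(research_id("WI-0012345"))  # the same student always maps to the same value
```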

Chris

Thursday, February 02, 2006

Implications of a Service Oriented Architecture

One of our partners on the Tri-State Longitudinal Data Systems project (Minnesota) has already begun to implement an agency-wide initiative based on a Service Oriented Architecture. Their data dictionary site also reflects their approach to standards and interoperability: all dictionary elements are publicly available, as are the training and internal marketing presentations used to educate and energize agency staff. Wisconsin is moving in the same direction, and the Department of Public Instruction will be able to leverage much of the work done by Minnesota. This collaboration will free Wisconsin to take the lead in other areas of work. It is the culture of data stewardship and data quality - in use - that will be one of the most important contributions of this effort.

Chris