The overarching purpose of the workgroups is to respond to the question:
“What are the human and technical challenges to – and opportunities for – the development, use, and scaling of a next generation of indicator and response systems aimed at increasing post-secondary persistence and success for all students? How shall these be addressed?”
Overarching desired outcomes and products of the workgroups:
The workgroups will make recommendations to the initiative for a framework that organizes the next generation of effective student success indicator and response systems. The framework will bring consistency and quality control to the emerging field by codifying effective practices and lessons learned from existing users and the research base, so that new users can ground their efforts in shared definitions and accurate understandings. Workgroups and staff will curate implementation tools, resources, and a glossary for the next generation of indicator and response systems aimed at increasing both high school graduation and college and career access and persistence.
- What are the best ways to represent indicator data for various users (e.g. educators, parents, students)? What visual strategies are most likely to encourage appropriate use of indicator data? What general principles should govern the design of indicator data displays?
- What technology constraints do states and districts face in developing usable data dashboards and indicator displays? What data barriers exist (e.g. data availability, existing structures to merge relevant information for indicators and outcome measures, software for indicator testing and development)?
- What lessons can K-12 education learn from business and higher education information systems and data displays? What lessons can be learned from higher education on how to effectively use “big data” to improve student outcomes? How can the field continuously learn and use feedback to improve data displays over time?
- How should districts display information on interventions/responses? How should the efficacy of previous responses be assessed and displayed?
- What are the best ways to disaggregate indicator data? What cuts of the data are most useful for which users? How important are drill-down/analysis capabilities for the typical indicator system user?
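To make "cuts of the data" concrete, here is a minimal Python sketch of disaggregating a persistence indicator by student subgroup. The records, field names, and subgroup labels are illustrative assumptions, not drawn from any particular indicator system.

```python
from collections import defaultdict

# Hypothetical student records; field names ("subgroup", "persisted")
# are assumptions for illustration only.
students = [
    {"id": 1, "subgroup": "first-generation", "persisted": True},
    {"id": 2, "subgroup": "first-generation", "persisted": False},
    {"id": 3, "subgroup": "continuing-generation", "persisted": True},
    {"id": 4, "subgroup": "continuing-generation", "persisted": True},
]

def persistence_by_subgroup(records):
    """Return {subgroup: persistence rate} -- one possible 'cut' of the data."""
    totals = defaultdict(int)      # students counted per subgroup
    persisted = defaultdict(int)   # persisters counted per subgroup
    for r in records:
        totals[r["subgroup"]] += 1
        persisted[r["subgroup"]] += int(r["persisted"])
    return {g: persisted[g] / totals[g] for g in totals}

rates = persistence_by_subgroup(students)
# first-generation: 0.5, continuing-generation: 1.0
```

The same pattern extends to any other cut (school, cohort year, program), which is why drill-down capability is largely a question of which grouping fields a system captures and exposes.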
- What information is best displayed by state-level indicator systems, and what by district-level systems? What about networks and other intermediary organizations that work with groups of schools? How can states and others help low-capacity districts/schools create usable real-time data dashboards/displays?