The National Academy of Education (NAEd) released a new report examining the strengths, limitations, and complexities of international large-scale assessments (ILSAs), especially when used as a basis for informing educational policy and practice.
Results from ILSAs such as PISA and TIMSS often receive considerable attention, raising questions, and in some instances alarm, about whether a nation’s students are prepared to compete in a globalizing economy. Although there is widespread recognition that ILSAs can provide useful information, and are invaluable for mobilizing political will to invest in education, there is less consensus among researchers about which types of comparisons are most meaningful and what could be done to ensure sounder interpretation.
To address these issues, the NAEd assembled an expert steering committee to examine the future of ILSAs from a variety of disciplinary perspectives. The committee held two workshops, commissioned a series of papers, and has now produced a summary report focused on the design, interpretation, and policy uses of ILSAs.
As explored further in the report, ILSAs can provide useful information to spur educational reforms, but only if supported by carefully conducted analyses. In particular, the report examines the problematic issue of drawing causal inferences from ILSAs and highlights emerging analytical approaches to ILSA data that “come close” to supporting causal inferences for policy guidance. The report also notes that ILSA data would be more useful and accurate if augmented with data from other sources, such as U.S. Census data or administrative data collected by school districts. Moreover, longitudinal studies employed in other fields serve as promising examples for making policy-relevant inferences and are an important area for future research and development.
The report also finds that media reporting of ILSA results tends to be superficial and, in many cases, misleading, and that the organizations that administer ILSAs and release results would be wise to devote greater resources to preparing reporters and to providing more guidance on what can and cannot be inferred from results. A particular concern is the practice of reporting results primarily at the country level, which obscures variation within countries, given that states and other smaller jurisdictions often have different social and economic characteristics and varying educational policies. Finally, to further explore these potential areas for improvement, an impartial national entity could be created and charged with providing ongoing guidance on ILSA design, analysis, reporting, and interpretation.
The report was edited by Judith Singer, Harvard University; Henry Braun, Boston College; and Naomi Chudowsky, National Academy of Education. Additional committee members included Anna Katyn Chmielewski, OISE/University of Toronto; Richard Durán, University of California, Santa Barbara; David Kaplan, University of Wisconsin-Madison; Marshall “Mike” Smith, Carnegie Foundation for the Advancement of Teaching; and Judith Torney-Purta, University of Maryland.
As Judith Singer notes, “the steering committee was guided by the simple central question: What do the results of ILSAs really tell us about the strengths and the weaknesses of a nation’s education system? We hope that this report will serve as a springboard for greater attention from the broader research community and other stakeholders, as well as be useful to those who report on educational policy, whether on blogs or in the major media.”
For more information on the report, including background papers, videos, and panel summaries, please visit http://naeducation.org/methods-and-policy-uses-of-international-large-scale-assessments/. The full report is available at http://naeducation.org/wp-content/uploads/2018/04/International-Educational-Assessment-NAEd-report.pdf.
The National Academy of Education (NAEd) advances high-quality research to improve education policy and practice. Founded in 1965, the NAEd consists of members who are elected on the basis of outstanding scholarship related to education. The NAEd undertakes research studies to address pressing issues in education and administers professional development programs to enhance the preparation of the next generation of education scholars.
The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305U150003 to the National Academy of Education. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education.