Imperfect Storm: Deconstructing the hue and cry that follows international education assessments
It happens every 12 years: results from the world’s two largest international education assessments, TIMSS (conducted every four years) and PISA (conducted every three years), are released concurrently, setting off new rounds of soul-searching among educators, policymakers, economists and the general public in nations around the world.
This year is one of those years, with the results of the 2015 TIMSS (Trends in International Mathematics and Science Study) released on November 28th and those of PISA (Programme for International Student Assessment) released on December 5th. And in the United States, as elsewhere, depending on who you talk to, you’ll hear that the education system is falling apart, making strides, going nowhere or functioning at different levels on behalf of different populations of students.
“My students have coined the term ‘ranking storm,’” says Oren Pizmony-Levy, Assistant Professor of International & Comparative Education. “And this year we have two storms coming together. The question is: how will the public react?”
To answer that question, Pizmony-Levy and some 70 volunteers – his own students at TC and others abroad – have embarked on two parallel projects: an analysis of media reporting of the TIMSS and PISA results in 30 countries, and public opinion surveys in the same countries both before and after the results are reported. The latter effort is aimed at determining changes in attitudes towards nations’ education systems and policies. The project is being supported by a grant from TC’s Provost’s Investment Fund, while related work by Pizmony-Levy that looks into how policymakers in Massachusetts and North Carolina engage with PISA results is funded by the Spencer Foundation.
“In many countries, there’s an obsession with the ranking – not necessarily with the achievement of the students or the gaps between groups,” Pizmony-Levy says. “Yet a nation’s ranking can go down even when its performance improves, because other countries may improve even more, or because more nations may participate in the assessment. So looking only at the ranking isn’t the best way to understand the data – but the general public isn’t going to read these thick 500-page reports full of charts and tables. Most will read about the results through newspapers and other media outlets. So we’re looking at how media outlets are reporting on this.”
In a pilot study, conducted in 20 countries based on the release of PISA in December 2013, Pizmony-Levy found that “the majority of the news stories (77.6 percent) reported on ranking, which is not the best way to engage the data.” He adds that “66 percent of the stories did not include sufficient background information about PISA. Thus the general public cannot fully understand what PISA can and cannot say about schools, teaching and learning.”
Even a simple graphic presentation of rankings can color public opinion, Pizmony-Levy says. He cites one newspaper’s reporting of Israel’s performance on the TIMSS assessment during the late 1990s. “It showed that among 20 countries, Israel was 17th. But it omitted 15 other countries that were lower than Israel. And that’s a very common practice that dramatizes the results.”
One major issue, Pizmony-Levy says, is that the two assessments ostensibly cover some of the same ground – yet in fact, they are very different and often tell very different stories.
TIMSS, conducted every four years since 1995, is a grade-based assessment of fourth and eighth graders’ mastery of curriculum. PISA is an age-based assessment of 15-year-olds that focuses on students’ ability to apply knowledge in ostensibly real-world situations. Both TIMSS and PISA assess math and science, but only PISA assesses reading. TIMSS incorporates more “system variables”: contextual information about curriculum, principals, and teachers.
“In 2004, when results from both studies last came out together, there was a lot of noise, because the rankings weren’t necessarily similar between the two tests,” Pizmony-Levy says. “That created a lot of confusion among policymakers.”
Pizmony-Levy’s project is led by doctoral student Phoebe Linh Doan, who is writing her dissertation on students’ engagement with large-scale assessment, and two master’s degree students, Erika Kessler and Jonathan Carmona. All three are students in TC’s International & Comparative Education program. They, in turn, are managing a team of 70 student volunteers from across TC and in Europe, South America, Australia and Hong Kong. Students in each country are asked to collect seven articles about each assessment and analyze the reporting according to a framework developed by the principal investigators. (Currently the project is focusing only on print media, but will expand to include social media channels.) The analysis includes the kinds of visual representations used, the tone of the article – whether it is “scandalizing” or “glorifying” the assessment results, or taking a neutral stance – and who is being quoted as the main interpretive voice. This framework is based on Pizmony-Levy’s research and the work of Gita Steiner-Khamsi, Professor of International & Comparative Education.
Training of the student analysts is extensive. Many at TC are taking, or have taken, Pizmony-Levy’s course, “Social Analysis of International Assessments,” which includes discussion of the methodology and the reports of TIMSS and PISA. The class includes hands-on exercises in media analysis.
Those not taking the class are offered training sessions that include an introduction to the project, a review of the pilot study, guidance on finding relevant media outlets and news stories, and a review of the web-based protocol with practice applying it to a news story about PISA 2013 or TIMSS 2012.
Students have also translated the survey into 13 languages and are disseminating it via social media and listservs. The baseline survey has drawn nearly 3,000 respondents from 20 countries.
Pizmony-Levy will subsequently offer training in analysis of the entire data set on media reporting, as well as in quantitative analysis of the data from the survey. Some students will use the data for their master’s degree integrative papers and projects. Other students will present work at an upcoming conference of the Comparative and International Education Society.
The accompanying public opinion survey will be conducted online during the next several months, with the results to be made available sometime this coming spring. Down the road, Pizmony-Levy hopes to study how media portrayals of the assessments’ results not only shape public opinion but also affect policy.
“The top five performing nations or systems are mentioned all the time, the rest, hardly ever,” he says. “Yet the successes of those countries might not be replicable elsewhere, either because of the socioeconomic makeup of their populations or because they are generous welfare states. Meanwhile others tend to see themselves negatively, and politicians make use of that. President-Elect Donald Trump and Education Secretary-designate Betsy DeVos might spin TIMSS and PISA results for ideas about demolishing the Department of Education or reallocating budgets. It could be used as a Sputnik moment, to evoke national security, to mobilize everyone to a cause. Without a nuanced understanding, you can project whatever you want onto the results.” – Joe Levine
What can we learn from the results of TIMSS 2015? Commentary by Oren Pizmony-Levy:
- TIMSS 2015 shows us – again – that home educational resources matter. The average score for students with many resources (e.g., more than 100 books at home, their own room and access to the internet, and at least one parent with a university degree) is 567, whereas the average score for students with few resources is 469. In other words, the U.S. education system suffers from high socio-economic inequality.
- Both TIMSS and PISA collect a large volume of data, much of which is not used in official reports. We – faculty and students – need to investigate the data further and pose new research questions.
- It’s important to remember that TIMSS and PISA do not capture the whole process of teaching and learning in schools. Further, they provide only a snapshot of student achievement in specific subjects and cognitive domains. Other countries participate in international assessments of civic and citizenship education, whereas the U.S. decided not to take part in these assessments. Thus, we know little about how well young Americans are prepared to engage with different social issues, including women's political rights and rights of racial/ethnic groups and immigrants.
The views expressed in the previous article are solely those of the speakers to whom they are attributed. They do not necessarily reflect the views of the faculty, administration, or staff either of Teachers College or of Columbia University.
Published Wednesday, Dec 7, 2016