Institute of Education

Research & Expertise to Make a Difference in Education & Beyond

Fourth Summer School on Test Development Held at IOE

In late July 2017, IOE held its fourth international summer school, Test Development in Psychology and Education: Theory and Practice. Organized by the IOE Center for Monitoring the Quality in Education, the School aimed to provide a dynamic venue for learning about the latest advances in psychometric research and assessment design.

In an environment of ongoing multidisciplinary innovation and deep societal change, educational systems are rapidly transforming to better meet 21st-century human capital and economic growth objectives, which poses a host of new imperatives for today’s education measurement and quality monitoring professionals.

Probably the most challenging task of all is to ensure that members of the testing community are consistently guided by best-practice professional standards in every aspect of their research and development. This is crucial for keeping evaluation frameworks up to date and capable of producing the most representative and unbiased evidence of how students are performing – a major factor in fostering robust educational settings that open up more equitable and inclusive life paths.

Held annually since 2014, IOE’s International Test Development Schools have over the years evolved into a prime academic forum for comprehensively addressing these psychometric R&D agendas, fusing the broad expertise of the world’s most distinguished field experts with the fresh vision of young scholars in a series of multi-format sessions of intensive theoretical and hands-on learning. Hosted on July 23–29 at HSE’s Voronovo training center in the Moscow region, this year’s Summer School brought together prominent Dutch, American and Russian psychometric experts, who taught more than 30 international students of varying levels and backgrounds.

To ensure the best academic input for every enrollee, the School's one-week agenda was structured into several tracks. A basic introductory part, intended for all participants, gave a high-level overview of today’s frontline concepts in test development – essential groundwork for any evaluation professional. The School then continued in two specialist tracks, where the faculty gave in-depth coverage of cutting-edge test design frameworks, best-practice techniques in performance-based assessment, and test quality evaluation systems.

Evidence-centered Design

The School’s Russian-language track was devoted to Evidence-centered Design (ECD), an innovative test development framework that is well positioned to become a global standard for 21st-century education evaluation. The track was delivered by World Bank expert Dr. Mark Zelman, an internationally renowned authority on education measurement, and IOE researchers Ekaterina Orel and Irina Brun.

ECD is a comprehensive assessment design methodology requiring that each stage in the elaboration of a testing tool be validated through evidentiary argument. While the development process is highly resource-intensive, ECD produces accurate, well-grounded assessments that hold up even when measuring complex constructs, such as 21st-century skills. To let the students learn firsthand about the key challenges and benefits of the Evidence-centered Design methodology, Ekaterina Orel and Irina Brun gave a special workshop featuring IOE’s recently launched project to design an ECD-based tool for assessing schoolchildren’s attainment in creativity, critical thinking, communication and cooperation.
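For readers new to the framework, the sketch below is a minimal, purely illustrative rendering of how an ECD chain links a claim about a student to the evidence that supports it. It assumes the commonly described ECD layering of student, evidence and task models; the construct, observables and scoring rule shown are hypothetical and are not taken from the IOE project.

```python
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    """The claim: what we intend to measure, e.g. a 21st-century skill."""
    construct: str

@dataclass
class EvidenceModel:
    """Observable behaviours that count as evidence for the claim,
    plus a scoring rule that turns raw responses into evidence."""
    observables: list
    def score(self, responses: dict) -> float:
        # Toy rule: proportion of the expected observables actually demonstrated
        shown = sum(1 for o in self.observables if responses.get(o))
        return shown / len(self.observables)

@dataclass
class TaskModel:
    """A task specification designed to elicit those observables."""
    prompt: str
    elicits: list = field(default_factory=list)

# Hypothetical example: an ECD-style chain for "critical thinking"
claim = StudentModel(construct="critical thinking")
evidence = EvidenceModel(observables=["identifies_assumptions",
                                      "weighs_counterarguments"])
task = TaskModel(prompt="Evaluate two conflicting news reports.",
                 elicits=evidence.observables)

responses = {"identifies_assumptions": True, "weighs_counterarguments": False}
print(f"Evidence for '{claim.construct}': {evidence.score(responses):.2f}")
```

The point of the sketch is the explicit linkage: every task exists only to elicit observables that the evidence model has declared relevant to the claim, which is the sense in which each design step must be backed by an evidentiary argument.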

Incidentally, the upcoming academic year of 2017/18 is set to mark a further expansion in IOE’s test design & assessment agenda, as Dr. Zelman has recently agreed to take the helm at IOE’s newly created Laboratory for Modern Testing Techniques.

Mark ZELMAN, PhD, Education Assessment and Monitoring Expert, the World Bank

What’s psychometrics all about? Well, thinking figuratively, you might compare it – in a tongue-in-cheek way – with fortunetelling, as psychometrics also involves interpreting an individual’s personal data for a glimpse of their future, though of course we can only wish we were insightful and shrewd enough to make truly exact forecasts.

In essence, psychometrics and assessment have been drawing data, analytical approaches and development methods from a range of knowledge areas. Evidence-centered Design, for example, builds on much of Stephen Toulmin’s philosophical work on argumentation, while also embracing ideas from architectural design, computer science and psychology.

When it comes to ECD, one of the most important points is that this framework is incompatible with conventional assessment construction theories, so for the ECD methodology to be properly applied, testing tools need to be designed from scratch. Because the development process is highly complex – the preliminary scoping and planning stage alone may take several months or even a year – fully fledged ECD-based tools have so far been implemented only in the United States and Russia.

Test Quality Evaluation

The School’s English-language track included two courses of study, each focusing on a specific area of test design and evaluation.

The first course of study, Test Quality Evaluation, was taught by Dr. Bas Hemker of CITO, the Dutch National Institute for Educational Measurement. Alongside his research role with CITO, Dr. Hemker has 10 years’ experience as a leading expert at COTAN, the Dutch Committee on Tests and Testing, where he works on enhancing test review mechanisms and advises projects on improving psychometric design. A seasoned professional well versed in both the developmental and quality-evaluation aspects of educational assessments, Dr. Hemker shared wide-ranging, in-depth insights into the most advanced methods for testing the feasibility and validity of newly created assessment materials.
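To give a flavor of what such quality checks involve, here is a minimal sketch of one routine statistic, Cronbach’s alpha for internal consistency, computed on a small made-up item-response matrix. It is an illustration of a standard psychometric check only, not a reconstruction of CITO’s or COTAN’s actual review procedures.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability for a persons-by-items score matrix.

    scores: 2-D array with shape (n_persons, n_items).
    """
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of test totals
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Made-up responses: 6 test-takers, 4 dichotomously scored items
scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```

A low value on a statistic like this is one of the signals a reviewer would follow up on before judging a new instrument fit for use.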

Bas HEMKER, PhD, Senior Researcher and Large-scale Measurement Team Lead, CITO Dutch National Institute for Educational Measurement

Improving the quality of assessment tools is certainly one of the questions at the top of the psychometric agenda, as it is highly relevant to all stakeholders in education and the economy at large.

Poor-quality testing materials immediately lead to misinformed decisions at all levels. For students, this may mean they are not being taught what they need, and at certain points they themselves may have only a vague idea of what they are actually good at and what must be improved, because we obtain and communicate inaccurate representations of their ability. At the same time, instructors may believe they are teaching the right things when they actually are not. Finally, as inappropriate testing creates a distorted picture of learning outcomes and outlooks, policymakers may be misled about what actually needs to be prioritized when drafting policy upgrades.

So it is truly every party that benefits from advanced psychometric work toward higher-quality testing. First of all, when proper assessment helps children receive the best possible education, we consistently reap the greatest human capital returns. That is also good for teachers, who become more motivated in their role when working with highly engaged, goal-driven and high-performing students. And, in the end, all of these factors add up to build development momentum for the country.

Performance Assessments

As part of the second course of study in English, Dr. Carol Myford, an Associate Professor at the College of Education, University of Illinois at Chicago, focused on today’s leading-edge techniques for constructing performance-based assessments. These are used to evaluate a broad range of academic and corporate training assignments, such as applied project work, creative tasks, public presentations, and essay writing.
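Because performance tasks such as essays and presentations are typically scored by human raters against a rubric, one basic quality question is how closely raters agree. The sketch below computes an exact-agreement rate and Cohen’s kappa for a hypothetical pair of rating vectors; it illustrates the general idea and is not drawn from the course materials.

```python
from collections import Counter

def exact_agreement(r1, r2):
    """Proportion of performances on which the two raters give the same score."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two raters on a categorical rubric."""
    n = len(r1)
    observed = exact_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    categories = set(r1) | set(r2)
    expected = sum(c1[c] * c2[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (0-3) from two raters on ten essays
rater_a = [2, 3, 1, 2, 0, 3, 2, 1, 2, 3]
rater_b = [2, 3, 2, 2, 0, 3, 1, 1, 2, 2]
print(f"Exact agreement: {exact_agreement(rater_a, rater_b):.2f}")
print(f"Cohen's kappa:   {cohen_kappa(rater_a, rater_b):.2f}")
```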

Given today’s rapidly changing industry profiles and labor markets, with a clear call for greater efficiency and productivity, employers increasingly seek clearer evidence of candidates’ general ability and specialist qualifications. To make certain they prepare competent and competitive professionals, educational institutions need to become more flexible and forward-looking in handling data on learning outcomes, so they can respond to shifts in market expectations by promptly tailoring their syllabi and instructional practices. This is where formative performance appraisals can help, providing an accurate and representative snapshot of what has been going right and what must be improved in the learning process to maximize instructional inputs and educational results.


Carol MYFORD, PhD, Associate Professor, College of Education, University of Illinois at Chicago      

The message we should be sending both teachers and parents is that it’s important to strike the right balance between different types of assessment. Sure, there comes a point when we need to stop and evaluate the learning progress attained over a particular timespan, and the conventional summative five-grade assessment system may well apply in this case. But before that point, it’s crucial to make sure the students are getting on-the-go feedback about how they are performing. In this latter case, formative assessments become an indispensable tool of the teacher’s trade, helping teachers understand the real worth and merit of their instructional inputs and revise their approaches where necessary.

Of the people I’ve worked with over the last couple of years, I guess those who’ve been most open to what I teach about formative performance assessments are instructors at vocational education and training institutions. I believe this is because vocational education involves a great deal of skills development work, and there’s a strong need to ensure vocational graduates are well prepared to join the workforce. So I’ve found among vocational leaders and staff a very pronounced desire to create advanced assessment frameworks that would speak to employers.