Centre for Psychometrics and Measurement in Education

 

Psychometrics encompasses methods for measuring and assessing a variety of individual characteristics, including psychological traits, knowledge, competencies, and skills, through the application of statistical techniques.

 

Design and Adaptation

We design and adapt tools for educational and psychological assessment aimed at both children and adults. These tools evaluate constructs such as social-emotional skills and other complex attributes.

Development

We develop statistical and machine learning models that analyse digital footprints, automatically score open-ended and essay assignments, and generate test items.

Research

We conduct research on new constructs, specifically multicomponent latent characteristics of children and adults. We aim to define these constructs, identify relevant indicators, and establish measurement methods that align with contemporary educational needs.

Monitoring

We monitor educational outcomes, social-emotional skills, and various assessments for educational institutions across different levels.

Expertise and Consulting

 

Valid Instruments

We provide educational and commercial organisations with tools that comply with both Russian and international psychometric standards. Our services include thorough assessments and tailored recommendations aimed at enhancing the quality of these instruments.

 

Individual Educational Progress of Students

We monitor students’ academic literacy at the school level, as well as their professional and universal competencies at the university level, ensuring a comprehensive understanding of their educational development.

 

Professional Development

We offer professional development programmes covering various aspects of measurement in education, delivered through our proprietary continuing professional education courses.

Instruments for Modern Education

START-PROGRESS

A range of tools for measuring individual progress in basic literacy (reading, language, and maths) and in subject-specific learning outcomes at school.

4K Test

This assessment focuses on Critical Thinking, Creativity, Communication, and Cooperation among students in primary and secondary education.

IC Literacy Test

This test measures the information and communication competencies of students in Year 9.

DIGLIT

This assessment gauges the digital literacy skills of elementary school students.

Assessment Tools for Social-Emotional Skills

A suite of instruments designed to evaluate social-emotional skills, motivation, and subjective well-being among primary and secondary school students.

Assessment in Primary Education

Reading Literacy Test

An instrument based on Item Response Theory (IRT) methods has been developed to assess the reading literacy of students in Years 2 and 3. This test evaluates comprehension of both fiction and informational texts through a unified scale and a threshold scoring system. Furthermore, the tool assesses the effectiveness of reading strategies and enables the tracking of student progress.

Commissioned by Uchi.ru
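The IRT machinery behind instruments like this can be illustrated with a short sketch of the Rasch (one-parameter logistic) model. This is a generic, hypothetical example, not the Centre's actual scoring code: the item difficulties, response pattern, and Newton-Raphson ability estimate below are assumptions chosen for illustration.

```python
import math

def rasch_probability(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iterations=50):
    """Maximum-likelihood ability estimate via Newton-Raphson.

    Diverges for all-correct or all-incorrect patterns, which real
    implementations handle separately.
    """
    theta = 0.0
    for _ in range(iterations):
        probs = [rasch_probability(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, probs))  # score function
        info = sum(p * (1 - p) for p in probs)               # test information
        theta += grad / info
    return theta

# A pupil answering 4 of 5 items correctly, items of increasing difficulty
responses = [1, 1, 1, 0, 1]
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
print(round(estimate_ability(responses, difficulties), 2))  # ability on the logit scale
```

Because abilities and difficulties sit on one logit scale, estimates from different test forms can be placed on the unified scale the instrument uses for threshold scoring and progress tracking.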

Language Literacy Test

An interactive task format has been designed to monitor the development of language literacy among students in Years 3 to 5. This tool allows students to identify and correct errors in text using a specialised browser that resembles a text editor. This methodology simulates real-world conditions, facilitating effective progress monitoring in teaching proficient writing skills.

Commissioned by Sber

Test of Critical Thinking, Creativity, Communication, and Cooperation

A method for diagnosing soft skills has been developed for students in Years 4 to 7. These skills are essential for successful adaptation in today’s world. The tool provides the flexibility to select which skills to measure—whether one, several, or all four. The 4K test establishes a baseline level of these skills and enables the measurement of progress in their development.

Commissioned by Uchi.ru

Assessment in Secondary Education

Monitoring Individual Progress

A subject knowledge monitoring tool has been developed to track the progress of students in grades 5 to 11, in accordance with the Federal State Educational Standards (FSES). Assessments are conducted several times a year in a computerised format, with all tasks automatically evaluated to ensure objectivity and expedite verification. The tool has been successfully piloted in practice, and its efficacy has been validated by independent experts.

Commissioned by HSE University

Monitoring Information and Communication Competence

The level of information and communication (IC) competence among graduates of schools was assessed. Over 30,000 ninth-grade students from 21 regions of the Russian Federation participated in this monitoring study, alongside more than 12,000 teachers and over 1,100 school administrators. The aim of the study was to evaluate the readiness of school graduates for life in an information society and to identify factors influencing the development of IC competence across different regions of the Russian Federation.

Commissioned by the Foundation for New Forms of Education Development, Federal State Agency for Educational Development

Diagnostics of Urban Literacy

A comprehensive assessment of the readiness of sixth and tenth-grade pupils for life in a megacity was conducted. The evaluation encompassed various aspects, including urban mobility, local literacy, healthy lifestyle choices, utilisation of digital technologies, pro-social involvement, and intercultural interaction.

Commissioned by the Moscow Department of Education

Entrance Testing for the Project ‘Mediaclass in a Moscow School’

Five variants of selection tests were specifically designed for ninth-grade graduates applying to study in media classes. These tests focus on assessing language and information literacy within the context of modern media. Each variant is grounded in a theoretical model that can be employed to evaluate similar constructs.

Commissioned by the Moscow Department of Education

Development of a Conceptual Model for Digital Literacy Research and Testing of Assessment Tasks for 7th Grade Students

A tool for measuring digital literacy has been developed, featuring interactive scenario-based tasks. The context of these tasks immerses students in familiar environments and presents realistic challenges drawn from school or everyday life. More than 4,000 seventh and eighth-grade students from Moscow schools have been assessed using this tool. A demo version of the tasks is available for review.

Commissioned by the Moscow Department of Education

Career-Educational Navigation in Socio-Economic and Institutional Contexts

A battery of career-guidance tests for schoolchildren was reviewed, and recommendations for its improvement were proposed.

Commissioned by the Ticket to the Future Foundation

Assessment in Vocational and Higher Education

Models for Developing and Evaluating General Competencies of Graduates from Specialised Secondary Education

This section outlines the activity tasks and assessment rubrics designed to evaluate the level of universal competencies among students in secondary vocational education, preparing them for employment in a dynamic labour market.

Commissioned by the Ministry of Education of the Russian Federation

Economic Literacy Assessment

A comprehensive system of tasks has been developed to evaluate students' economic literacy, incorporating formative feedback. This tool is grounded in cognitive diagnostic models and features specific scenarios that assess students' economic reasoning and knowledge. The assessment not only measures outcomes but also analyses the decision-making processes involved.

Commissioned by the HSE University

Legal Literacy Assessment

This scenario-based assessment tool is designed for university students and focuses on resolving case studies that reflect real-life situations, with the automatic checking of results. Examples of scenarios include evaluating an employer based on an employment contract and selecting a lawyer.

Commissioned by HSE University

Testological Expertise for the ‘I Am a Professional’ Olympiad Measurement Tools

Support has been provided to test developers for the ‘I am a Professional’ Olympiad, which includes the creation of training documents and test specifications. Experts have analysed these specifications, reviewed the wording of the tests, and conducted psychometric analyses of the results. Recommendations for adjustments to improve next year’s project have been proposed.

Commissioned by the Organising Committee of the ‘I am a Professional’ Olympiad

Assessment of Adult Skills

Diagnostic Tool for Employee Motivation in Organisations

An ipsative diagnostic tool has been developed to assess the factors influencing motivation and demotivation among employees. Utilising Item Response Theory (IRT) methods, this tool provides organisations with insights into employee motivation dynamics, enabling targeted interventions to enhance workplace engagement.

Commissioned by ANO ‘Russia - the Country of Opportunities’

Methodology for Creating Tools to Measure Competences in the Digital Economy

This methodology outlines the development of assessment tools designed to measure digital competences and key skills essential for independent competence evaluation in the digital economy (IQE). It includes formulated quality requirements for these tools and established procedures for their certification within the framework of the National Qualifications System (NOC CE).

Commissioned by University of National Technological Initiative 2035

Pilot Testing of a Digital Literacy Assessment Tool

A new assessment tool has been created to evaluate the digital literacy levels of working-age adults. This tool features automatic feedback mechanisms and is based on scenario-based tasks that test competencies relevant to the digital economy. It has demonstrated high validity and reliability, making it suitable for use within the National Educational Qualification Network focused on digital competences.

Commissioned by ‘University of the National Technological Initiative 2035’

Research on Factors Influencing the Development of Key Digital Competences

An extensive analysis has been conducted on both international and Russian studies regarding the assessment of digital competences. This research employed modern assessment tools, including authentic scenario-based tasks and automated result processing, to identify statistically significant factors that affect the formation and development of basic digital competences among target groups such as students and working-age adults.

Commissioned by HSE University

Assessment of Critical Thinking in a Free Online Environment

A monitoring tool has been developed to evaluate adults' critical thinking abilities. Participants are required to formulate their positions based on independently sourced arguments from the online environment. The assessment is conducted automatically through machine learning technology, which scores responses. This tool can be beneficial for recruitment processes and for assessing the critical thinking development levels of students.

Commissioned by HSE University

Research Projects

Centre for Interdisciplinary Human Potential Research

We focus on developing theoretical and conceptual models, as well as tools for the valid and reliable measurement of key skills and competences in today’s world. Our work includes conducting neurocognitive research to enhance educational practices and analysing assessment results.

Funded by the Ministry of Science and Higher Education of the Russian Federation, Grant No. 075-15-2020-928.

Utilising Contextual Information and Digital Assessment Data to Measure Individual Progress of Primary School Students through Digital Technologies

We have developed a methodology that leverages response times, logbooks, and other digital traces to enhance the validity of assessments and improve feedback on student performance.
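The kind of digital-trace feature this methodology relies on can be sketched in a few lines. The event-log format below is hypothetical, invented for illustration; it is not the project's actual data schema.

```python
import pandas as pd

# Hypothetical event log: one row per logged action in a digital assessment
events = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s2", "s2", "s2"],
    "item":    ["q1", "q1", "q2", "q1", "q1", "q2"],
    "action":  ["open", "answer", "answer", "open", "answer", "answer"],
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00:00", "2024-01-01 09:00:40", "2024-01-01 09:02:10",
        "2024-01-01 09:00:00", "2024-01-01 09:03:05", "2024-01-01 09:04:00",
    ]),
})

# Response time per student-item pair: span between first and last logged action
response_times = (
    events.groupby(["student", "item"])["timestamp"]
    .agg(lambda t: (t.max() - t.min()).total_seconds())
    .rename("seconds")
    .reset_index()
)
print(response_times)
```

Features such as these response times can then be modelled alongside item scores, for instance to flag rapid guessing or disengagement that would otherwise bias ability estimates.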

Exploring the Dynamics of Student Heterogeneity in Primary Schools: Tracking the Movement of Students from At-Risk to Leadership Groups

By analysing Russian data from large-scale longitudinal monitoring studies in elementary schools, we assess student progress at both individual and group levels. Our research tracks group stability, potential migration of students across groups, and identifies statistically significant contextual and personal factors related to these dynamics.

Supported by a grant from the Centre for Fundamental Research at HSE University

Automating Tasks and Evaluating Open-Ended Responses Using Machine Learning

We propose machine learning models to automate the development and verification of tasks related to Russian language and reading literacy at the school level. This project aims to reduce the costs associated with creating unique and varied tasks while implementing a system for the automatic evaluation of open-ended responses in educational testing.

Funded by a grant from the Centre for Artificial Intelligence at HSE University
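A common baseline for the automatic evaluation of open-ended responses is a linear classifier over TF-IDF text features. The sketch below is a minimal illustration of that baseline, not the project's actual models; the answers and rater labels are invented toy data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: open-ended answers hand-labelled by raters (1 = full credit)
answers = [
    "Photosynthesis converts light energy into chemical energy in plants",
    "Plants use sunlight to make glucose from carbon dioxide and water",
    "Plants are green",
    "I do not know",
]
scores = [1, 1, 0, 0]

# TF-IDF features + logistic regression: a standard automatic-scoring baseline
scorer = make_pipeline(TfidfVectorizer(), LogisticRegression())
scorer.fit(answers, scores)

new_answer = ["Sunlight lets plants make glucose from water and carbon dioxide"]
print(scorer.predict(new_answer))
```

Production scoring systems typically replace the bag-of-words features with pretrained language-model embeddings and validate the automatic scores against human raters before deployment.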

Researching Age-Specific Characteristics and Psychological-Pedagogical Conditions for the Formation and Development of Critical Thinking

Our research identifies and systematises the psychological and pedagogical conditions that foster the development of critical thinking, taking into account age-specific characteristics and the social context of students' development.

Supported by a grant from the Centre for Fundamental Research at HSE University

We Teach

Continuing Education

 

Continuing Professional Education Programmes

These programmes focus on foundational skills in psychometric analysis and test development, alongside advanced measurement techniques applicable in education, psychology, and other social sciences.

 

International Summer School: Applied Psychometrics in Psychology and Education

This event convenes researchers and assessment practitioners to examine contemporary theoretical and practical trends while fostering professional connections.

Our Seminars

Interdisciplinary Neuroscience of Education Workshop

The aim is to build a new scientific field focused on using basic science to improve education.

Every three weeks, hybrid format

Centre for Psychometrics and Measurement in Education seminar

Aimed at social science researchers and focused on measurement methods in education and psychology. The seminar is attended by Centre staff and invited experts.

International Scientific Online Seminar ‘Measurement and Data Analysis in Psychology and Education’

Organised in partnership with the Moscow State University of Psychology and Education. We discuss issues of measurement in psychology and education, evidence-based research, and big data analysis in education.

Our Departments

 

In Memory of Mark Zelman

1961–2022

Mark Zelman was one of the world's leading experts in the field of educational assessment and quality monitoring. He was involved in the creation of many tests for different countries, including the Graduate Record Examination (GRE), the Graduate Management Admission Test (GMAT), the Prueba de Admisión para Estudios Graduados (PAEG), the Praxis Series, NAEP, and SAT I. Mark was an assessment expert for major corporations such as the Educational Testing Service (ETS), USAID, British Council, IADB, American Councils, and the World Bank, and chaired many expert panels.

He was fluent in six languages. Mark is a great loss to us: all these years he remained a friend, partner and advisor to the Centre for Psychometrics and Measurement in Education, speaking at our Summer Schools, teaching in our educational programmes and helping us to develop Russian instruments.

The Staff

Our specialists have trained at some of the world's leading psychometric centres: CITO, Boston College, the University of Massachusetts, Durham University, the University of California, Berkeley, and the University of Illinois at Chicago.

Contact: Ksenia Tarasova
Email: ktarasova@hse.ru

Heads of the Centre

Viktor Bolotov

Academic Supervisor

Elena Kardanova

Academic Supervisor

Alina Ivanova

Senior Research Fellow, Deputy Director

Sergey Tarasov

Junior Research Fellow, Deputy Director

Svetlana Avdeeva

Laboratory Head

Inna Antipkina

Laboratory Head, Research Fellow

The Staff

Ekaterina Bakay

Research Assistant

Daria Gracheva

Junior Research Fellow

Tatjana Kanonire

Senior Research Fellow

Ekaterina Pavlova

Research Assistant

Maria Pavlova

Research Assistant

Anna Popova

Research Assistant, Doctoral Student

Alexandra Strukova

Research Assistant

Daniil Talov

Research Assistant

Elen Yusupova

Research Fellow

Publications

  • Book

    The First Year at School: An International Perspective

    This book explores an under-researched but vital part of education: the first year at primary/elementary school. The work shows that children’s progress varies enormously from school to school, class to class and child to child. This variation is important because the more progress that children make in that first year of school, the higher their academic attainment at the end of compulsory schooling.

    The iPIPS (international Performance Indicators in Primary Schools) project, upon which this book is based, has been able to provide deeper insights into some of the key issues within and across different contexts whilst highlighting new and some ongoing issues. Despite all the work there remain unanswered or new puzzling issues which are also explored.

     

    We need to know how to improve the education at that stage and, more broadly, we need greater clarity about when children should be taught to read and be introduced to formal arithmetic, in other words, when they should start school.  We also need to be clearer about whether, when and how young children should be assessed. The book will suggest some answers but it will raise important questions and dilemmas for which we do not, as yet, have answers.

    Switzerland: Springer, 2023.

  • Article

    Belyaeva A.

How Students Behave While Solving Critical Thinking Tasks in an Unconstrained Online Environment: Insights from Process Mining

To learn successfully using various internet resources, students must acquire critical thinking skills that enable them to critically search for, evaluate, select, and verify information online. Defined as Critical Online Reasoning, this complex latent construct manifests itself in an unconstrained online environment and is measured on two levels: the student's work product (the essay) and the process of task completion (online behaviour patterns). This study investigates whether students' successful and unsuccessful test attempts can be distinguished using process mining techniques. The findings are based on generalised behaviour patterns produced by a process mining algorithm applied to two groups of students (63 low-performing and 45 high-performing). Divided by work-product score, the two groups showed differences in their online behaviour, with high performers displaying more strategic behaviour and more effective search for and use of information online. However, the study also revealed the limitations of process mining as a tool for generalising process patterns. In the conclusion, the author suggests future directions for approaching the task of pattern extraction.

Educational Studies. 2024. P. 4–26.

  • Book chapter

    Ivanova A., Kuzmina Y.

    The Effects of Class Composition on First-Graders’ Mathematics and Reading Results: Two Countries’ Cases

    This section considers the issue of class composition in primary schools and its effect on pupils’ educational progress. It contains the results of two independent studies conducted in two countries with quite different contexts – Russia and the Netherlands.

    In bk.: The First Year at School: An International Perspective. Switzerland: Springer, 2023. P. 293-305.

  • Working paper

    Kardanova E., Ivanova A., Tarasova K. et al.

    A Novel Psychometrics-Based Approach to Developing Professional Competency Benchmark for Large Language Models

The era of large language models (LLMs) raises questions not only about how to train models but also about how to evaluate them. Despite numerous existing benchmarks, insufficient attention is often given to creating assessments that test LLMs in a valid and reliable manner. To address this challenge, we adopt the Evidence-Centered Design (ECD) methodology and propose a comprehensive approach to benchmark development based on rigorous psychometric principles. In this paper, we make a first attempt to illustrate this approach by creating a new benchmark in the field of pedagogy and education, highlighting the limitations of existing benchmark development approaches and taking into account the development of LLMs. We conclude that a new approach to benchmarking is required to match the growing complexity of AI applications in the educational context. We construct a novel benchmark guided by Bloom's taxonomy and rigorously designed by a consortium of education experts trained in test development. The benchmark thus provides an academically robust and practical assessment tool tailored for LLMs rather than human participants. Tested empirically on the GPT model in the Russian language, it evaluates model performance across varied task complexities, revealing critical gaps in current LLM capabilities. Our results indicate that while generative AI tools hold significant promise for education, potentially supporting tasks such as personalized tutoring, real-time feedback, and multilingual learning, their reliability as autonomous teachers' assistants currently remains rather limited, particularly in tasks requiring deeper cognitive engagement.

    Computation and Language (cs.CL); Artificial Intelligence (cs.AI). cs.CL. arXiv, 2024

All publications