Combining propensity scores and common items for test score equating
Umeå University, Faculty of Social Sciences, Department of applied educational science. ORCID iD: 0000-0001-7282-5384
School of Mathematical Sciences, Lancaster University, United Kingdom.
Umeå University, Faculty of Social Sciences, Umeå School of Business and Economics (USBE), Statistics. ORCID iD: 0000-0001-5549-8262
2025 (English). In: Applied psychological measurement, ISSN 0146-6216, E-ISSN 1552-3497, article id 01466216251363240. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

Ensuring that test scores are fair and comparable across different test forms and different test groups is a significant statistical challenge in educational testing. Methods to achieve score comparability, a process known as test score equating, often rely on including common test items or on assuming that test taker groups are similar in key characteristics. This study explores a novel approach that combines propensity scores, based on test takers' background covariates, with information from common items using kernel smoothing techniques for binary-scored test items. An empirical analysis using data from a high-stakes college admissions test evaluates the standard errors and differences in adjusted test scores. A simulation study examines the impact of factors such as the number of test takers, the number of common items, and the correlation between covariates and test scores on the method's performance. The findings demonstrate that integrating propensity scores with common item information reduces standard errors and bias more effectively than using either source alone. This suggests that balancing the groups on the test takers' covariates enhances the fairness and accuracy of test score comparisons across different groups. The proposed method highlights the benefits of using all the collected data to improve score comparability.
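The general idea in the abstract can be illustrated with a minimal sketch: this is NOT the authors' method or code, and the function names, the single covariate, the Gaussian kernel, the bandwidth `h=0.6`, and the toy data are all assumptions made here for illustration. One group (P) takes form X and another group (Q) takes form Y; inverse-odds propensity weights from a logistic regression on the background covariate reweight group Q toward P's covariate distribution before kernel-smoothed equipercentile equating.

```python
# Illustrative sketch only -- not the authors' implementation.
import math

import numpy as np

# Standard normal CDF, vectorised via math.erf (avoids a SciPy dependency).
phi = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))


def propensity_weights(cov_p, cov_q, steps=300, lr=0.5):
    """Weight each Q test taker by the odds of belonging to P given covariates."""
    X = np.column_stack([np.ones(len(cov_p) + len(cov_q)),
                         np.concatenate([cov_p, cov_q])])
    y = np.concatenate([np.ones(len(cov_p)), np.zeros(len(cov_q))])
    w = np.zeros(X.shape[1])
    for _ in range(steps):  # plain gradient ascent on the log-likelihood
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    p_q = 1.0 / (1.0 + np.exp(-(X[len(cov_p):] @ w)))
    return p_q / (1.0 - p_q)  # odds of membership in P, used as weights


def kernel_cdf(grid, scores, weights, h=0.6):
    """Gaussian-kernel-smoothed, weight-adjusted score CDF on a grid."""
    w = weights / weights.sum()
    return (w[None, :] * phi((grid[:, None] - scores[None, :]) / h)).sum(axis=1)


def equate(scores_x, scores_y, weights_y, h=0.6):
    """Equipercentile map: each form-X score -> form-Y score of equal rank."""
    lo = min(scores_x.min(), scores_y.min()) - 3.0
    hi = max(scores_x.max(), scores_y.max()) + 3.0
    grid = np.linspace(lo, hi, 2001)
    f_x = kernel_cdf(grid, scores_x, np.ones(len(scores_x)), h)
    f_y = kernel_cdf(grid, scores_y, weights_y, h)
    return np.interp(np.interp(scores_x, grid, f_x), f_y, grid)


# Toy data: Q's higher covariate mean inflates its scores, which naive
# equating would misread as a difference in form difficulty.
rng = np.random.default_rng(1)
cov_p, cov_q = rng.normal(0.0, 1.0, 500), rng.normal(0.5, 1.0, 500)
scores_x = 20 + 3 * cov_p + rng.normal(0, 2, 500)   # group P, form X
scores_y = 22 + 3 * cov_q + rng.normal(0, 2, 500)   # group Q, form Y
weights = propensity_weights(cov_p, cov_q)
equated = equate(scores_x, scores_y, weights)
```

In the weighted CDF, Q test takers with covariate values typical of P get upweighted, so the equating function compares percentile ranks between groups that are balanced on the covariate; the paper's actual method additionally combines this with common-item information, which this sketch omits.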

Place, publisher, year, edition, pages
Sage Publications, 2025. Article id 01466216251363240.
Keywords [en]
academic admission, educational testing, equating, fairness, nonequivalent groups with anchor test design
National Category
Probability Theory and Statistics; Psychology (Excluding Applied Psychology)
Identifiers
URN: urn:nbn:se:umu:diva-243387
DOI: 10.1177/01466216251363240
ISI: 001539586300001
PubMedID: 40757034
Scopus ID: 2-s2.0-105012775902
OAI: oai:DiVA.org:umu-243387
DiVA, id: diva2:1990597
Funder
Marianne and Marcus Wallenberg Foundation, 2019.0129
Available from: 2025-08-20. Created: 2025-08-20. Last updated: 2025-08-20.

Open Access in DiVA

fulltext (2936 kB), 79 downloads
File information
File name: FULLTEXT01.pdf. File size: 2936 kB. Checksum (SHA-512):
80f11b532258edc83cfdbec14ab776a75be098ab411af0f7981b3074cfeb6ccd1e8677f9f2287b51c229bfae7906d25949efec44fb64876424f0385348cf2d91
Type: fulltext. Mimetype: application/pdf.

Other links

Publisher's full text; PubMed; Scopus

Authority records

Laukaityte, Inga; Wiberg, Marie
Total: 79 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 886 hits