A perfect score: Validity arguments for college admission tests
Umeå University, Faculty of Social Sciences, Department of Educational Measurement.
2009 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

College admission tests are of great importance for admissions systems in general and for candidates in particular. The SweSAT (Högskoleprovet in Swedish) has been used for college admission in Sweden for more than 30 years, and today it is, alongside the upper-secondary school GPA, the most widely used instrument for the selection of college applicants. Because of the importance placed on the SweSAT, it is essential that the scores are reliable and that the interpretations and uses of the scores are valid. The main purpose of this thesis was therefore to examine some assumptions that are important for the validity of the interpretation and use of SweSAT scores. The argument-based approach to validation was used as the framework for evaluating these assumptions.

The thesis consists of four papers and an extensive introduction with summaries of the papers. The first three papers examine assumptions that are relevant to the use of SweSAT scores for admission decisions, while the fourth paper examines an assumption that is relevant to the use of SweSAT scores for providing diagnostic information.

The first paper is a review of predictive validity studies that have been performed on the SweSAT. The general conclusion from the review is that the predictive validity of SweSAT scores varies greatly among study programs, and that there are many problematic issues related to the methodology of the predictive validity studies. The second paper focuses on an assumption underlying the current SweSAT equating design, namely that the groups taking different forms of the test are of equal ability. The results show that this assumption is highly problematic, and consequently a more appropriate equating design should be applied when equating SweSAT scores. The third paper examines the effect of textual item revisions on item statistics and preequating outcomes, using data from the SweSAT data sufficiency subtest. The results show that most kinds of revisions have a significant effect on both p-values and point-biserial correlations, and as a consequence the preequating outcomes are affected negatively. The fourth paper examines whether there is added value in reporting subtest scores, rather than just the total score, to the test-takers. Using a method derived from classical test theory, the results show that all observed subscores are better predictors of the true subscores than the observed total score is, with the exception of the Swedish reading comprehension subtest. That is, the subscores contain information that test-takers can use for remedial studies, and hence there is added value in reporting them.

The general conclusion of the thesis as a whole is that the interpretations and uses of SweSAT scores rest on several questionable assumptions, but also that they are supported by a great deal of validity evidence.

Place, publisher, year, edition, pages
Umeå: Institutionen för Beteendevetenskapliga mätningar, Umeå Universitet, 2009. 58 p.
Series
Academic dissertations at the Department of Educational Measurement, ISSN 1652-9650; 4
Keyword [en]
college admission tests, SweSAT, validity, interpretive arguments, predictive validity, equating, item revisions, subscores
Identifiers
URN: urn:nbn:se:umu:diva-25433
OAI: oai:DiVA.org:umu-25433
DiVA: diva2:231769
Distributor:
Beteendevetenskapliga mätningar, 901 87 Umeå
Public defence
2009-09-25, Hörsal E, Humanisthuset, Umeå Universitet, Umeå, 13:15 (English)
Available from: 2009-09-04. Created: 2009-08-17. Last updated: 2009-09-04. Bibliographically approved.
List of papers
1. Prediction of academic performance by means of the Swedish Scholastic Assessment Test
2008 (English). In: Scandinavian Journal of Educational Research, ISSN 0031-3831, E-ISSN 1470-1170, Vol. 52, no. 6, pp. 565-581. Article in journal (Refereed). Published.
Abstract [en]

This article reviews ten predictive validity studies of the Swedish Scholastic Assessment Test (SweSAT). A primary result is that the predictive validity of the SweSAT seems to be highly dependent on the study programme being examined; that is, the predictive validity is better for some programmes than for others. When compared with the upper-secondary school grade point average, the predictive validity of the SweSAT seems to be fairly good, but there are major differences between study programmes in this case as well. However, the validity of these results is to some extent threatened by methodological issues. A general conclusion is therefore that there is room for improving the test itself, as well as the way predictive validity studies are carried out.
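
Predictive validity in studies of this kind is typically operationalised as the correlation between test scores and a criterion such as first-year GPA, often corrected for range restriction because criterion data exist only for admitted candidates. The following is a minimal illustrative sketch of that computation, not the procedure of the reviewed studies; the simulated data, variable names, and the choice of Thorndike's Case 2 correction are assumptions.

```python
import numpy as np

def predictive_validity(test, criterion, sd_unrestricted=None):
    """Correlation between test scores and a criterion, optionally
    corrected for direct range restriction on the test (Thorndike Case 2)."""
    r = np.corrcoef(test, criterion)[0, 1]
    if sd_unrestricted is None:
        return r
    u = sd_unrestricted / np.std(test, ddof=1)  # unrestricted / restricted SD
    return r * u / np.sqrt(1 - r**2 + r**2 * u**2)

# Illustrative simulation: only the top 30% on the test are admitted,
# so the criterion (GPA) is observed for a range-restricted group.
rng = np.random.default_rng(0)
ability = rng.normal(size=5000)
test = ability + rng.normal(scale=0.6, size=5000)
gpa = 0.5 * ability + rng.normal(scale=0.8, size=5000)
admitted = test > np.quantile(test, 0.7)

r_restricted = predictive_validity(test[admitted], gpa[admitted])
r_corrected = predictive_validity(test[admitted], gpa[admitted],
                                  sd_unrestricted=np.std(test, ddof=1))
print(f"restricted r = {r_restricted:.2f}, corrected r = {r_corrected:.2f}")
```

The corrected coefficient is noticeably larger than the restricted one, which is one of the methodological issues such reviews must weigh: reported validity coefficients depend heavily on whether and how a correction was applied.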

Place, publisher, year, edition, pages
Taylor & Francis/Routledge, 2008
Keyword
admission system, higher education, prediction, validity
National Category
Pedagogy; Probability Theory and Statistics; Other Social Sciences not elsewhere specified
Research subject
Education; Statistics
Identifiers
URN: urn:nbn:se:umu:diva-23596
DOI: 10.1080/00313830802497158
Available from: 2009-06-26. Created: 2009-06-26. Last updated: 2017-12-13.
2. Systematic equating error with the randomly-equivalent groups design: An examination of the equal ability distribution assumption
(English). Manuscript (preprint) (Other academic)
Abstract [en]

The equal ability distribution assumption associated with the randomly-equivalent groups equating design was investigated in the context of a selection test for admission to higher education. Test-takers' scores on anchor items from two subtests were estimated using information about test-taker performance on the regular subtests. The results indicated that performance on the anchor test items varied sufficiently for the equal ability distribution assumption to be questioned. Our conclusion is consequently that more caution is needed when applying the randomly-equivalent groups design in the equating of tests. Assuming groups of equal ability is convenient, but it can also lead to systematic bias in the equating of test scores, and this study provides a demonstration of that point.
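
Under the randomly-equivalent groups design, form X scores are placed on the form Y scale using only the two groups' observed score distributions, for example with linear equating. A small simulation (entirely illustrative; the numbers are not from the paper) shows how a true ability difference between the groups is silently absorbed into the conversion as systematic equating error:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# The design assumes the groups are equivalent, but group B is stronger.
ability_a = rng.normal(0.0, 1.0, n)   # takes form X
ability_b = rng.normal(0.3, 1.0, n)   # takes form Y (higher true mean)

# Identical forms for simplicity: any score difference is pure group difference.
x = ability_a + rng.normal(0, 0.5, n)
y = ability_b + rng.normal(0, 0.5, n)

# Linear equating under the random-groups design:
#   eq(x) = mean(y) + sd(y)/sd(x) * (x - mean(x))
eq = y.mean() + y.std(ddof=1) / x.std(ddof=1) * (x - x.mean())

# Since the forms are identical, eq(x) should equal x; the shift is bias.
print(f"systematic bias at the mean: {eq.mean() - x.mean():+.2f} score units")
```

The entire 0.3-unit group difference is misattributed to form difficulty, which is exactly the kind of systematic error an anchor test (as examined in the paper) can detect.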

Keyword
equating error, randomly-equivalent groups design, anchor tests, college admission tests
Identifiers
URN: urn:nbn:se:umu:diva-25430
Available from: 2009-08-17. Created: 2009-08-17. Last updated: 2010-01-14.
3. The effect of item revisions on classical item statistics and preequating outcomes
(English). Manuscript (preprint) (Other academic)
Abstract [en]

The purpose of this study was to examine the effect of textual item revisions on classical item statistics and the adequacy of preequating outcomes. Three forms of the SweSAT data sufficiency subtest, comprising a total of 66 items, were examined. The items were categorized by type of revision, and the differences in p-values and point-biserial correlations between the regular test and the pretest were averaged within each category. These averages were subjected to a t-test, which showed that revisions have a significant effect on both difficulty and discrimination indices. Also, while the preequating method used in this study produced adequate results, the revisions appear to increase the amount of error in preequating outcomes.
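
The two item statistics in question are standard classical test theory quantities: the p-value (proportion of test-takers answering correctly) and the point-biserial correlation (correlation of the 0/1 item score with the rest of the test). A minimal sketch under simulated data; the Rasch-style data generator, item counts, and the modelling of a revision as a difficulty shift are illustrative assumptions, not the paper's design:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def simulate_responses(n_persons, difficulties):
    """0/1 responses from a simple Rasch-like model (illustrative data only)."""
    theta = rng.normal(size=(n_persons, 1))              # person abilities
    prob = 1.0 / (1.0 + np.exp(-(theta - difficulties)))
    return (rng.random(prob.shape) < prob).astype(int)

def item_stats(responses):
    """p-value (proportion correct) and corrected point-biserial
    (item score vs. rest score) for a persons x items 0/1 matrix."""
    p = responses.mean(axis=0)
    total = responses.sum(axis=1)
    pbis = np.array([np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
                     for j in range(responses.shape[1])])
    return p, pbis

# Same hypothetical items pretested and then administered after revision;
# the revision is modelled here as making the items slightly easier.
b = rng.normal(size=(1, 22))
p_pre, pb_pre = item_stats(simulate_responses(500, b))
p_op, pb_op = item_stats(simulate_responses(800, b - 0.3))

# Paired t-test on the per-item p-value differences.
t, pval = stats.ttest_rel(p_op, p_pre)
print(f"mean p-value change = {(p_op - p_pre).mean():+.3f}, "
      f"t = {t:.2f}, p = {pval:.3f}")
```

If pretest statistics shift like this after revision, item parameters estimated at pretest no longer describe the operational items, which is why preequating based on them accumulates error.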

Keyword
achievement testing, textual revisions, item statistics, item pretesting, preequating
Identifiers
URN: urn:nbn:se:umu:diva-25431
Available from: 2009-08-17. Created: 2009-08-17. Last updated: 2010-01-14.
4. Reporting subscores from college admission tests
2009 (English). In: Practical Assessment, Research & Evaluation, ISSN 1531-7714, E-ISSN 1531-7714, Vol. 14, no. 4. Article in journal (Refereed). Published.
Abstract [en]

The added value of reporting subscores on a college admission test (SweSAT) was examined in this study. Using an objective method derived from classical test theory (CTT) for determining the value of reporting subscores, it was concluded that there is added value in reporting section scores (Verbal/Quantitative) as well as subtest scores. These results differ from a study of the SAT I and a study of a basic skills test, and thus highlight the need for practitioners and researchers to gather empirical evidence to support the reporting of subscores. The cause of the disparate results seems to be related to differences in the composition of the tests rather than differences in the composition of the examinee groups.
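
A widely used CTT criterion of this kind is Haberman's comparison of proportional reduction in mean squared error (PRMSE): a subscore has added value only if the observed subscore predicts its own true score better than the observed total score does. Whether this is the exact method of the paper is not stated in the record, so the sketch below is an assumption about the method family, and the input numbers are purely illustrative:

```python
def prmse_comparison(var_s, var_x, cov_sx, alpha_s):
    """Haberman-style added-value check for one subscore s within total x.

    Under CTT, the PRMSE of the observed subscore as a predictor of the
    true subscore equals the subscore reliability alpha_s.  For the total,
    uncorrelated errors give cov(true_s, x) = cov(s, x) - error variance
    of s, so PRMSE(x) = cov(true_s, x)^2 / (var(true_s) * var(x)).
    """
    prmse_sub = alpha_s
    var_true_s = alpha_s * var_s
    cov_true_sx = cov_sx - (1 - alpha_s) * var_s
    prmse_tot = cov_true_sx**2 / (var_true_s * var_x)
    return prmse_sub, prmse_tot

# Illustrative numbers (not SweSAT estimates):
prmse_sub, prmse_tot = prmse_comparison(var_s=25.0, var_x=100.0,
                                        cov_sx=40.0, alpha_s=0.80)
print(f"PRMSE(subscore) = {prmse_sub:.2f}, PRMSE(total) = {prmse_tot:.2f}, "
      f"added value: {prmse_sub > prmse_tot}")
```

With these inputs the subscore wins (0.80 vs. 0.61), i.e. reporting it would be justified; a highly correlated or unreliable subscore reverses the verdict, which is why the answer can differ across tests such as the SweSAT and the SAT I.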

National Category
Psychology; Pedagogy
Identifiers
URN: urn:nbn:se:umu:diva-25432
Available from: 2009-08-17. Created: 2009-08-17. Last updated: 2017-12-13. Bibliographically approved.

Open Access in DiVA

fulltext (358 kB), 1075 downloads
File information
File name: FULLTEXT01.pdf
File size: 358 kB
Checksum (SHA-512): 01559964e39184a082d5ad1cc8f6d1f95c8cb2110a0834badbeafa14eaf814f03bd85731f5678b47f89e16d90d7dfcf57ad8ac0e339f7fe8b87082cb8968a539
Type: fulltext
Mimetype: application/pdf
