Within-item response processes as indicators of test-taking effort and motivation
Umeå University, Faculty of Social Sciences, Department of applied educational science.
Umeå University, Faculty of Social Sciences, Department of applied educational science. ORCID iD: 0000-0002-4630-6123
2020 (English). In: Educational Research and Evaluation, ISSN 1380-3611, E-ISSN 1744-4187, Vol. 26, no. 5-6. Article in journal (Refereed). Published.
Abstract [en]

The present study used process data from a computer-based problem-solving task as indicators of the behavioural level of test-taking effort, and explored how behavioural item-level effort related to overall test performance and self-reported effort. Variables were extracted from raw process data and clustered. Four distinct clusters were obtained and characterised as high effort, medium effort, low effort, and planner. Regression modelling indicated that among students who failed to solve the task, the level of effort invested before giving up positively predicted overall test performance. Among students who solved the task, level of effort was instead weakly negatively related to test performance. A low level of behavioural effort before giving up the task was also related to lower self-reported effort. Results suggest that effort invested before giving up provides information about test-takers’ motivation to spend effort on the test. We conclude that process data could augment existing methods of assessing test-taking effort.
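
To make the approach concrete, the following is a minimal sketch of the kind of pipeline the abstract describes: deriving effort indicators from raw log events, clustering students into effort profiles, and regressing test performance on cluster membership. The feature names, file names, and the choice of k-means with four clusters are illustrative assumptions, not the study's actual procedure.

# Illustrative sketch (not the authors' code): summarise raw log events into
# per-student effort indicators, cluster them into four effort profiles, and
# relate cluster membership to overall test performance.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def extract_features(log: pd.DataFrame) -> pd.DataFrame:
    """Expects columns: student_id, timestamp (seconds), event (action label)."""
    grouped = log.sort_values("timestamp").groupby("student_id")
    return pd.DataFrame({
        "time_on_task": grouped["timestamp"].agg(lambda t: t.max() - t.min()),
        "n_actions": grouped["event"].size(),
    })

features = extract_features(pd.read_csv("item_logs.csv"))        # hypothetical file
X = StandardScaler().fit_transform(features)

# Four clusters, mirroring the high/medium/low effort and planner profiles.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Relate behavioural effort (cluster membership) to overall test score;
# in the study this was examined separately for solvers and non-solvers.
scores = pd.read_csv("test_scores.csv")["score"]                  # hypothetical file, rows aligned with features
dummies = pd.get_dummies(clusters, prefix="cluster", drop_first=True)
model = LinearRegression().fit(dummies, scores)
print(dict(zip(dummies.columns, model.coef_)))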

Place, publisher, year, edition, pages
Taylor & Francis Group, 2020. Vol. 26, no. 5-6
Keywords [en]
computer-based assessment, PISA 2012, problem solving, process data, test-taking effort, test-taking motivation
National Category
Learning
Identifiers
URN: urn:nbn:se:umu:diva-186835
DOI: 10.1080/13803611.2021.1963940
ISI: 000684127100001
Scopus ID: 2-s2.0-85112337645
OAI: oai:DiVA.org:umu-186835
DiVA id: diva2:1590752
Available from: 2021-09-03. Created: 2021-09-03. Last updated: 2023-04-19. Bibliographically approved.
In thesis
1. Exploring and modeling response process data from PISA: inferences related to motivation and problem-solving
2023 (English). Doctoral thesis, comprehensive summary (Other academic).
Alternative title [sv]
Modellering av responsprocessdata från PISA: inferenser relaterade till motivation och problemlösning
Abstract [en]

This thesis explores and models response process data from large-scale assessments, focusing on test-taking motivation, problem-solving strategies, and questionnaire response validity. It consists of four studies, all using data from PISA (the Programme for International Student Assessment).

Study I processed and clustered log-file data to create a behavioural evaluation of students' effort applied to a PISA problem-solving item, and examined the relationship between students' behavioural effort, self-reported effort, and test performance. Results show that effort invested before leaving the task unsolved was positively related to performance, while effort invested before solving the task was not. Low effort before leaving the task unsolved was further related to lower self-reported effort. The findings suggest that test-taking motivation can be validly measured only from effort exerted before giving up.

Study II used response process data to infer students' problem-solving strategies on a PISA problem-solving task, and investigated the efficiency of those strategies and their relationship to PISA performance. A text classifier trained on data from a generative computational model was used to identify the different strategies, reaching a classification accuracy of 0.72, which increased to 0.90 with item design changes. The most efficient strategies used information from the task environment to make plans. Test-takers classified as selecting actions randomly performed worse overall. The study concludes that computational modeling can inform score interpretation and item design.
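
As a rough, self-contained illustration of this idea, the sketch below simulates labelled action sequences from a toy generative model of two strategies and trains an n-gram text classifier on them; the action vocabulary, the strategies, and the classifier choice are assumptions and not the study's implementation.

# Illustrative sketch (assumptions, not the study's model): generate labelled
# action sequences from a toy strategy simulator, train an n-gram classifier,
# and apply it to an observed action sequence.
import random
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

ACTIONS = ["read_map", "plan_route", "move", "backtrack", "submit"]
rng = random.Random(0)

def simulate_sequence(strategy: str, length: int = 12) -> str:
    # Toy generative model: planners consult the task environment before
    # acting; random responders pick actions uniformly.
    if strategy == "planner":
        seq = ["read_map", "plan_route"] + rng.choices(["move", "read_map"], k=length - 3) + ["submit"]
    else:  # "random"
        seq = rng.choices(ACTIONS, k=length)
    return " ".join(seq)

labels = ["planner", "random"] * 500
sequences = [simulate_sequence(s) for s in labels]

# Unigrams and bigrams over action tokens capture short-range order information.
clf = make_pipeline(
    CountVectorizer(ngram_range=(1, 2), token_pattern=r"[^ ]+"),
    LogisticRegression(max_iter=1000),
)
clf.fit(sequences, labels)

# Classify a (hypothetical) observed sequence from the log files.
print(clf.predict(["read_map plan_route move move submit"]))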

Study III investigated the relationship between motivation to answer the PISA student questionnaire and test performance. Departing from the theory of satisficing in surveys, a Bayesian finite mixture model was developed to assess questionnaire-taking motivation. Results showed that overall motivation was high but decreased toward the end of the questionnaire. Questionnaire-taking motivation was positively related to performance, suggesting that it could serve as a proxy for test-taking motivation; however, reading skills may affect the estimation.

Study IV examines the validity of composite scores assessing reading metacognition, using a Bayesian finite mixture model that jointly considers response times and sequential patterns in subitem responses. The results show that relatively high levels of satisficing (up to 30%) negatively biased composite scores. The study highlights the importance of considering response-time data and subitem response patterns when evaluating the validity of scores from the student questionnaire.
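
As a rough illustration of the finite mixture idea behind Studies III and IV, the sketch below fits a two-component Bayesian lognormal mixture to item response times with PyMC, reading the faster component as satisficing and the slower one as engaged responding. The priors, the two-component choice, and the restriction to response times alone (the studies also model subitem response patterns and use their own specifications) are assumptions for illustration.

# Illustrative sketch (assumptions, not the studies' model): a two-component
# Bayesian lognormal mixture over item response times.
import numpy as np
import pymc as pm

rt = np.loadtxt("response_times.txt")  # hypothetical file: seconds per response

with pm.Model() as mixture:
    w = pm.Dirichlet("w", a=np.ones(2))                        # mixing weights
    mu = pm.Normal("mu", mu=[1.0, 3.0], sigma=1.0, shape=2)    # log-scale means
    sigma = pm.HalfNormal("sigma", sigma=1.0, shape=2)
    components = pm.LogNormal.dist(mu=mu, sigma=sigma, shape=2)
    pm.Mixture("rt_obs", w=w, comp_dists=components, observed=rt)
    idata = pm.sample(1000, tune=1000, random_seed=0)

# The posterior weight on the faster component gives a rough estimate of the
# satisficing proportion (ignoring label switching, which a full analysis
# would handle with an ordering constraint).
print(idata.posterior["w"].mean(dim=("chain", "draw")).values)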

In conclusion, response process data from international large-scale assessments can provide valuable insights into test-takers’ motivation, problem-solving strategies, and questionnaire validity.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2023. p. 53
Series
Academic dissertations at the Department of Educational Measurement, ISSN 1652-9650 ; 15
Keywords
response processes, large-scale assessments, motivation, problem-solving, computational modeling, Bayesian modeling
National Category
Other Social Sciences not elsewhere specified
Research subject
didactics of educational measurement
Identifiers
URN: urn:nbn:se:umu:diva-206866
ISBN: 978-91-8070-058-0
ISBN: 978-91-8070-057-3
Public defence
2023-05-17, Aula Biologica, Umeå, 10:00 (English)
Available from: 2023-04-26. Created: 2023-04-19. Last updated: 2024-07-02. Bibliographically approved.

Open Access in DiVA

fulltext (3594 kB)
File name: FULLTEXT02.pdf. File size: 3594 kB. Type: fulltext. Mimetype: application/pdf.

Other links

Publisher's full text
Scopus

Authority records

Lundgren, Erik; Eklöf, Hanna
