umu.se Publications
1 - 6 of 6
  • 1.
    Lundgren, Erik
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Department of Educational Measurement.
    Exploring and modeling response process data from PISA: inferences related to motivation and problem-solving, 2023. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis explores and models response process data from large-scale assessments, focusing on test-taking motivation, problem-solving strategies, and questionnaire response validity. It consists of four studies, all using data from PISA (the Programme for International Student Assessment).

    Study I processed and clustered log-file data to create a behavioral evaluation of students' effort applied to a PISA problem-solving item, and examined the relationship between students' behavioral effort, self-reported effort, and test performance. Results show that effort invested before leaving the task unsolved was positively related to performance, while effort invested before solving the task was not. Low effort before leaving the task unsolved was further related to lower self-reported effort. The findings suggest that test-taking motivation could only be validly measured from effort exerted before giving up.

    Study II used response process data to infer students' problem-solving strategies on a PISA problem-solving task, and investigated the efficiency of strategies and their relationship to PISA performance. A text classifier trained on data from a generative computational model was used to retrieve different strategies, reaching a classification accuracy of 0.72, which increased to 0.90 with item design changes. The most efficient strategies used information from the task environment to make plans. Test-takers classified as selecting actions randomly performed worse overall. The study concludes that computational modeling can inform score interpretation and item design.

    Study III investigated the relationship between motivation to answer the PISA student questionnaire and test performance. Departing from the theory of satisficing in surveys, a Bayesian finite mixture model was developed to assess questionnaire-taking motivation. Results showed that overall motivation was high but decreased toward the end of the questionnaire. Questionnaire-taking motivation was positively related to performance, suggesting that it could serve as a proxy for test-taking motivation; however, reading skills may affect the estimation.

    Study IV examines the validity of composite scores assessing reading metacognition, using a Bayesian finite mixture model that jointly considers response times and sequential patterns in subitem responses. The results show that the relatively high levels of satisficing (up to 30%) negatively biased composite scores. The study highlights the importance of considering response time data and subitem response patterns when evaluating the validity of scores from the student questionnaire. (A simplified, rule-based sketch of such a check follows this entry.)

    In conclusion, response process data from international large-scale assessments can provide valuable insights into test-takers’ motivation, problem-solving strategies, and questionnaire validity.

    Download full text (pdf): fulltext
    Download (pdf): spikblad (defence announcement sheet)
    Download (png): presentationsbild (presentation image)
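The joint use of response times and subitem response patterns described in Study IV can be illustrated with a deliberately simplified, rule-based check. The study itself uses a Bayesian finite mixture model; the column names, the fast-response cutoff, and the toy data below are assumptions made purely for illustration.

```python
# Simplified, rule-based stand-in for the Study IV idea: flag likely satisficing
# on a multi-subitem questionnaire item by combining (a) an implausibly fast
# response time with (b) an invariant response pattern across subitems
# (straight-lining). Field names and the 5-second cutoff are hypothetical.
import pandas as pd

def flag_satisficing(df: pd.DataFrame, subitem_cols: list,
                     rt_col: str = "response_time_s",
                     fast_cutoff_s: float = 5.0) -> pd.Series:
    """Return True for rows that straight-line all subitems and answer very fast."""
    straight_lined = df[subitem_cols].nunique(axis=1) == 1  # identical answer on every subitem
    very_fast = df[rt_col] < fast_cutoff_s                  # far below a plausible reading time
    return straight_lined & very_fast

# Toy data: respondent 0 straight-lines in about 3 seconds, respondent 1 answers normally.
toy = pd.DataFrame({
    "q1a": [3, 2], "q1b": [3, 4], "q1c": [3, 1],
    "response_time_s": [3.2, 41.0],
})
print(flag_satisficing(toy, ["q1a", "q1b", "q1c"]))  # -> True, False
```

Composite scores could then be recomputed with flagged responses excluded or down-weighted, which is the kind of bias check the study motivates.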
  • 2.
    Lundgren, Erik
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Department of Educational Measurement.
    Investigating satisficing in PISA 2018 questionnaire items by jointly modeling response times and subitem responses. Manuscript (preprint) (Other academic)
  • 3.
    Lundgren, Erik
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    Latent program modeling: Inferring latent problem-solving strategies from a PISA problem-solving task, 2022. In: Journal of Educational Data Mining, E-ISSN 2157-2100, Vol. 14, no 1, p. 46-80. Article in journal (Refereed)
    Abstract [en]

    Response process data have the potential to provide a rich description of test-takers’ thinking processes. However, retrieving insights from these data presents a challenge for educational assessments and educational data mining as they are complex and not well annotated. The present study addresses this challenge by developing a computational model that simulates how different problem-solving strategies would behave while searching for a solution to a Programme for International Student Assessment (PISA) 2012 problem-solving item, and uses n-gram processing of data together with a naïve Bayesian classifier to infer latent problem-solving strategies from the test-takers’ response process data. The retrieval of simulated strategies improved with increased n-gram length, reaching an accuracy of 0.72 on the original PISA task. Applying the model to generalized versions of the task showed that classification accuracy increased with problem size and the mean number of actions, reaching a classification accuracy of 0.90 for certain task versions. The strategy that was most efficient and effective in the PISA Traffic task evaluated paths based on the labeled travel time. However, in generalized versions of the task, a straight line strategy was more effective. When applying the classifier to empirical data, most test-takers were classified as using a random path strategy (46%). Test-takers classified as using the travel time strategy had the highest probability of solving the task (p̂ ≈ 1). The test-takers classified as using the random actions strategy had the lowest probability of solving the task (p̂ ≈ 0.11). The effect of (classified) strategy on general PISA problem-solving performance was overall weak, except for a negative effect for the random actions strategy (β ≈ −65, CI95% ≈ [−96, −36]). The study contributes a novel approach to inferring latent problem-solving strategies from action sequences. The study also illustrates how simulations can provide valuable information about item design by exploring how changing item properties could affect the accuracy of inferences about unobserved problem-solving strategies. (A minimal, toy illustration of the n-gram classification step follows this entry.)
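As a minimal, toy illustration of the classification step described above (not the paper's actual generative model or the PISA log format), the sketch below encodes action sequences as word n-grams and trains a naive Bayes classifier on sequences labeled by the strategy assumed to have produced them. All action tokens, labels, and sequences are invented for the example.

```python
# Toy n-gram + naive Bayes strategy classifier over action sequences.
# The training sequences stand in for simulated data from a generative model of
# problem-solving strategies; tokens and labels are hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_sequences = [
    "check_time select_edge check_time undo select_edge submit",   # evaluates labeled travel times
    "check_time select_edge check_time select_edge submit",
    "select_edge select_edge undo select_edge select_edge submit", # picks paths more or less at random
    "select_edge undo select_edge select_edge submit",
]
train_labels = ["travel_time", "travel_time", "random_path", "random_path"]

clf = make_pipeline(
    CountVectorizer(analyzer="word", ngram_range=(1, 3)),  # unigram-to-trigram action patterns
    MultinomialNB(),
)
clf.fit(train_sequences, train_labels)

# Classify a (hypothetical) observed test-taker log.
print(clf.predict(["check_time select_edge check_time submit"]))
```

Longer n-grams encode ordered action patterns, which is consistent with the reported improvement in retrieval as n-gram length increased.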

  • 4.
    Lundgren, Erik
    et al.
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Department of Educational Measurement.
    Eklöf, Hanna
    Umeå University, Faculty of Social Sciences, Department of applied educational science, Department of Educational Measurement.
    Questionnaire-Taking Motivation: Using response times to assess motivation to optimize in the PISA student questionnaire. Manuscript (preprint) (Other academic)
  • 5.
    Lundgren, Erik
    et al.
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    Eklöf, Hanna
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    Questionnaire-taking motivation: Using response times to assess motivation to optimize on the PISA 2018 student questionnaire, 2023. In: International Journal of Testing, ISSN 1530-5058, E-ISSN 1532-7574, Vol. 23, no 4, p. 231-256. Article in journal (Refereed)
    Abstract [en]

    This study aims to assess student motivation to provide valid responses to the PISA student questionnaire. This was done by modeling response times using a three-component finite mixture model, comprising two satisficing response styles (rapid and idle) and one optimizing response style. Each participant’s motivation was operationalized as their probability of providing an optimizing response to questionnaire items. Overall, the model offered a good fit to the data. Results indicate that most respondents were motivated to optimize, with a slight decline toward the end. Further, results showed a positive effect of questionnaire-taking motivation on PISA performance, suggesting a positive relationship to test-taking motivation. In conclusion, response times can be valuable indicators for assessing survey response quality and may serve as a proxy for test-taking motivation. (A simplified numerical sketch of the mixture idea follows this entry.)

    Download full text (pdf): fulltext
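The three-component idea can be roughed out with an ordinary Gaussian mixture fitted to log response times, whereas the authors fit a Bayesian finite mixture; the simulated times, component labels, and the use of scikit-learn below are assumptions for illustration only.

```python
# Rough, non-Bayesian sketch: fit a three-component Gaussian mixture to log
# response times, interpret the lowest-mean component as "rapid", the
# highest-mean component as "idle", and the middle one as "optimizing".
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
times = np.concatenate([
    rng.lognormal(mean=3.0, sigma=0.4, size=800),  # deliberate, optimizing-like responses
    rng.lognormal(mean=1.0, sigma=0.3, size=150),  # rapid responses
    rng.lognormal(mean=4.8, sigma=0.3, size=50),   # idle responses (item left open)
])
log_t = np.log(times).reshape(-1, 1)

gm = GaussianMixture(n_components=3, random_state=0).fit(log_t)
optimizing = np.argsort(gm.means_.ravel())[1]           # middle-mean component
p_optimizing = gm.predict_proba(log_t)[:, optimizing]   # per-response probability of optimizing
print(f"Mean probability of an optimizing response: {p_optimizing.mean():.2f}")
# Per respondent, motivation would be this probability averaged over their items.
```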
  • 6.
    Lundgren, Erik
    et al.
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    Eklöf, Hanna
    Umeå University, Faculty of Social Sciences, Department of applied educational science.
    Within-item response processes as indicators of test-taking effort and motivation, 2020. In: Educational Research and Evaluation, ISSN 1380-3611, E-ISSN 1744-4187, Vol. 26, no 5-6. Article in journal (Refereed)
    Abstract [en]

    The present study used process data from a computer-based problem-solving task as indicators of the behavioural level of test-taking effort, and explored how behavioural item-level effort related to overall test performance and self-reported effort. Variables were extracted from raw process data and clustered. Four distinct clusters were obtained and characterised as high effort, medium effort, low effort, and planner. Regression modelling indicated that among students who failed to solve the task, level of effort invested before giving up positively predicted overall test performance. Among students who solved the task, level of effort was instead weakly negatively related to test performance. A low level of behavioural effort before giving up the task was also related to lower self-reported effort. Results suggest that effort invested before giving up provides information about test-takers’ motivation to spend effort on the test. We conclude that process data could augment existing methods of assessing test-taking effort. (A toy sketch of the variable-extraction-and-clustering step follows this entry.)

    Download full text (pdf): fulltext
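The variable-extraction-and-clustering step can be sketched as follows, assuming a hypothetical log layout (one timestamped row per logged action) and simulated data; the actual PISA log format, feature set, and clustering choices in the study differ, and k = 4 here merely mirrors the four clusters reported.

```python
# Illustrative pipeline: aggregate raw log events into per-student effort
# indicators (time on task, number of actions), standardize them, and cluster
# them into four groups. The log schema and data are simulated assumptions.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
rows = []
for sid in range(200):                                   # 200 simulated students
    n_events = int(rng.integers(2, 30))                  # number of logged actions
    t = np.sort(rng.uniform(0, 300, size=n_events))      # action timestamps in seconds
    rows.append(pd.DataFrame({"student_id": sid, "timestamp_s": t}))
log = pd.concat(rows, ignore_index=True)

# Extract simple behavioural-effort indicators per student.
features = log.groupby("student_id")["timestamp_s"].agg(
    time_on_task=lambda t: t.max() - t.min(),
    n_actions="count",
)

X = StandardScaler().fit_transform(features)
features["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(features.groupby("cluster").mean())  # inspect cluster profiles (low vs. high effort, etc.)
```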