Wiberg, Marie
Publications (10 of 82)
Lindvall, J., Helenius, O. & Wiberg, M. (2018). Critical features of professional development programs: Comparing content focus and impact of two large-scale programs. Teaching and Teacher Education: An International Journal of Research and Studies, 70, 121-131
Critical features of professional development programs: Comparing content focus and impact of two large-scale programs
2018 (English) In: Teaching and Teacher Education: An International Journal of Research and Studies, ISSN 0742-051X, E-ISSN 1879-2480, Vol. 70, p. 121-131. Article in journal (Refereed). Published
Abstract [en]

By comparing two large-scale professional development programs' content and impact on student achievement, we contribute to research on critical features of high-quality professional development, especially content focus. Even though the programs are conducted in the same context and are highly similar if characterized according to established research frameworks, our results suggest that they differ in their impact on student achievement. We therefore develop an analytical framework that allows us to characterize the programs' content and delivery in detail. Through this approach, we identify important differences between the programs that provide explanatory value in discussing reasons for their differing impacts.

Place, publisher, year, edition, pages
Pergamon-Elsevier Science Ltd, 2018
Keywords
Content focus, Student achievement, Teacher professional development
National Category
Pedagogy
Identifiers
urn:nbn:se:umu:diva-144940 (URN), 10.1016/j.tate.2017.11.013 (DOI), 000423641900012 ()
Available from: 2018-02-23 Created: 2018-02-23 Last updated: 2018-06-09. Bibliographically approved
Leôncio, W. & Wiberg, M. (2018). Evaluating equating transformations from different frameworks. In: Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, Dylan Molenaar (Eds.), Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017 (pp. 101-110). Cham, Switzerland: Springer
Evaluating equating transformations from different frameworks
2018 (English) In: Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017 / [ed] Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, Dylan Molenaar, Cham, Switzerland: Springer, 2018, p. 101-110. Chapter in book (Refereed)
Abstract [en]

Test equating is used to ensure that test scores from different test forms can be used interchangeably. This paper aims to compare the statistical and computational properties of three equating frameworks: item response theory observed-score equating (IRTOSE), kernel equating, and kernel IRTOSE. The real data applications suggest that IRT-based frameworks tend to provide more stable and accurate results than kernel equating. Nonetheless, kernel equating can provide satisfactory results if we can find a good model for the data, while also being much faster than the IRT-based frameworks. Our general recommendation is to try all methods and examine how much the equated scores change, always ensuring that the assumptions are met and that a good model for the data can be found.
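The chapter's recommendation to examine how much the equated scores change across frameworks can be illustrated with a minimal sketch; the equated-score vectors below are hypothetical placeholders (not results from the chapter), standing in for, e.g., IRT observed-score and kernel equating output for the same test form.

# Hypothetical comparison of two equating transformations over a 0-40
# raw-score scale: report the largest disagreement in equated scores.
import numpy as np

scores = np.arange(0, 41)                    # raw-score points on form X
eq_irt = scores + 1.2 - 0.010 * scores       # placeholder IRT-based equated scores
eq_kernel = scores + 1.0 - 0.005 * scores    # placeholder kernel-equated scores

diff = eq_irt - eq_kernel
worst = np.abs(diff).argmax()
print(f"max |difference| = {abs(diff[worst]):.3f} at raw score {scores[worst]}")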

Place, publisher, year, edition, pages
Cham, Switzerland: Springer, 2018
Series
Springer Proceedings in Mathematics & Statistics, ISSN 2194-1009, E-ISSN 2194-1017 ; 233
Keywords
Test equating, item response theory, kernel equating, observed-score equating
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-147092 (URN), 10.1007/978-3-319-77249-3 (DOI), 978-3-319-77248-6 (ISBN), 978-3-319-77249-3 (ISBN)
Available from: 2018-04-26 Created: 2018-04-26 Last updated: 2018-06-09
Wallin, G., Häggström, J. & Wiberg, M. (2018). How to select the bandwidth in kernel equating — An evaluation of five different methods. In: Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, Dylan Molenaar (Eds.), Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017 (pp. 91-100). Cham, Switzerland: Springer
How to select the bandwidth in kernel equating — An evaluation of five different methods
2018 (English) In: Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017 / [ed] Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, Dylan Molenaar, Cham, Switzerland: Springer, 2018, p. 91-100. Chapter in book (Refereed)
Abstract [en]

When using kernel equating to equate two test forms, a bandwidth needs to be selected. The bandwidth parameter determines the smoothness of the continuized score distributions and has been shown to have a large effect on the kernel density estimate. A number of criteria have been suggested for selecting the bandwidth, and four of them have so far been implemented in kernel equating. In this paper, all four existing bandwidth selectors suggested for kernel equating are evaluated and compared against each other, and against a new criterion that implements leave-one-out cross-validation, using real test data. Although the bandwidth methods were generally similar in terms of equated scores, there were potentially important differences in the upper part of the score scale, where critical admission decisions are typically made.
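For context on where the bandwidth enters, the Gaussian kernel continuization commonly used in kernel equating can be written as follows (standard notation assumed here, not taken from the chapter):

$$
\hat{F}_{h_X}(x) \;=\; \sum_{j} r_j \,\Phi\!\left(\frac{x - a_X x_j - (1 - a_X)\mu_X}{a_X h_X}\right),
\qquad
a_X \;=\; \sqrt{\frac{\sigma_X^2}{\sigma_X^2 + h_X^2}},
$$

where the $x_j$ are the possible scores on form X with (presmoothed) probabilities $r_j$, $\mu_X$ and $\sigma_X^2$ are the mean and variance of X, $\Phi$ is the standard normal distribution function, and $h_X$ is the bandwidth; a larger $h_X$ gives a smoother continuized distribution.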

Place, publisher, year, edition, pages
Cham, Switzerland: Springer, 2018
Series
Springer Proceedings in Mathematics & Statistics, ISSN 2194-1009, E-ISSN 2194-1017 ; 233
Keywords
Kernel equating, Continuization, Bandwidth selection, Cross-validation
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-147091 (URN), 10.1007/978-3-319-77249-3 (DOI), 978-3-319-77248-6 (ISBN), 978-3-319-77249-3 (ISBN)
Available from: 2018-04-26 Created: 2018-04-26 Last updated: 2018-06-09
Wiberg, M., Ramsay, J. O. & Li, J. (2018). Optimal Scores as an Alternative to Sum Scores. In: Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, & Dylan Molenaar (Eds.), Quantitative Psychology: The 82nd Annual Meeting of the Psychometric Society, Zurich, Switzerland, 2017 (pp. 1-10). Cham, Switzerland: Springer
Optimal Scores as an Alternative to Sum Scores
2018 (English) In: Quantitative Psychology: The 82nd Annual Meeting of the Psychometric Society, Zurich, Switzerland, 2017 / [ed] Marie Wiberg, Steven Culpepper, Rianne Janssen, Jorge González, & Dylan Molenaar, Cham, Switzerland: Springer, 2018, p. 1-10. Chapter in book (Refereed)
Abstract [en]

This paper discusses the use of optimal scores as an alternative to sum scores and expected sum scores when analyzing test data. Optimal scores are built on nonparametric methods and use the interaction between the test takers' responses on each item and the impact of the corresponding items on the estimate of their performance. Both theoretical arguments for optimal scores and arguments built on simulation results are given. The paper claims that, in order to achieve the same accuracy in terms of mean squared error and root mean squared error, an optimally scored test needs substantially fewer items than a sum-scored test. The top-performing test takers and the bottom 5% of test takers are by far the groups that benefit most from using optimal scores.
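The basic contrast with sum scoring can be illustrated with a deliberately simple, hypothetical sketch. This is not the nonparametric optimal-scoring procedure of the paper; it only shows, with assumed item weights, how a score that uses item-level information can separate response patterns that share the same sum score.

# Hypothetical illustration (not the paper's method): a plain sum score
# versus a simple item-weighted score. Two response patterns with the
# same sum score receive different weighted scores.
import numpy as np

item_weights = np.array([0.5, 1.0, 1.5, 2.0])  # assumed item weights
pattern_a = np.array([1, 1, 0, 0])             # correct on the low-weight items
pattern_b = np.array([0, 0, 1, 1])             # correct on the high-weight items

for name, resp in [("A", pattern_a), ("B", pattern_b)]:
    print(f"pattern {name}: sum score = {resp.sum()}, "
          f"weighted score = {resp @ item_weights:.2f}")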

Place, publisher, year, edition, pages
Cham, Switzerland: Springer, 2018
Series
Springer Proceedings in Mathematics & Statistics, ISSN 2194-1009, E-ISSN 2194-1017 ; 233
Keywords
Optimal scoring, Item impact, Sum scores, Expected sum scores
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-147090 (URN), 10.1007/978-3-319-77249-3 (DOI), 978-3-319-77248-6 (ISBN), 978-3-319-77249-3 (ISBN)
Available from: 2018-04-26 Created: 2018-04-26 Last updated: 2018-06-09
Wiberg, M., Culpepper, S., Janssen, R., González, J. & Molenaar, D. (Eds.). (2018). Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017. Cham, Switzerland: Springer
Quantitative Psychology: The 82nd annual meeting of the psychometric society, Zurich, Switzerland, 2017
2018 (English). Conference proceedings (editor) (Refereed)
Place, publisher, year, edition, pages
Cham, Switzerland: Springer, 2018
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-147088 (URN), 10.1007/978-3-319-77249-3 (DOI), 978-3-319-77248-6 (ISBN), 978-3-319-77249-3 (ISBN)
Available from: 2018-04-26 Created: 2018-04-26 Last updated: 2018-06-09
Sansivieri, V., Wiberg, M. & Matteucci, M. (2017). A review of test equating methods with a special focus on IRT-based approaches. Statistica, 77(4), 329-352
A review of test equating methods with a special focus on IRT-based approaches
2017 (English) In: Statistica, ISSN 1973-2201, Vol. 77, no. 4, p. 329-352. Article, review/survey (Refereed). Published
Keywords
item-response theory, test equating
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-147094 (URN)
Available from: 2018-04-26 Created: 2018-04-26 Last updated: 2018-06-09. Bibliographically approved
Ramsay, J. O. & Wiberg, M. (2017). A Strategy for Replacing Sum Scoring. Journal of Educational and Behavioral Statistics, 42(3), 282-307
A Strategy for Replacing Sum Scoring
2017 (English) In: Journal of Educational and Behavioral Statistics, ISSN 1076-9986, E-ISSN 1935-1054, Vol. 42, no. 3, p. 282-307. Article in journal (Refereed). Published
Abstract [en]

This article promotes the use of modern test theory in testing situations where sum scores for binary responses are now used. It directly compares the efficiencies and biases of classical and modern test analyses and finds an improvement in the root mean squared error of ability estimates of about 5% for two designed multiple-choice tests and about 12% for a classroom test. A new parametric density function for ability estimates, the tilted scaled distribution, is used to resolve the nonidentifiability of the univariate test theory model. Item characteristic curves (ICCs) are represented as basis function expansions of their log-odds transforms. A parameter cascading method along with roughness penalties is used to estimate the corresponding log odds of the ICCs and is demonstrated to be sufficiently computationally efficient that it can support the analysis of large data sets.
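The basis-expansion representation mentioned in the abstract can be written generically as follows (the symbols $W_i$, $c_{ik}$ and $\phi_k$ are notational assumptions, and the specific basis, e.g. B-splines, is not fixed here):

$$
W_i(\theta) \;=\; \log\frac{P_i(\theta)}{1 - P_i(\theta)} \;=\; \sum_{k=1}^{K} c_{ik}\,\phi_k(\theta),
$$

where $P_i(\theta)$ is the ICC of item $i$ as a function of ability $\theta$, the $\phi_k$ are basis functions, and the coefficients $c_{ik}$ are estimated by parameter cascading with a roughness penalty on $W_i$.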

Place, publisher, year, edition, pages
Sage Publications, 2017
Keywords
parameter cascading, item characteristic curves, tilted scaled distribution, sum score distribution, performance manifold
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:umu:diva-136047 (URN), 10.3102/1076998616680841 (DOI), 000401153600003 ()
Available from: 2017-06-22 Created: 2017-06-22 Last updated: 2018-06-09. Bibliographically approved
González, J. & Wiberg, M. (2017). Applying test equating methods using R. Springer
Applying test equating methods using R
2017 (English). Book (Refereed)
Abstract [en]

This book describes how to use test equating methods in practice. The non-commercial software R is used throughout the book to illustrate how to perform different equating methods when score data are collected under different data collection designs, such as the equivalent groups design, single group design, counterbalanced design, and non-equivalent groups with anchor test design. The R packages equate, kequate and SNSequate, among others, are used to illustrate the different methods in practice, while simulated and real data sets show how the methods are carried out in R. The book covers traditional equating methods, including mean and linear equating, frequency estimation equating, and chain equating, as well as modern equating methods such as kernel equating, local equating, and combinations of these. It also offers chapters on observed- and true-score item response theory equating and discusses recent developments within the equating field. More specifically, it covers the inclusion of covariates in the equating process, the use of different kernels and ways of selecting bandwidths in kernel equating, and Bayesian nonparametric estimation of equating functions. It also illustrates how to evaluate equating in practice using simulation and equating-specific measures such as the standard error of equating, percent relative error, and the difference that matters.
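As a point of reference for the traditional methods listed above, the textbook forms of the mean and linear equating transformations from a form X to a form Y are (notation assumed here, not reproduced from the book):

$$
\varphi_{\text{mean}}(x) \;=\; x - \mu_X + \mu_Y,
\qquad
\varphi_{\text{linear}}(x) \;=\; \mu_Y + \frac{\sigma_Y}{\sigma_X}\,(x - \mu_X),
$$

where $\mu_X, \sigma_X$ and $\mu_Y, \sigma_Y$ are the means and standard deviations of the scores on forms X and Y.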

Place, publisher, year, edition, pages
Springer, 2017. p. 196
Series
Methodology of Educational Measurement and Assessment, ISSN 2367-170X, E-ISSN 2367-1718
Keywords
Test equating using R, Equating data collection designs, Presmoothing score distributions, Polynomial log-linear models for presmoothing, Traditional equating methods, Kernel equating using R, Bandwidth selection in kernel equating, IRT equating using R, Item parameter linking, Local equating using R, IRT kernel equating, Assessment of equating, Kernel equating under the NEC design, Bayesian equating, Equating using R, R code for equating, Concurrent calibration, Fixed item parameter calibration, Comparison of equating methods, Equating with covariates
National Category
Probability Theory and Statistics
Research subject
Statistics
Identifiers
urn:nbn:se:umu:diva-132490 (URN), 10.1007/978-3-319-51824-4 (DOI), 978-3-319-51822-0 (ISBN), 978-3-319-51824-4 (ISBN)
Available from: 2017-03-15 Created: 2017-03-15 Last updated: 2018-06-09. Bibliographically approved
Wiberg, B., Sircova, A., Wiberg, M. & Carelli, M. G. (2017). Balanced time perspective: developing empirical profile and exploring its stability over time. In: Aleksandra Kostić, Derek Chadee (Eds.), Time perspective: theory and practice (pp. 63-95). London: Palgrave Macmillan
Balanced time perspective: developing empirical profile and exploring its stability over time
2017 (English) In: Time perspective: theory and practice / [ed] Aleksandra Kostić, Derek Chadee, London: Palgrave Macmillan, 2017, p. 63-95. Chapter in book (Refereed)
Abstract [en]

Balanced time perspective (BTP) is characterized by flexible switching between a person's past, present and future time orientations, depending on situational demands, personal resources, experiences, and social evaluations. The present study aimed to explore the psychological characteristics of people with a BTP profile and attain a deeper understanding of the BTP construct. Seven people with BTP profiles were investigated using in-depth interviews, self-report instruments, and a projective test. By testing the participants on two occasions within an 18-month interval, we investigated the stability of BTP. Analyses showed that participants were aware of the "now" and had a synchronicity between the present and the past, and also between the present and the future. Results indicated a degree of temporal stability in the BTP profile and that people's interpretations and interactions within the surrounding context of events influence their time perspectives.

Place, publisher, year, edition, pages
London: Palgrave Macmillan, 2017
Keywords
Time perspective, BTP
National Category
Psychology
Research subject
Clinical Psychology; Psychology
Identifiers
urn:nbn:se:umu:diva-142601 (URN), 10.1057/978-1-137-60191-9_4 (DOI), 9781137601902 (ISBN), 9781137601919 (ISBN)
Projects
Time perspective
Funder
Swedish Research Council, 421-2012-650
Available from: 2017-12-05 Created: 2017-12-05 Last updated: 2018-06-09. Bibliographically approved