Rasch scaling procedures for informing development of a valid Fetal Surveillance Education Program multiple-choice assessment
Author: Zoanetti, N; Griffin, P; Beaves, M; Wallace, EM
Date: 2009-04-29
Source Title: BMC Medical Education
Publisher: BMC
Affiliation: Melbourne Graduate School of Education
Document Type: Journal Article
Citation: Zoanetti, N., Griffin, P., Beaves, M. & Wallace, E. M. (2009). Rasch scaling procedures for informing development of a valid Fetal Surveillance Education Program multiple-choice assessment. BMC Medical Education, 9 (1). https://doi.org/10.1186/1472-6920-9-20
Access Status: Open Access at PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2685791

Abstract
BACKGROUND: It is widely recognised that deficiencies in fetal surveillance practice continue to contribute significantly to the burden of adverse outcomes. This has prompted the development of evidence-based clinical practice guidelines by the Royal Australian and New Zealand College of Obstetricians and Gynaecologists, and an associated Fetal Surveillance Education Program to deliver that learning. This article describes initial steps in the validation of a corresponding multiple-choice assessment of the relevant educational outcomes through a combination of item response modelling and expert judgement.

METHODS: The Rasch item response model was employed for item and test analysis and to empirically derive the substantive interpretation of the assessment variable. This interpretation was then compared to the hierarchy of competencies specified a priori by a team of eight subject-matter experts. Classical Test Theory analyses were also conducted.

RESULTS: A high level of agreement between the hypothesised and derived variable provided evidence of construct validity. Item and test indices from the Rasch and Classical Test Theory analyses suggested that the current test form was of moderate quality. However, the analyses made clear the steps required to establish a valid assessment of sufficient psychometric quality: increasing the number of items from 40 to 50 in the first instance, reviewing ineffective items, targeting new items to specific content and difficulty gaps, and formalising the assessment blueprint in light of empirical information relating item structure to item difficulty.

CONCLUSION: The application of the Rasch model for criterion-referenced assessment validation with an expert stakeholder group is described, and recommendations for subsequent item and test construction are outlined.
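For context, the Rasch model referenced in the abstract expresses the probability of a correct response as a logistic function of the difference between a person's ability (theta) and an item's difficulty (b); items and persons are thereby placed on a common scale, which is what permits the empirical ordering of item difficulties compared against the experts' hypothesised hierarchy. A minimal sketch of this relationship (variable names are illustrative, not drawn from the paper):

```python
import math

def rasch_prob(theta: float, b: float) -> float:
    """Probability of a correct response under the dichotomous Rasch model.

    theta: person ability (logits)
    b:     item difficulty (logits)
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the probability of success is exactly 0.5;
# easier items (lower b) yield higher success probabilities for the same person.
p_matched = rasch_prob(0.0, 0.0)   # ability == difficulty -> 0.5
p_easy    = rasch_prob(0.0, -1.0)  # easier item -> higher probability
p_hard    = rasch_prob(0.0, 1.0)   # harder item -> lower probability
```

In practice, difficulty and ability parameters are estimated jointly from the response matrix (e.g. by conditional or joint maximum likelihood) rather than assumed, as in the item analysis the abstract describes.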
Keywords: Curriculum and Pedagogy