Published in Vol 9 (2023)
Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/44084.
Scoring Single-Response Multiple-Choice Items: Scoping Review and Comparison of Different Scoring Methods
Cited by (Journals)
- Su M, Lin L, Lin L, Chen Y. Assessing question characteristic influences on ChatGPT's performance and response-explanation consistency: Insights from Taiwan's Nursing Licensing Exam. International Journal of Nursing Studies 2024;153:104717
- Rezigalla A, Eleragi A, Elhussein A, Alfaifi J, ALGhamdi M, Al Ameer A, Yahia A, Mohammed O, Adam M. Item analysis: the impact of distractor efficiency on the difficulty index and discrimination power of multiple-choice items. BMC Medical Education 2024;24(1)
- Rössler L, Herrmann M, Wiegand A, Kanzow P. Usage of Multiple-Choice Items in Summative Examinations: Questionnaire Survey Among German Undergraduate Dental Training Programmes (Preprint). JMIR Medical Education 2024