Comparison of the Accuracy of Item Response Theory Models in Estimating Student’s Ability

Ilham Falani(1*), Makruf Akbar(2), Dali Santun Naga(3)

(1) Universitas Negeri Jakarta
(2) Universitas Negeri Jakarta
(3) Universitas Negeri Jakarta
(*) Corresponding Author




DOI: https://doi.org/10.26858/est.v6i2.13295

Abstract


This study aims to determine which item response theory model more accurately estimates students' mathematical ability. The models compared are the Multiple Choice Model and the Three-Parameter Logistic Model. The data are the responses of 1,704 eighth-grade junior high school students from six schools in Depok City, West Java, to a mathematics test consisting of 30 multiple-choice items. Sampling was done using a purposive random sampling technique. After the data were obtained, the research hypotheses were tested with a variance test (F-test) to determine which model estimates the ability parameter more accurately. The results show an F-value of 1.089 against an F-table value of 1.087; since the F-value exceeds the F-table value, H0 is rejected. This means the Multiple Choice Model is more accurate than the Three-Parameter Logistic Model in estimating students' mathematical ability parameters. The Multiple Choice Model is therefore the recommended model for estimating ability from multiple-choice format tests, especially in mathematics and other fields with similar characteristics.
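The variance-comparison (F-test) procedure described in the abstract can be sketched in Python. This is a minimal illustration, not the study's actual analysis: the two lists of ability (theta) estimates and the critical value are hypothetical placeholders, and it assumes the F statistic is formed as the ratio of the larger to the smaller sample variance.

```python
import statistics

# Hypothetical ability (theta) estimates for the same examinees under the
# two models being compared. Illustrative values only, not the study's data.
theta_mcm = [0.8, -0.3, 1.2, 0.1, -1.0, 0.5, -0.7, 1.5, 0.0, -0.4]
theta_3pl = [0.9, -0.5, 1.4, 0.3, -1.3, 0.7, -0.9, 1.8, 0.2, -0.6]

# Sample variances of the two sets of estimates.
var_mcm = statistics.variance(theta_mcm)
var_3pl = statistics.variance(theta_3pl)

# F statistic: ratio of the larger variance to the smaller one (so F >= 1).
f_value = max(var_mcm, var_3pl) / min(var_mcm, var_3pl)

def reject_h0(f_value: float, f_critical: float) -> bool:
    """Reject H0 (equal accuracy) when the F statistic exceeds the
    critical value F(alpha; n1 - 1, n2 - 1) taken from an F table."""
    return f_value > f_critical

print(round(f_value, 3))
```

With the study's reported values, `reject_h0(1.089, 1.087)` returns `True`, matching the paper's conclusion that H0 is rejected.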

Keywords


Ability estimation, accuracy, item response theory


References


Abadyo. (2014). Estimasi Parameter Kemampuan dan Butir Tes Matematika dengan Menggunakan Kombinasi 3PLM/GRM dan MCM/GPCM. Universitas Negeri Yogyakarta.

Abadyo, & Bastari. (2015). Estimation of Ability and Item Parameters in Mathematics Testing By Using The Combination of 3PLM/GRM and MCM/GPCM Scoring Model. Research and Evaluation in Education Journal, 1(1), 55–72.

Almquist, B., Ashir, S., & Brännström, L. (2020). A guide to quantitative methods. Sweden: Stockholm University.

An, X., & Yung, Y. (2014). Item Response Theory: What It Is and How You Can Use the IRT Procedure to Apply It. USA: SAS Institute.

Baker, F. B., & Kim, S. (2017). The Basics of Item Response Theory Using R. New York: Springer International Publishing.

Bastari. (2015). Comparison of IRT Models that Handle Dichotomous and Polytomous Response Data Simultaneously. Unpublished manuscript.

Borsboom, D. (2017). Educational Measurement. Structural Equation Modeling: A Multidisciplinary Journal. https://doi.org/10.1080/10705510903206097

Van der Linden, W. J., & Hambleton, R. (2016). Handbook of Item Response Theory (Vol. 2). New York: Springer.

Creswell, J. W., & Lecompte, M. D. (2012). Handbook on Measurement, Assessment, and Evaluation in Higher Education. New York: Taylor & Francis.

De Ayala, R. J. (2013). The theory and practice of item response theory. New York: Guilford Publications.

Demars, C. E. (2010). Item Response Theory: Understanding Statistics Measurement. New York: Oxford University Press.

Grunert, M. L., Raker, R., Murphy, K. L., & Holme, T. A. (2013). Polytomous versus Dichotomous Scoring on Multiple-Choice Examinations: Development of a Rubric for Rating Partial Credit. Journal of Chemical Education.

Ioannis, K. (2017). Calibration and Validation of Instruments measuring Academic Ability in Physics using Item Response Theory. University of Ioannina.

Kadir. (2017). Statistika Terapan (3rd ed.). Jakarta: Rajawali Press.

Kastner, M., & Stangl, B. (2011). Multiple Choice and Constructed Response Tests: Do Test Format and Scoring Matter? Procedia - Social and Behavioral Sciences, 12, 263–273. https://doi.org/10.1016/j.sbspro.2011.02.035

Kim, J., Madison, W., Hanson, B. A., & Mcgraw-hill, C. T. B. (2012). Test Equating Under the Multiple-Choice Model. Applied Psychological Measurement, 26(3), 255–270.

Naga, D. S. (2012). Teori Sekor pada Pengukuran Mental. Jakarta: PT Nagarani Citrayasa.

Nering, M. L., & Ostini, R. (2011). Handbook of Polytomous Item Response Theory Models. New York: Taylor & Francis.

Nurcahyo, F. A. (2016). Aplikasi IRT dalam Analisis Aitem Tes Kognitif. Buletin Psikologi, 24(2), 64–75. https://doi.org/10.22146/buletinpsikologi.25218

Price, L. R. (2017). Psychometric Methods Theory into Practice. New York: The Guilford Press.

Retnawati, H. (2011). Mengestimasi Kemampuan Peserta Tes Uraian Matematika dengan Penskoran Politomus. Prosiding Semnas Matematika UNY, Mei 2011, 53–62.

Retnawati, H. (2014). Teori Respon Butir dan Penerapannya (Untuk Peneliti, Praktisi Pengukuran dan Pengujian, Mahasiswa Pascasarjana) (1st ed.). Yogyakarta: Mulia Medika. Retrieved from http://staff.uny.ac.id/sites/default/files/pendidikan/heri-retnawati-dr/teori-respons-butir-dan-penerapanya-135hal.pdf

Sudaryono. (2011). Implementasi Teori Responsi Butir (Item Response Theory) Pada Penilaian Hasil Belajar Akhir di Sekolah. Jurnal Pendidikan Dan Kebudayaan, 17(6), 719. https://doi.org/10.24832/jpnk.v17i6.62

Suh, Y., & Bolt, D. (2010a). Nested Logit Models for Multiple-Choice Item Response Data. Psychometrika, 75(3), 454–473. https://doi.org/10.1007/s11336-010-9163-7


Thissen, D., & Steinberg, L. (1984). A response model for multiple choice items. Psychometrika, 49(4), 501–519. https://doi.org/10.1007/BF02302588

du Toit, M. (2013). IRT from SSI: BILOG-MG, MULTILOG, PARSCALE, TESTFACT. United States of America: Scientific Software International, Inc.

Van Der Linden, W. J. (2010). Item response theory. International Encyclopedia of Education, 4, 81–88. https://doi.org/10.1016/B978-0-08-044894-7.00250-5

Yılmaz, H. B. (2019). A Comparison of IRT Model Combinations for Assessing Fit in a Mixed Format Elementary School Science Test. International Electronic Journal of Elementary Education, 11(5), 539–545. https://doi.org/10.26822/iejee.2019553350

Zanon, C., Hutz, C. S., Yoo, H. H., & Hambleton, R. K. (2016). An application of item response theory to psychological test development. Psicologia: Reflexão e Crítica. https://doi.org/10.1186/s41155-016-0040-x





Copyright (c) 2020 Ilham Falani

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Editorial Office

Journal of Educational Science and Technology
Graduate Program Universitas Negeri Makassar

   
