The Characteristics of Test Based on Nature of Science (NOS) in Newton Law Using Rasch Model Analysis

Pratiwi Restu Murti, Nonoh Siti Aminah, Harjana Harjana

Abstract


A Nature of Science (NOS)-based instrument is used to measure high school students' science literacy on the subject of Newton's Laws. The NOS-based instrument consists of 25 multiple-choice items with four alternative answers each, modified from the Nature of Science Literacy Test (NOSLiT). The objective of this research is to determine the characteristics of each test item of the NOS-based instrument, analysed in terms of validity and reliability using the Rasch model (RM). Data were collected by administering a test with the NOS-based instrument. This research is categorised as descriptive quantitative, using RM statistics obtained through the QUEST program. A total of 104 students participated as the subjects of this research. The results of the item validity analysis using the RM show that all 25 test items are considered fit, or accepted. Based on the item reliability estimate, the NOS-based instrument has a reliability coefficient of 0.96. Based on item difficulty, two test items are declared not good, namely items 19 and 4. Based on the analysis according to Classical Test Theory and the RM, the NOS-based instrument can be used to measure senior high school students' science literacy on Newton's Laws.
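For context, the dichotomous Rasch model referred to in the abstract relates the probability of a correct answer to the difference between a student's ability and an item's difficulty. The formulation below is the standard form of the model; the symbols \theta_n (ability of person n) and b_i (difficulty of item i) are introduced here only for illustration and are not defined in the original abstract:

P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}

The item difficulty estimates, the item fit statistics on which the fit/accepted decisions rest, and the item reliability coefficient of 0.96 reported above are all typically obtained from a single calibration of this model, in this study carried out with the QUEST program.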


Keywords

Rasch, RM, validity, reliability

DOI: http://dx.doi.org/10.15548/jt.v26i3.493





Copyright (c) 2020 Al-Ta'lim Journal

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Al-Ta’lim Journal is published by the Faculty of Islamic Education and Teacher Training, UIN Imam Bonjol Padang.

Copyright © Al-Ta'lim Online Journal, Print ISSN 1410-7546, Online ISSN 2355-7893. All rights reserved.

