Integration of item response theory in the development of PhET-based graphing lines worksheets for optimizing student algebra competence

Authors

  • Giyanti, Universitas Serang Raya
  • Indri Lestari, Universitas Serang Raya
  • Rina Oktaviyanthi, Universitas Serang Raya

DOI:

https://doi.org/10.29408/jel.v11i1.27634

Keywords:

Algebra competence, graphing lines, item response theory, parameter logistic, PhET interactive simulation

Abstract

This study develops and evaluates a graphing lines worksheet based on a PhET Interactive Simulation and integrated with Item Response Theory (IRT) methods to enhance student algebra competence. Administered to 120 students, the worksheet comprises 12 items measuring four key indicators: understanding the geometric significance of line slopes, constructing line equations, graphing from line equations, and predicting the effects of variable changes. The 2-Parameter Logistic (2PL) IRT model was employed to estimate item difficulty and student ability on the logit scale. The results indicate that the worksheet is effective in improving student algebra competence: Items 1 and 11 demonstrate a good balance between difficulty and discrimination, Item 2 requires further review because of its high difficulty, and Item 12 is too easy. Heatmap analysis and Item Characteristic Curves (ICCs) revealed variations in student response patterns, confirming the test's capacity to evaluate diverse levels of student ability. Integrating interactive simulation with IRT thus proves an effective instructional design strategy that supports adaptive and personalized learning.
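
The 2PL model named in the abstract has a standard textbook form, sketched below in LaTeX for orientation; the paper's exact parameterization (for instance, whether a scaling constant appears in the exponent) is not restated on this page.

  % Standard 2PL item characteristic curve: the probability that a
  % student with ability theta answers item i correctly.
  \begin{equation}
    P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
  \end{equation}
  % a_i : item discrimination (slope of the curve at theta = b_i)
  % b_i : item difficulty in logits (the ability at which P_i = 0.5)

Read against this form, the abstract's findings map directly onto the parameters: Item 2's high difficulty corresponds to a large b_i, Item 12's easiness to a strongly negative b_i, and the "good balance" of Items 1 and 11 to moderate b_i paired with adequately high a_i.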

Author Biographies

Giyanti, Universitas Serang Raya

Department of Mathematics Education

Indri Lestari, Universitas Serang Raya

Department of Mathematics Education

Rina Oktaviyanthi, Universitas Serang Raya

Department of Mathematics Education

Published

01-02-2025

How to Cite

Giyanti, Lestari, I., & Oktaviyanthi, R. (2025). Integration of item response theory in the development of PhET-based graphing lines worksheets for optimizing student algebra competence. Jurnal Elemen, 11(1), 153–170. https://doi.org/10.29408/jel.v11i1.27634

Issue

Vol. 11 No. 1 (2025)

Section

Articles
