Optometric Education

The Journal of the Association of Schools and Colleges of Optometry

Optometric Education: Volume 45, Number 2 (Winter-Spring 2020)

PEER REVIEWED

Application of an Online Homework Tool in Optometry for Geometric Optics Improves Exam Performance

Varuna Kumaran, MS, B.Optom, Krishna Kumar, B.Optom, MPhil, PhD, and Naveen Mahesh, B.E.

Abstract

Geometric optics requires strong problem-solving skills that can be improved through practice. Due to time constraints, more practice in the classroom is not typically possible. An online homework application called Kognify (Kognify Assessment and Skill Development, PL, Chennai, Tamil Nadu, India) enables students to perform online “workouts” at their convenience. Thirty-four students used Kognify from July to September 2016 to practice problem-solving skills related to the course Geometric Optics-II (GO-II). This differed from the approach in previous semesters, during which only in-class, on-paper quizzes were used. When Kognify was used, students achieved better scores on mid-semester and comprehensive exams (p<0.02 in both cases) as well as in overall course performance (p=0.005).

Key Words: Kognify, geometric optics, training, online, digital

Background

Online homework has been replacing traditional paper-based homework in many fields, including chemistry, statistics, physics, accounting and mathematics; however, evidence of its impact on exam performance is mixed. While improvements have been observed in many studies,1-24 other studies show little or no improvement.25-32 Regardless, students and faculty have shown a strong preference for online homework systems.1-32 Students receive feedback on their homework performance instantly and automatically as practice problems are completed, which also lowers the burden of evaluation for faculty members.8,15,21

The use of technology in optometric education is becoming more common. Recently, a study compared the use of digital assessments with paper-based tests in geometric optics.33 The creation of an online problem set for optics was also reported, but the report did not discuss its use by students.34 There are no publications to date that describe the effectiveness of online homework systems in the field of optometry.

This research paper describes the use of an online homework system, Kognify (Kognify Assessment and Skill Development, PL, Chennai, Tamil Nadu, India), for training students of optometry in the Geometric Optics-II (GO-II) course. In previous semesters, 15- to 30-minute paper-based quizzes were used to assess GO-II students' knowledge of the subject matter. The quizzes were conducted weekly during the 2.5- to 3-hour class periods, each assessing the concepts covered in the previous class. A teaching assistant evaluated the answer sheets within the following week, and any individual or common mistakes were discussed at a later date. This method was time-consuming but was retained because practice in problem-solving and application of concepts is crucial at this early stage of learning. Students in these previous semesters felt the need for more faculty guidance in solving problems, but extra class hours could not be allotted due to time constraints. To address these concerns, Kognify was employed in 2016.

Kognify is an online homework application.35 Many school systems use Kognify in high school education.35 Faculty create a database of multiple-choice questions in Kognify and tag each question with topics and learning objectives. Students can access Kognify through the Google Chrome or Firefox web browsers or through an Android app on mobile devices. Groups of questions are presented in random order as "workouts" assigned to students on a regular basis. Faculty can customize the number of questions in a workout, the time limit and the number of workouts per week, and they monitor performance in terms of accuracy and response time. A report is generated and shared with each student for every topic. Concepts are then reinforced as needed during class time via student-teacher interactions.
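
For readers who think in code, the faculty-side settings described above might be represented as in the following minimal R sketch. All field names and values here are assumptions chosen for illustration; they are not Kognify's actual data model or API.

    # Illustrative only: a hypothetical workout configuration record.
    # Field names and values are assumptions, not Kognify's actual schema.
    workout_config <- list(
      n_questions       = 8,    # customizable; 5-10 per workout in this study
      time_limit_min    = 30,   # hypothetical value; the study used liberal limits
      workouts_per_week = 3,    # two to five per week in this study
      topic             = "Refraction at spherical surfaces",
      objective         = "Apply vergence relations at a single surface"
    )
    str(workout_config)         # inspect the settings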

We tested whether Kognify would improve exam performance compared with employing regular weekly quizzes. This paper reports data that support the efficacy of Kognify in improving exam performance scores in the GO-II course in optometry.

Methods

The study was conducted at the Elite School of Optometry, Chennai, Tamil Nadu, India, which is affiliated with the Birla Institute of Technology and Science, Pilani, Rajasthan, India. The study compared two groups of students. The first group took GO-II from July to December 2015 (CO2015). CO2015 consisted of 31 students (10 males and 21 females), ages 17-19 years as of January 2015 (mean age ±SD = 18.08 ±0.41 years). The second group took GO-II from July to December 2016 (CO2016). CO2016 consisted of 34 students (9 males and 25 females), ages 17-19 years as of January 2016 (mean age ±SD = 17.98 ±0.5 years). Note that South Asian Indian optometry students are younger than their North American counterparts because optometric education is an undergraduate degree program in India. However, the syllabi for geometric optics courses do not substantially differ between the two types of programs. The syllabi for the Geometric Optics-I (GO-I) and GO-II courses taught at the Elite School of Optometry are shown in Appendix A.

Both CO2015 and CO2016 had the same syllabus, and the same faculty members taught the theory classes for both the GO-I and GO-II courses. Both classes went through the university-mandated continuous assessment process during the semester, which consisted of three evaluation components (EC1, EC2 and EC3), a comprehensive exam and a practical exam. Table 1 shows the breakdown of the total course grade. EC1 and EC3 were paper-based class assessments on topics covered in that month alone. EC2 (a mid-semester exam) and the comprehensive exam were scheduled written exams that covered topics taught up to those points. A common examination format, suggested by the institution, was followed for EC2 and the comprehensive exam in all semesters (Table 2).

The CO2015 students were trained using frequent in-class quizzes as described above. For the CO2016 students, Kognify was used as a replacement for the in-class quizzes.

Implementation of Kognify workouts for GO-II

Each CO2016 student was given a free password-protected user account. An initial training session on the use of Kognify was held on the school premises on July 18, 2016. Students were invited to perform timed workouts two to five days per week from July 19 to Sept. 16, 2016. Faculty added multiple-choice questions on a regular basis and tagged the questions with their topics and learning objectives. Each workout comprised five to 10 questions, and both the questions and the answer choices were presented in random order. The questions reflected topics covered in class that week (classes were held every Monday), so the concepts and problem-solving techniques delivered in class were revisited through the workouts.
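
The randomization just described can be sketched in a few lines of R. This illustrates the general technique (drawing a random subset of tagged questions and shuffling answer choices), not Kognify's actual implementation; the question bank below is a placeholder.

    # Draw a random workout from a tagged question bank and shuffle
    # answer choices (illustrative placeholder data, not the study's bank).
    set.seed(1)  # for a reproducible example

    question_bank <- data.frame(
      id    = 1:40,
      topic = rep(c("Prisms", "Thick lenses"), each = 20)
    )

    make_workout <- function(bank, week_topic, n = sample(5:10, 1)) {
      pool <- bank[bank$topic == week_topic, ]
      pool[sample(nrow(pool), n), ]        # questions in random order
    }

    workout <- make_workout(question_bank, "Prisms")
    sample(c("A", "B", "C", "D"))          # answer choices in random order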

Students with Android phones performed the workouts on their phones, while the rest used laptop or desktop computers. Students could log in multiple times during each workout and had the option of re-attempting questions until they submitted the workout or the time expired. The time allotted for the workouts was liberal, and students were allowed to refer to books. Individual doubts were clarified via e-mail and WhatsApp messaging. Students could track their scores via the summary reports generated in their individual user accounts. Once most of the students had completed a workout, the assessment with the answer key was e-mailed to them for future reference. Students were encouraged and reminded to take the workouts, but no incentives were given for completing them, and neither compliance nor performance in the workouts counted toward final scores in any way.

If a student needed a repeat workout due to absence, a power outage, accidental logout or any other reason, it was arranged. If faculty felt that a student had spent unusually little time on a workout or had performed poorly (scoring less than 50%), a repeat workout was arranged for that student. A repeat workout consisted of the same assignment with the questions and answer choices re-randomized, and a workout could be repeated only once. Table 3 provides details regarding the number of students who took repeat workouts. Before the mid-semester exam (EC2), a review workout was given for practice.

Students used Kognify from mid-July to mid-September 2016, until the mid-semester exam (EC2), which included the subject matter covered for EC1 (involving significant mathematical calculations, formulas and important concepts) (Table 4). Kognify was not employed in the period before the end-of-semester comprehensive exam.

The Institutional Review Board (IRB) considered the study proposal and declared it exempt from formal IRB approval.

Statistical analysis

Data analysis and plotting of graphs were performed in RStudio (R version 3.3.2, The R Foundation for Statistical Computing).36 The normality of the data distributions was tested using the Shapiro-Wilk test. First, GO-I scores for the two classes were compared to establish similar academic ability between the classes: the GO-I EC2 and comprehensive exam scores of CO2015 and CO2016 were compared using the Mann-Whitney U test. Then, GO-II scores for the mid-semester exam (EC2) and final comprehensive exam of CO2015 were compared with those of CO2016 using a one-sided Mann-Whitney U test, which evaluated the alternative hypothesis that the GO-II scores of CO2016 were better than those of CO2015. A p-value of 0.05 was considered statistically significant in all analyses.
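
For concreteness, the tests named above map onto base R as follows (wilcox.test is R's implementation of the Mann-Whitney U test). The score vectors here are placeholders; the study's actual exam data are not reproduced.

    # Placeholder GO-II EC2 score vectors for the two classes.
    go2_ec2_co2015 <- c(55, 58, 59, 61, 62, 63, 64, 66, 68, 70)
    go2_ec2_co2016 <- c(65, 67, 69, 71, 72, 73, 74, 75, 76, 78)

    shapiro.test(go2_ec2_co2016)   # Shapiro-Wilk test of normality

    # One-sided Mann-Whitney U test of the alternative hypothesis
    # that CO2016 scores exceed CO2015 scores.
    wilcox.test(go2_ec2_co2016, go2_ec2_co2015, alternative = "greater")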

EC1 and EC3 were not separately analyzed because they were not assessed in a structured and common examination format across classes and semesters. Practical exam scores were also not compared separately.

Results

Normality was tested using the Shapiro-Wilk test for the comprehensive exam and EC2 (mid-semester exam) scores across all semesters. Distributions deviated from normality for the GO-I comprehensive exam in CO2016 (W=0.916, p=0.012) and for GO-II EC2 in CO2016 (W=0.911, p=0.009); because several score distributions in CO2016 were not normal, nonparametric statistics were used.

The academic skills of CO2016 and CO2015 were similar, as indicated by the absence of a statistically significant difference in their GO-I scores [Mann-Whitney U test: EC2 exam (Med. diff.=0 marks, W=468.5); comprehensive exam (Med. diff.=3.50 marks, W=500.5); p>0.05 (not significant) in both cases]. Table 5 summarizes the means, standard deviations, medians and 95% confidence intervals for the mid-semester exam (EC2) and comprehensive exam scores in GO-I and GO-II for CO2015 and CO2016.

Figure 1. Mid-semester exam (EC2) and comprehensive exam (Comp) scores in Geometric Optics-I and Geometric Optics-II for CO2015 and CO2016.

Figure 1 presents box plots of the scores obtained in GO-I and GO-II for both classes. Better scores are seen in the class that used Kognify compared with its CO2015 counterpart, which used conventional practice methods [Table 5: Mann-Whitney U test: EC2 exam (Med. diff.=2.63 marks, W=369, p=0.0193); comprehensive exam (Med. diff.=7.31 marks, W=210, p<0.0001)].

Discussion

The results of the study suggest that Kognify improved students' performance over the conventional method of weekly on-paper quizzes, as evidenced by the substantially better performance of the CO2016 students on their GO-II exams compared with that of the CO2015 students.

Concepts and problem-solving skills acquired in the GO-I and GO-II courses lay a strong foundation for other subjects such as visual optics, contact lenses, optometric optics, dispensing optics and low vision aids. Thus, reviewing concepts and practicing problem-solving are essential. Remote faculty interaction with Kognify makes these goals achievable outside of time-constrained classroom hours. Apart from setting up the workouts and reviewing performance reports daily or weekly, faculty communicate with students individually about needed areas of improvement via e-mail, SMS and WhatsApp. This reduces dependency on additional teaching staff for routine evaluation; teaching assistants may instead be trained to set up Kognify workouts and analyze students' performance.

Kognify provides instant feedback, a feature of great benefit to students and teachers, as with other online homework systems.8,15,21,27,39-41 In addition, Kognify provides a summary report of student performance across all topics covered. Feedback from Kognify coupled with offline comments from faculty increased student motivation to study, as reported by students during informal conversations with faculty.

Kognify also helps to build rapport between faculty and students. Students who were generally hesitant to seek help in the classroom were given a platform from which to reach out to faculty or peers on a regular basis. Such benefits of online homework systems have been reported previously as well.4,5

Adoption of Kognify for the current study went smoothly except for a couple of instances. One student forgot her password and it had to be reset. Also, despite several reminders, 50% of the students (17 of 34) missed one or more workouts, and one student, due to health issues, completed only 9 of the 27 workouts. Students who missed two or fewer workouts had better scores on the mid-semester (EC2) and comprehensive exams, but this negative correlation was weak (Spearman's rank correlation, ρ = -0.24) (Figure 2). Many factors influence exam performance; therefore, additional studies could be conducted to identify, based on skill and motivation level, the population that would benefit most from online homework.
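
The reported association can be checked with base R's cor.test, as in this minimal sketch; the vectors below are placeholders for workouts missed and EC2 scores, not the study data.

    # Spearman rank correlation between workouts missed and EC2 score
    # (placeholder data; a negative rho means more misses, lower scores).
    missed <- c(0, 1, 2, 3, 4, 5, 6, 8, 10, 18)
    ec2    <- c(74, 76, 69, 72, 66, 70, 64, 67, 61, 58)
    cor.test(missed, ec2, method = "spearman")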

Figure 2. Scatter plot of Geometric Optics-II mid-semester exam (EC2) scores for batch CO2016 vs. number of Kognify workouts missed.

In the current study, students were given the liberty of multiple logins and repeat workouts. Although encouraged to take workouts without any help, they had the opportunity to re-learn a concept and redo a question. Each student has a different learning strategy, which can influence exam performance. Future studies could evaluate the effect of study habits (e.g., average time spent on workouts, number of repeat attempts and performance on online workouts) on final exam performance, as has been done elsewhere.43-45

Students were initially highly motivated to perform the Kognify workouts, but participation dropped over the weeks; with the burden of other subjects and activities, they needed more reminders to complete their tasks. Nevertheless, the minimum compliance was 61% (21 of 34 students participated in all workouts). To improve participation, timely completion of Kognify workouts and workout scores could be considered in determining final grades, as suggested elsewhere.46-48 Further, if students bear the cost of the Kognify subscription, better compliance may be expected. The dedication and motivation of faculty members, teaching assistants and students are important to the success of online homework systems, as reported earlier.4,13,49

Although this study took a quasi-experimental approach, it can pave the way for future prospective, randomized, controlled studies. A questionnaire to gauge students’ satisfaction with Kognify would be useful in the future.

Conclusion

This study suggests that online homework systems such as Kognify can be effective in training optometry students in problem-solving skills for geometric optics courses. Kognify can be useful for students facing qualifying exams, fellowship exams and board exams. As experienced in this study, Kognify can help faculty to plan classroom time to review concepts taught earlier and to clarify student questions before proceeding to the next lecture. It can also help students better understand the concepts with well-planned workouts that can be used anywhere, anytime.

Acknowledgments

We thank Sarala Arumugam for creating the user accounts and providing technical support for this project. We also thank the students in the class of 2013-2017 for their valuable feedback that prompted the study, the 2014-2018 students whose data were used in the analyses, and the 2015-2019 students who used Kognify.

References

  1. Bartlett JE, Reynolds KA, Alexander MW. A tool for online learning. Journal of Online Learning. 2000;11(3-4):22-24.
  2. Dufresne RJ, Mestre J, Hart DM, Rath KA. The effect of web-based homework on test performance in large enrollment introductory physics courses. Journal of Computers in Mathematics and Science Teaching. 2002;21(3):229-251.
  3. Arasasingham RD, Taagepera M, Potter F, Martorell I, Lonjers S. Assessing the effect of web-based learning tools on student understanding of stoichiometry using knowledge space theory. J Chem Educ. 2005;82(8):1251-1262.
  4. Arasasingham RD, Martorell I, McIntire TM. Online homework and student achievement in a large enrollment introductory science course. J Coll Sci Teach. 2011;40(6):70-79.
  5. Richards-Babb M, Drelick J, Henry Z, Robertson-Honecker JR. Online homework, help or hindrance? What students think and how they perform. J Coll Sci Teach. 2011;40(4):81-93.
  6. Gaffney MA, Ryan D, Wurst C. Do online homework systems improve student performance? Advances in Accounting Education. 2010;11:49-68.
  7. Wooten T, Dillar-Eggers J. An investigation of online homework: required or not required? Contemp Issues Educ Res. 2013;6(2):189-198.
  8. Burch KJ, Kuo YJ. Traditional vs. online homework in college algebra. Mathematics and Computer Education. 2010;44(1):53-63.
  9. Arora M, Rho Y, Masson C. Longitudinal study of online statics homework as a method to improve learning. J STEM Educ. 2013;14(1):36-44.
  10. Chak SC, Fung H. Exploring the effectiveness of blended learning in cost and management accounting: an empirical study. New Media, Knowledge Practices and Multiliteracies. 2015;13:189-203.
  11. Lazarova K. The role of online homework in low-enrollment college introductory physics courses. J Coll Sci Teach. 2015;44(3):17-21.
  12. Palocsay SW, Stevens SP. A study of the effectiveness of web-based homework in teaching undergraduate business statistics and decision sciences. Journal of Innovative Education. 2008;6(2):213-232.
  13. Halcrow C, Dunnigan G. Online homework in Calculus I: friend or foe? Problems, Resources, and Issues in Mathematics Undergraduate Studies. 2012;22(8):664-682.
  14. Taraban R, Anderson EE, Hayes MW, Sharma MP. Developing on-line homework for introductory thermodynamics. Journal of Engineering Education. 2005;94(3):339-342.
  15. Zerr R. A quantitative and qualitative analysis of the effectiveness of online homework in first-semester calculus. Journal of Computers in Mathematics and Science Teaching. 2007;26(1):55-73.
  16. Cuadros J, Yaron D, Leinhardt G. “One firm spot”: the role of homework as lever in acquiring conceptual and performance competence in college chemistry. J Chem Educ. 2007;84(6):1047-1052.
  17. Gebru MT, Phelps AJ, Wulfsberg G. Effect of clickers versus online homework on students’ long-term retention of general chemistry course material. Chemistry Education Research and Practice. 2012;13:325-329.
  18. Dufresne R, Mestre J, Doorn D, Janssen S, O’Brien M. Student attitudes and approaches to online homework. International Journal for the Scholarship of Teaching and Learning. 2010;4(1):1-19.
  19. Harris H. Electronic homework management systems: review of popular systems. J Chem Educ. 2009;86(6):691.
  20. Fynewever H. A comparison of the effectiveness of web‐based and paper‐based homework for general chemistry. Chem Educ. 2008;13:264.
  21. Malik K, Martinez N, Romero J, Schubel S, Janowicz PA. Mixed-methods study of online and written organic chemistry homework. J Chem Educ. 2014;91(11):1804-1809.
  22. Richards‐Babb M, Jackson JK. Gendered responses to online homework use in general chemistry. Chemistry Education Research and Practice. 2011;12(4):409‐419.
  23. Freasier B, Collins G, Newitt P. A web-based interactive homework quiz and tutorial package to motivate undergraduate chemistry students and improve learning. J Chem Educ. 2003;80(11):1344-1347.
  24. Ichinose C. Students’ perceptions when learning mathematics online: a path analysis. Journal of the Research Center for Educational Technology. 2010;6(2):78-93.
  25. Fisher L, Holme T. Using web-based databases in large-lecture chemistry courses. Chem Educ. 2000;5(5):269-276.
  26. Chamala R, Ciochina R, Grossman R, Finkel R, Kannan S, Ramachandran P. EPOCH: An organic chemistry homework program that offers response-specific feedback to students. J Chem Educ. 2006;83(1):164-169.
  27. Cole RS, Todd JB. Effects of web‐based multimedia homework with immediate rich feedback on student learning in general chemistry. J Chem Educ. 2003;80:1338‐1343.
  28. Bonham S, Beichner R, Deardorff D. Online homework: does it make a difference? Phys Teach. 2001;39:293‐296.
  29. Bonham SW, Deardorff DL, Beichner RJ. Comparison of student performance using web and paper-based homework in college-level physics. Journal of Research in Science Teaching. 2003;40(10):1050-1071.
  30. Cheng KK, Thacker B, Cardenas RL, Crouch C. Using an online homework system enhances students’ learning of physics concepts in an introductory physics course. American Journal of Physics. 2004;72:1447‐1453.
  31. Demirci N. Developing web-oriented homework system to assess students’ introductory physics course performance and compare to paper-based peer homework. Turkish Online Journal of Distance Education. 2006;7(3):105-119.
  32. Humphrey RL, Beard DF. Faculty perceptions of online homework software in accounting education. Journal of Accounting Education. 2014;32(3):238.
  33. Fecho GM, Althoff J, Hardigan P. Assessing student performance in geometrical optics using two different assessment tools: tablet and paper. Optometric Education. Fall 2016;42(1).
  34. Kollbaum PS, Jackson JM, Koh B. Development of an adaptable online optometric quiz site [Internet]. Proceedings of the American Academy of Optometry 91st Annual meeting; 2012 Oct 24-27; Phoenix (AZ); [cited 2017]. Program No. 125460. Available from: https://www.aaopt.org/detail/knowledge-base-article/development-adaptable-online-optometric-quiz-site
  35. Kognify [Internet]. Kognify assessment and skill development private limited, Chennai, Tamil Nadu, India; c2016. [Cited 2016]. Available from: https://www.kognify.com
  36. R Development Core Team. R: a language and environment for statistical computing [Internet]. Vienna (Austria): The R Foundation for Statistical Computing; 2016. Available from: https://www.R-project.org
  37. Brewer D, Becker K. Online homework effectiveness for underprepared and repeating college algebra students. Journal of Computers in Mathematics and Science Teaching. 2010;29(4):353-371.
  38. Pundak D, Maharshak A, Rozner S. Successful pedagogy with web assignments checker. Journal of Educational Technology Systems. 2004;33(1):67-80.
  39. Liberatore MW. Improved student achievement using personalized online homework for a course in material and energy balance. Chem Eng Educ. 2011;45(3):184-190.
  40. Axtell M, Curran E. The effects of online homework in a university finite mathematics course. In: Larsen S, Marrongelle K, editors. Proceedings of the 14th Annual Conference on Research in Undergraduate Mathematics Education; 2011 Feb 24-27; Portland (OR). (4):20-23.
  41. Carpenter J, Camp B. Using a web-based homework system to improve accountability and mastery in calculus. Proceedings of the 2008 ASEE Annual Conference and Exposition; 2008 Jun 22-25; Pittsburgh (PA); [cited 2008]. Available from: https://peer.asee.org/4426
  42. Brewer DS. The effects of online homework on achievement and self-efficacy of college algebra students [dissertation]. [Logan (UT)]: Utah State University; 2009. 228p. Available from: https://digitalcommons.usu.edu/etd/407
  43. Bowman CR, Gulacar O, King DB. Predicting student success via online homework usage. Journal of Learning Design (Science Education). 2014;7(2):47-61.
  44. Richards-Babb M, Curtis R, Georgieva Z, Penn JH. Student perceptions of online homework use for formative assessment of learning in organic chemistry. J Chem Educ. 2015;92(11):1813-1819.
  45. Mitchell JC, Mitchell JE. Using web-based homework to teach principles of microeconomics: A preliminary investigation. American Journal of Business Education. 2017;10(1):9-16.
  46. Parker LL, Loudon GM. Case study using online homework in undergraduate organic chemistry: results and student attitudes. J Chem Educ. 2013;90(1):37-44.
  47. Walvoord BE, Anderson VJ. Effective grading: A tool for learning and assessment in college. 2nd ed. San Francisco: Jossey‐Bass; 2010.
  48. Cannonier C, Chen DC, Smolira J. The effect of a homework grade cap in an introductory finance class [Internet]; c2015 [cited 2015 Aug 24]. Available from: https://ssrn.com/abstract=2652425
  49. Chan SH, Song Q, Rivera LH, Trongmateerut P. Using an educational computer program to enhance student performance in financial accounting. Journal of Accounting Education. 2016;36:43-64.

Disclaimer

Naveen Mahesh is the founder of Kognify Assessment and Skill Development, PL. While he was instrumental in providing the idea for this study, he was not involved in the data collection and analyses for the study. Execution of the study, including creation of the questions database and collection and analyses of data, was independent of any influence from the Kognify company. Varuna Kumaran and Dr. Krishna Kumar had no financial agreement with Kognify Assessment and Skill Development, PL, to conduct this study.

Appendix A. Syllabi for the Geometric Optics-I and Geometric Optics-II courses.


Varuna Kumaran [varuna_p@yahoo.com] is a visiting faculty member at the Elite School of Optometry, Chennai, Tamil Nadu, India, and has been teaching geometric optics since 2014. She is also a visiting faculty member at other optometry schools, where she is involved with research projects.

Dr. Kumar is the Principal at the Elite School of Optometry, Chennai, Tamil Nadu, India, and has various publications to his credit. He specializes in the areas of low vision aids and occupational optometry.

Naveen Mahesh is Managing Trustee with Headstart Learning Centre International, Tamil Nadu, India. He is a serial entrepreneur and has founded many successful initiatives such as Headstart Learning Centre (IGCSE School), Explorers Basketball Club, Militvaa (entrepreneurship challenge), Karthavyam (public problem-solving diploma), Elina (integrated services for special education), Beyond 8 (ecosystem for continuous learning) and Kognify (learning with understanding). Mr. Mahesh spent many years in the United States before becoming interested in education and learning in India. What he initially started as learning experiments in schools 15 years ago has become a habit of innovation in education redesign. He is passionate about getting schools to meet the global capacity challenge using emerging innovations and dynamic solutions.