
Optometric Education

The Journal of the Association of Schools and Colleges of Optometry

Optometric Education: Volume 46 Number 1 (Fall 2020)

PEER REVIEWED

Using an Audience Response System in a First-Year Optics Course: Student Perceptions and Outcomes

Frank Spors, EurOptom, MS, PhD, Dorcas K. Tsang, OD, Krystle Golly, MS, and Joseph Gray, OD

Abstract

Background: This study assessed student perceptions and outcomes of using the Top Hat audience response system (ARS) in conjunction with small-group discussions to facilitate active learning in a first-year geometrical and ophthalmic optics course. Methods: Throughout each 90-minute lecture, 5-10 ARS questions were posed to focus the students’ attention and to provide formative feedback on the students’ comprehension of the material. The students used their mobile devices to answer the questions following short, small-group discussions. At the conclusion of the course, an anonymous survey was administered to evaluate student experiences with the use of the Top Hat ARS. In addition, the scores of 36 final examination items were compared with the scores from the previous academic year, during which no ARS was utilized. Results: Most students stated that ARS questions were beneficial and helped them maintain attention during class, and they therefore favored this instructional strategy. They preferred 3-10 multiple-choice questions during most or all 90-minute lectures. When examination scores of 36 questions administered to different cohorts enrolled in the same course in two successive years were analyzed, the group working with the Top Hat ARS during lectures showed a small but statistically significant mean improvement (P = 0.021). Conclusions: Utilization of the Top Hat ARS in conjunction with small-group discussions during face-to-face lectures is a valuable teaching tool, which helps students maintain attention and may improve academic performance.

Key Words: Top Hat audience response system, ARS, small-group discussions, active learning

Background

Face-to-face lectures are often used for delivering significant amounts of content to large groups of students. The effectiveness of one-way lectures that do not actively engage students is limited. Adults passively attending a lecture have a maximum attention span of 15 to 20 minutes, and research has shown a low rate of information retention when audience members are passive participants in the learning process.1-3 On the other hand, students who actively engage during lecture — who interact with one another as well as with the instructor — have a better understanding of the lecture material, retain it longer, and are more able to apply the concepts in other contexts.4 While the ability to maintain attention in class may depend on the student’s level of motivation, interest and cognitive processing aptitude, attention itself may be influenced by the presentation of the lecture material, over which the instructor exerts full control.5,6 To actively engage students during lecture, many instructors have implemented innovative technologies.3

These technologies include audience response systems (ARS), which have been increasingly used in education. An ARS is a combination of hardware and software that enables the instructor to pose real-time questions to students. The required hardware may be physical “clickers,” or the responder’s smartphone, laptop or tablet. During the live presentation of class materials, students are prompted to answer questions using their response devices. As responses are submitted via the software portal, the software instantaneously analyzes and displays a histogram showing the distribution of responses among answer choices. This particular tool not only assesses student learning to provide the instructor immediate feedback, but also promotes participation and engagement essential in active learning. Therefore, an ARS is beneficial to both the students and the instructor during the ongoing lecture.7 The instructor can instantly decide if a topic needs to be reviewed or if the lecture can proceed, while the students can self-reflect on their level of understanding of the concept just covered.8 Compared with traditional lectures, sessions using an ARS allow implementation of a variety of question types, including multiple-choice, multiple-response, fill-in-the-blank, matching, ordering, free-text response and hot-spot (click-on-target) picture questions. When used appropriately, this takes advantage of the valuable face-to-face time in class and can support active learning by requiring students to be engaged in higher-order thinking and analysis to synthesize and evaluate the materials presented to them in the classroom.9 According to Premkumar and Coupal, a 90-minute lecture should have a minimum of 5-6 engagement questions.10 Approximately 3-4 minutes should be devoted to the administration of each question, its response collection and any follow-up discussion of the correct solution. 

The utilization of the ARS provides an opportunity to improve the quality of analysis and discussion of lecture content, while at the same time providing immediate feedback to the course instructor.11 Therefore, it can play a vital role in transforming didactic teacher-centered lectures into interactive learner-centered environments, where students can engage in peer discussions and collaborative learning.12 To efficiently use an ARS throughout a lecture, a selection of assessment questions can be utilized. The three primary categories of ARS assessments are: factual recall, conceptual understanding and knowledge application. Factual recall assessments are often used to determine if students have done assigned readings or paid attention to rules and concepts stated during the lecture. Conceptual assessments require students to create answers rather than recall them and tend to generate more substantive discussions. Knowledge application assessments require analysis of concepts in different contexts and lead to a higher order of learning.3

Despite its great potential, using an ARS per se does not guarantee improved face-to-face lectures, actively engaged students or improved student learning. In a 2004 commentary, biology professor William B. Wood noted, “Like any technology, these systems are intrinsically neither good nor bad; they can be used skillfully or clumsily, creatively or destructively.”13 The implementation of pedagogical strategies in combination with ARS technology is what ultimately influences student success.14 The reasoning for using an ARS should be explained to the students at the beginning of the course. Conceptual questions, each focusing on key points of the lecture, need to be prepared prior to the lecture, and students should be motivated to engage with the material and to answer the questions. In this context, the discussion of the presented ARS questions among the students is important. While the instructor strives to present the material as clearly as possible during an ongoing lecture, some students might still misunderstand or analyze the content incorrectly. On other occasions, students might understand the lecture material but misunderstand an ARS question. Both problems may be corrected by the students themselves during small-group discussions, a process that was introduced as Peer Instruction by physics professor Eric Mazur.15

Mazur’s Peer Instruction method is a three-step process. The first step involves introducing an ARS question for formative assessment and having the students submit their responses individually. Students are then asked to engage in small-group discussions to convince each other of the correctness of their own answer by explaining their underlying analysis. In the last step, the instructor presents the same question, polls the students again, and provides support for the correct solution.16 Peer Instruction is a proven method to improve student learning for the mastery of lecture material.17 It is, however, time-consuming and requires that students complete reading assignments on the lecture topics before coming to class. An alternative approach is to have students engage in small-group discussions right after the question is assigned, which is then followed by one-time voting by each student.14,18,19 Unlike asking informal questions during a lecture, which typically engages only a few highly motivated students, an advantage of answering questions following small-group discussions is that it involves every student. In this context, the use of an ARS is vital for soliciting responses of all students in the class.

A substantial amount of literature documents improvements in student motivation and engagement in higher education due to the use of ARS.3,7,8,11-14,18,20-26 Although students tend to prefer a teaching style that incorporates an ARS during lecture, reports on whether this has a positive effect on examination grades have shown mixed results.2,18-20,24-29 To examine the potential impact of an ARS on students’ performance on graded tests and to optimize its use in class, students’ input regarding their perceptions of the ARS in the learning process is crucial. To gather such input, the study reported here involved first-year optometry students who participated in a geometrical and ophthalmic optics course during which an ARS was utilized in every 90-minute lecture.

The purpose of this study was two-fold. The first purpose was to examine student experiences with the Top Hat ARS, including feedback regarding the administration of ARS questions throughout the course. The second purpose was to compare cohort performance on examination items with performance in the same course taught the previous year, during which no ARS was used.

Methods

Audience response system

The ARS used during this study is a web-based platform called Top Hat (Tophatmonocle Corp., Toronto, Canada). It leverages the mobile devices that students already own, such as smartphones, laptops and tablets. The Top Hat platform does not require students to purchase clickers. Instead, they purchase a license to use the software on their mobile devices. At the time this paper was written, the regular pricing was $48 per student for one academic year, but special pricing may apply based on institutional agreements. In addition to ARS questions, the software can host all lecture materials, including presentations, text documents and videos. Therefore, it is possible to quickly set up questions, move them within presentations if needed, and administer lectures entirely from within the Top Hat platform. Because a gradebook function is included, the software can host entire courses. Each session or quiz has a unique Join Code. If required, geofencing can be enabled by the instructor for each session to ensure that only students who are physically in attendance can participate. Typically, a student roster is linked from the learning management system to the Top Hat platform, which enables a variety of functions at the discretion of the instructor, such as formative and summative assessments, segmentation of questions, targeted item analysis, attendance tracking and syncing of certain gradebook items. In addition, extensive reports and analyses can be generated for monitoring individual student performance throughout a course, which is helpful in identifying at-risk students. The Top Hat software allows questions to be delivered in multiple ways: during lecture, assigned as homework, or assigned for review. Figures 1-3 show the in-class interface.

Figure 1. Example of what students see prior to the display of lecture slides and audience response questions. The Join Code is unique for each presentation or quiz and is only required at the beginning of a session. The instructor can start or cancel the presentation from this screen and take attendance. The display format can be customized via the options at the bottom right.

Figure 2. Example of how a multiple-choice question is displayed during administration. A question timer can be added when the question is constructed or when it is administered via the button in the top right corner. The bottom navigation bar allows the instructor to open, skip or close the question; to show the students’ responses after the question is closed; and to display the correct answer.

Figure 3. Example of a hot-spot (click-on-target) question after the students have submitted their responses. A warmer color indicates an increased number of responses. The area for the correct response can be defined during the construction of the question.

Participants and course

Eighty-four first-year optometry students at Western University of Health Sciences College of Optometry, Pomona, Calif., used their smartphones, laptops or tablets as their ARS response devices to answer questions strategically placed in each 90-minute lecture of a geometrical and ophthalmic optics course. The semester-long course was the first in a sequence of four optics courses and was delivered in a traditional face-to-face format. The slide decks of all lectures were uploaded into the Top Hat platform, so the questions could be placed within each slide deck. Attendance in the course was mandatory, and the responses to the ARS questions served as the attendance tracker. This was the first encounter the students had with an ARS in the optometry program.

ARS questions and small-group discussions

At the beginning of the course, the instructor explained that ARS questions would be used for formative assessment in conjunction with small-group discussions, and demonstrated how to access and use the Top Hat platform. Throughout the course, the instructor inserted 5-10 ARS questions into each lecture to focus students’ attention and to provide feedback on students’ comprehension of the material. The questions were designed as conceptual questions and focused on the different key concepts of the ongoing lecture. Each question was administered directly following the lecture coverage of a particular key concept. When a question was presented, the students had 2-3 minutes to discuss possible solutions with their immediate neighbors, come to a conclusion, and individually submit their answers. Afterwards, the correct answer was displayed and explained by the instructor, so that a total of 3-4 minutes was spent on each question. The instructor then moved on to the next lecture topic.

Utilizing the ARS served as a tool for the instructor to gauge the pace of each lecture and to provide additional in-class examples and explanations when indicated. Because the students responded to questions with their own mobile devices and were not restricted by the limited functionality of physical clickers, a variety of question types could be utilized, although most questions were in multiple-choice and multiple-answer format. Depending on the question, 4-6 answer choices were presented. Occasionally, fill-in-the-blank, sorting/matching, free-response and hot-spot (click-on-target) questions (Figure 3) were administered. A breakdown of questions by type used throughout the course is shown in Table 1. The lecture material presented in each class was simultaneously broadcast to the students’ mobile devices.

Data collection and analysis

At the conclusion of the course, prior to releasing course grades, the instructor administered an anonymous survey to receive the students’ input and evaluate their experiences using the Top Hat ARS. The survey questions are shown in Table 2. In addition, the instructor identified a total of 36 questions that had been administered in the cumulative final examinations of the optics course during the current academic year as well as during the previous academic year. In that previous year, 85 students were enrolled in the course, and no ARS was utilized. The placement of the course within the academic year, its duration and the content covered were the same in both years. In addition, the same instructor taught the courses. The only difference was the utilization of the ARS.

A retrospective cohort analysis was used to compare the scores per question between the two student groups. After the Kolmogorov-Smirnov normality test was passed, a two-tailed paired samples t-test was performed to compare the average scores for each question to the scores from the prior academic year. A P-value less than 0.05 was considered to be statistically significant. Prism 7 software (GraphPad Software, San Diego, Calif.) was used for conducting the statistical analysis. The project was approved by Western University of Health Sciences’ Institutional Review Board (19/RFD/023 X19/IRB/061).
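The per-question comparison described above can be sketched in a few lines. The scores below are hypothetical placeholders (the study's 36 item scores are not reproduced in the paper); the function computes the two-tailed paired-samples t statistic on the per-question differences, using only the Python standard library.

```python
import math
import statistics

def paired_t(before, after):
    """Paired-samples t statistic on per-question score differences.

    Returns the mean difference, its sample SD, and t (df = n - 1).
    """
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)        # mean improvement per question
    sd_d = statistics.stdev(diffs)         # sample SD of the differences
    t = mean_d / (sd_d / math.sqrt(n))
    return mean_d, sd_d, t

# Illustrative per-question percentage scores (hypothetical, not study data)
no_ars   = [82.0, 75.5, 91.0, 68.0, 88.5, 79.0, 85.0, 94.0]
with_ars = [86.5, 80.0, 92.5, 74.0, 90.0, 84.5, 86.0, 95.5]

mean_d, sd_d, t = paired_t(no_ars, with_ars)
# |t| is compared against the two-tailed critical value t(0.975, df);
# for the study's 36 items (df = 35) that value is approximately 2.03.
```

In practice a statistics package (e.g., Prism, as in the study) also reports the exact P-value; the sketch only illustrates the arithmetic behind the test.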

Results

Seventy-three first-year optometry students participated in the survey. Of those, 94.5% favored an instructional strategy that incorporated an ARS during lectures; 93.2% reported that the ARS questions helped them to maintain attention during class; 87.7% of the students recommended using an ARS in most or all lectures; and 94.5% preferred 3-10 ARS questions during a 90-minute lecture. Although a range of question types was available, 96% of the students in this cohort preferred multiple-choice questions. The detailed student response distributions to the survey questions are depicted in Figure 4.

When examination scores of 36 questions administered to different cohorts enrolled in the same course in two successive years were analyzed, the group working with the Top Hat ARS during lectures showed a small but statistically significant mean improvement of 3.7% (SD 9.3), from 83.6% (SD 15.0) to 87.3% (SD 11.1) (P = 0.021, two-tailed paired-samples t-test, t = 2.41). In addition, with the ARS, no question scored below 50%, whereas two questions had scored below 50% when no ARS was used. The number of questions scoring 90% or higher increased from 21 without the use of an ARS to 26 with the use of an ARS. This indicated that academically strong as well as academically weaker students benefited from the intervention. The distribution of the scores is shown in Figure 5.

Figure 4. Distributions of student responses to survey questions regarding their experience with and suggestions for using the Top Hat ARS throughout the course.

Figure 5. Frequency distributions of average scores of 36 final examination items in the same course in two academic years. The average scores of each question were grouped into eight bins, starting at 25%. The x-axis values indicate the centers of each 10% bin. The gray bars show scores for the semester when no ARS was used. The red bars show scores for the semester when the Top Hat ARS was used.

Discussion

The instructor utilized the web-based Top Hat ARS in conjunction with small-group discussions during face-to-face lectures of a semester-long first-year geometrical and ophthalmic optics course. The goals were to improve student motivation and attention during lectures; to provide immediate feedback to the instructor concerning student comprehension; and to improve student performance on the cumulative final course examination. Based on student feedback, the instructor’s experiences throughout the course, and improved examination scores compared to the prior academic year during which no ARS was utilized, the instructor believes the goals were met.

The students’ feedback regarding the ARS was overwhelmingly positive, and most students believed that its use helped their in-class attention, which is one of the fundamental requirements for effective learning. This is supported by other studies, which reported that students find an ARS during lecture generally helpful and feel that it stimulates participation, engagement and interaction between students, and that it motivates them to learn.3,7,8,11-14,18,20-26 The students in this study recommended using 3-10 ARS questions during most or all 90-minute lectures. According to Premkumar and Coupal, one engagement question every 15-20 minutes should be administered to keep the attention of students during lecture, and a higher number of questions should be used if the purpose is formative assessment or review.10

Throughout the course, the administration and discussion of ARS questions took time and required careful planning of each lecture, assigning pre-lecture reading, and deciding on the types and number of ARS questions to be utilized. Active learning requires a high quality of participation, where students have an opportunity to interact with each other, the instructor and the lecture material.14 To allow a high quality of participation and to exercise the students’ ability to think critically, the instructor emphasized higher-order ARS questions and implemented small-group discussions. In addition, the “safe environment” that the Top Hat ARS provided through the anonymity of its in-class responses gave students the ability to test their knowledge without fear of judgement. It is known that utilizing an ARS during lectures encourages the participation of otherwise reluctant or “shy” students.18,21

Because course attendance was mandatory, using the ARS responses to track attendance served as an incentive for the students to engage in small-group discussions and to answer the questions. For courses without mandatory attendance, several studies have suggested associating a portion of the course grade with the use of an ARS to ensure a high level of participation.21,23,30 Throughout the course, utilizing ARS questions gave immediate feedback for the instructor to determine the level of understanding of the material by the students. This allowed further discussion and clarification of subjects when needed, and helped the instructor direct each lecture accordingly. Even though multiple types of questions were presented throughout the semester, most students preferred multiple-choice questions. One possible reason is that most of the presented questions were multiple-choice and the students became accustomed to this format. Another reason might be that, from a technological standpoint, it is easier to respond to a multiple-choice question because it requires only selecting the answer choice. In addition, standardized tests primarily utilize multiple-choice questions, and students might have preferred this question style to prepare for the course examinations. Furthermore, it is known that students perceive multiple-choice questions as assessing knowledge-based cognitive processing.31 Therefore, the instructor should ensure that, regardless of the type of question, higher levels of intellectual skills and abilities such as analysis, application and comprehension are also evaluated when designing assessment questions.

On 36 examination items, the student cohort using the Top Hat ARS during class showed an improvement in average question score from 83.6% (SD 15.0) to 87.3% (SD 11.1). In addition, the standard deviation of the score distribution decreased, which indicated that students engaged with the material through the ARS questions and became overall more proficient in correctly answering comparable types of questions when presented on the examination. Even though there are mixed opinions in the literature, several other studies reported similar outcomes. For example, studies by Poulis et al., Schackow et al., Yourstone et al., Mayer et al. and Levesque et al. reported that students who used an electronic ARS showed significantly higher scores on quizzes and examinations compared with control groups that did not use an ARS.2,20,26-28

Other studies did not always show an increase in test performance associated with the in-class use of an ARS, and several explanations have been discussed. Stoddard and Piquette reported that enhancing lectures by adding examination-style questions, and not necessarily a specific ARS technology itself, resulted in improved examination performance.29 Crossgrove and Curran found that the additional introduction of an ARS did not result in improved test performance because active learning strategies were already implemented in a prior administration of the course.24 In a study by FitzPatrick et al., the authors found that the implementation of an ARS improved student performance in introductory-level courses but not in senior-level courses, and suggested that students have mastered effective learning in senior-level courses.25 Because the study reported here also showed improved student performance in an introductory-level course, a future follow-up study on the same cohort in their advanced optics course may provide additional insights related to FitzPatrick et al.’s suggestion.

In addition, the study reported here found that the number of questions scoring below 50% decreased from two without the use of an ARS to zero with the use of an ARS. The number of questions scoring 90% or greater increased from 21 without use of an ARS to 26 with use of an ARS. This implied that academically strong as well as academically weaker students benefited from the intervention. In particular, the improvement of very low scores indicated that the ARS activity helped to correct conceptual misunderstandings or misinterpretation of assessment questions. The improved examination performance could be an indication of students experiencing increased confidence or improved capacity for solving problems, effects which are linked to ongoing formative feedback during class.24 Because this feedback can be efficiently delivered via the use of an ARS, it contributes to active learning.26

Some challenges and limitations were experienced during this study. First, it was not ascertained whether students had previous experience with an ARS platform or whether the ARS was challenging to use throughout the course. By the end of the semester, however, it was evident that the students overall had a positive experience using the software. In addition, different cohorts of students from two successive academic years were compared. Even though the average academic strength of the two cohorts varied little, these were different groups of students. Content, duration and placement of the courses were similar, and the same instructor taught the courses; however, it is not possible to exactly replicate the same course experience in different years.

Despite these limitations, the study results suggested the Top Hat ARS was a valuable tool for keeping students engaged in lectures throughout the course and for facilitating better performance on graded examination questions. A variety of factors likely contributed to this improvement. First, the students were consistently stimulated to pay attention to the ongoing lecture and interact with the material when ARS questions were posed. Second, the small-group discussions provided input from the students’ peers. Third, the correct answer for each ARS question was explained by the instructor before moving on to the next lecture topic. Fourth, having ARS questions in every lecture fostered the habit of problem-solving at the time of presentation.

Conclusion

The incorporation of an audience response system into face-to-face lectures can be a valuable teaching tool in facilitating active learning. It provides an opportunity to improve students’ quality of analysis and discussion of lecture content, and at the same time gives immediate feedback to the course instructor. In addition, the ARS allows students to test their knowledge without fear of judgement, and therefore encourages the participation of otherwise reluctant students. Because the implementation of the ARS occupies lecture time, it is necessary to carefully plan each session, assign pre-reading, and decide on the types and number of ARS questions to be utilized. To promote broad and sustained participation in ARS activities, an incentive may be provided, such as associating a portion of the course grade with the use of an ARS. To test the effects of utilizing an ARS in senior-level courses, a future consideration is to evaluate its effects in a more advanced optics course.

Disclosure

The authors report no conflicts of interest related to this work.

References

  1. Picciano A, Winter R, Ballan D, Birnberg B, Jacks M, Laing E. Resident acquisition of knowledge during a noontime conference series. Fam Med. 2003 Jun;35(6):418-22.
  2. Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: effect on learning in family medicine residents. Fam Med. 2004 Jul;36(7):496-504.
  3. Collins LJ. Livening up the classroom: using audience response systems to promote active learning. Med Ref Serv Q. 2007 Jan;26(1):81-8.
  4. Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, et al. Scientific teaching. Science. 2004 Apr;304(5670):521-2.
  5. Pashler HE. The psychology of attention. Cambridge, MA: MIT Press; 1999. 494 p.
  6. Wilson K, Korn JH. Attention during lectures: beyond ten minutes. Teach Psychol. 2007 Jun;34(2):85-9.
  7. Gousseau M, Sommerfeld C, Gooi A. Tips for using mobile audience response systems in medical education. Adv Med Educ Pract. 2016 Dec;7:647-52.
  8. Pate CB, Steele EA. Initial experiences with an audience response system in the optometric classroom. Optometric Education. 2009 Feb;34(2):71-7.
  9. Bonwell CC, Sutherland TE. The active learning continuum: choosing activities to engage students in the classroom. New Dir Teach Learn. 1996 Sep;1996(67):3-16.
  10. Premkumar K, Coupal C. Rules of engagement–12 tips for successful use of “clickers” in the classroom. Med Teach. 2008 Jan;30(2):146-9.
  11. Thomas CM, Monturo C, Conroy K. Experiences of faculty and students using an audience response system in the classroom. Comput Inform Nurs. 2011 Jul;29(7):396-400.
  12. Cain J, Black EP, Rohr J. An audience response system strategy to improve student motivation, attention, and feedback. Am J Pharm Educ. 2009 Apr 7;73(2):21.
  13. Wood WB. Clickers: a teaching gimmick that works. Dev Cell. 2004 Dec;7(6):796-8.
  14. Kay RH, LeSage A. A strategic assessment of audience response systems used in higher education. Australasian Journal of Educational Technology. 2009 May;25(2):235-49.
  15. Mazur E. Peer instruction: a user’s manual. Upper Saddle River, NJ: Prentice Hall; 1997. 254 p.
  16. Crouch CH, Mazur E. Peer instruction: ten years of experience and results. Am J Phys. 2001 Sep;69(9):970-7.
  17. Cortright RN, Collins HL, DiCarlo SE. Peer instruction enhanced meaningful learning: ability to solve novel problems. Adv Physiol Educ. 2005 Jun;29(2):107-11.
  18. Sharma MD, Khachan J, Chan B, O’Byrne J. An investigation of the effectiveness of electronic classroom communication systems in large lecture classes. Australasian Journal of Educational Technology. 2005 Jun;21(2):137-54.
  19. Reay NW, Bao L, Li P, Warnakulasooriya R, Baugh G. Toward the effective use of voting machines in physics lectures. Am J Phys. 2005 Jun;73(6):554-8.
  20. Poulis J, Massen C, Robens E, Gilbert M. Physics lecturing with audience paced feedback. Am J Phys. 1998 May;66(5):439-41.
  21. Greer L, Heaney PJ. Real-time analysis of student comprehension: an assessment of electronic student response technology in an introductory earth science course. J Geosci Educ. 2004 Sep 1;52(4):345-51.
  22. Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med. 2005 Jan;37(1):12-4.
  23. Caldwell JE. Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ. 2007 Mar;6(1):9-20.
  24. Crossgrove K, Curran KL. Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ. 2008 Mar;7(1):146-54.
  25. FitzPatrick KA, Finn KE, Campisi J. Effect of personal response systems on student perception and academic performance in courses in a health sciences curriculum. Adv Physiol Educ. 2011 Sep;35(3):280-9.
  26. Levesque AA. Using clickers to facilitate development of problem-solving skills. CBE Life Sci Educ. 2011 Dec;10(4):406-17.
  27. Yourstone SA, Kraye HS, Albaum G. Classroom questioning with immediate electronic response: do clickers improve learning? Decis Sci J Innov Educ. 2008 Jan;6(1):75-88.
  28. Mayer RE, Stull A, DeLeeuw K, et al. Clickers in college classrooms: fostering learning with questioning methods in large lecture classes. Contemp Educ Psychol. 2009 Jan;34(1):51-7.
  29. Stoddard HA, Piquette CA. A controlled study of improvements in student exam performance with the use of an audience response system during medical school lectures. Acad Med. 2010 Oct;85(10 Suppl):37-40.
  30. Burnstein RA, Lederman LM. Using wireless keypads in lecture classes. Phys Teach. 2001 Jan 1;39(1):8-11.
  31. Scouller K. The influence of assessment method on students’ learning approaches: multiple choice question examination versus assignment essay. Higher Education. 1998 Jun 1;35(4):453-72.

Dr. Spors [fspors@westernu.edu] is an Associate Professor at Western University of Health Sciences College of Optometry. He teaches courses in optics and contact lenses for first-, second-, and third-year students. Dr. Spors is a fellow of the American Academy of Optometry and member of the Association for Research in Vision and Ophthalmology.

Dr. Tsang is an Associate Professor at Western University of Health Sciences College of Optometry. She teaches courses in contact lenses for second- and third-year students and sees patients at Western U Health. Dr. Tsang is a fellow of the American Academy of Optometry and member of the Association for Research in Vision and Ophthalmology.

Krystle Golly is experienced with educational technology and worked in administration at Western University of Health Sciences College of Optometry. She currently holds an administrative position with the County of San Bernardino in California.

Dr. Gray is an Assistant Professor at Western University of Health Sciences College of Optometry. He teaches courses in optics for first- and second-year students and sees patients at Western U Health. Dr. Gray is a member of the Association for Research in Vision and Ophthalmology.
