
Optometric Education

The Journal of the Association of Schools and Colleges of Optometry

Optometric Education: Volume 42 Number 2 (Winter-Spring 2017)

Student Performance and Perceptions Following Incorporation of Eyesi Indirect Simulators into the Optometric Curriculum

Heather A. Anderson OD, PhD, FAAO, Amber Gaume Giannoni OD, FAAO, and David A. Berntsen OD, PhD, FAAO

 

Abstract

Background: This study evaluates simulation for training binocular indirect ophthalmoscopy (BIO) skills. Methods: Students completed portions of the Eyesi Indirect (VRmagic) courseware in fall 2014 prior to completing a BIO skills assessment; performance was compared across the three years prior to (2011-2013) and the year of (2014) incorporation of simulation technology. Results: Scores did not differ across 2011-2013 (p = 0.153), but did differ when the comparison included 2014 (p = 0.032). The number of perfect scores in 2014 was greater than in prior years (p = 0.001). Conclusion: Incorporation of simulation technology had a positive impact on student preparedness to perform BIO on patients.

Key Words: binocular indirect ophthalmoscopy, simulation, optometric education, virtual reality, fundus examination

 

Background

Virtual reality patient simulators are an appealing tool in education programs for medical professionals. Simulators provide an opportunity for increased training in risk-free environments, which may result in better patient outcomes as trainees transition from the laboratory to actual clinical care. With respect to surgical simulators utilized in ophthalmological training, virtual patient simulators have been well-accepted by trainees,1 have been demonstrated to provide improvements in surgical outcomes in the operating room,2-4 and have even been validated for the assessment of clinical competencies for surgical certification.5

Image 1. A University of Houston College of Optometry student using the Eyesi Indirect simulator (VRmagic, Mannheim, Germany). Pictured are the touchscreen user interface, headband-mounted stereo display, handheld fundus lens and model patient face. Note that the student examiner views the patient images through the stereo display, while the touchscreen user interface simultaneously displays views for the onlooking instructor.

In addition to ophthalmological surgical simulators, non-surgical, retinal diagnostic simulators are now available in a virtual reality platform to train eyecare practitioners.6 The Eyesi Indirect and Eyesi Direct systems (VRmagic, Mannheim, Germany) utilize augmented virtual reality, which incorporates virtual views of the patient face and fundus blended with actual views of the surrounding environment, such as the examiner’s hand, to create a realistic examiner experience for binocular indirect ophthalmoscopy (BIO) and direct ophthalmoscopy (DO). The system hardware includes a touchscreen user interface and either a headband-mounted stereo display with two handheld fundus lenses (Eyesi Indirect) or a handheld direct ophthalmoscope display (Eyesi Direct). A model patient face interacts with the examination equipment to render fundus images (Image 1). Both the indirect and direct ophthalmoscope platforms include a software curriculum that trains the mechanics of retinal examination, recognition of normal anatomic findings, and understanding of an extensive library of pathological findings rendered from fundus photographs of actual clinical patients. In addition to disease detection and recognition, the software aids in the development of the diagnostic process by incorporating common diagnostic tests, such as visual fields and OCT scans, and furthers the understanding of patient management through a variety of clinical questions that are built into each case scenario.

In the fall semester of 2014, the University of Houston College of Optometry (UHCO) became the first optometric institution in the United States to equip a clinical skills simulation lab for the training of both retinal examination and diagnosis with the Eyesi Indirect and Direct simulators. While the potential for improvements in student preparedness through simulation training is evident from previous studies of surgical simulators, published reports of the translation of student clinician skills from simulated to live patients for retinal diagnostic techniques are limited.7

This study seeks to provide a retrospective report of student performance on a pre-clinic, live-patient BIO skills assessment in the years prior to and during the implementation of the clinical skills simulation lab. In addition, anonymous feedback from surveys of faculty and current and incoming students is summarized to detail perceptions of the effect of implementing a clinical skills simulation lab in the UHCO curriculum on student performance.

Methods

This study was approved by the Committee for the Protection of Human Subjects at the University of Houston and was classified as exempt from informed consent due to the educational nature of the research. For this retrospective analysis, data were compiled from the UHCO second-year students’ BIO skills assessments (described below) for fall semesters 2011, 2012 and 2013 (pre-implementation of simulation technology) and compared to second-year students’ scores during fall semester 2014 (post-implementation of simulation technology). Student training methods for BIO examination at UHCO are described below for both pre- and post-implementation of simulation technology.

Traditional BIO instruction

Historically, students at UHCO are first introduced to BIO examination in the spring of their first year during the Clinic Practicum II Lab. Model eyes are utilized to develop an understanding of the mechanics of obtaining views and documenting abnormalities. In addition, students complete one lab session during which they examine classmates with dilated pupils as a cursory introduction to performing BIO on a live patient. All evaluations of BIO examination skills are conducted on model eyes during the first year.

In the fall semester of the second year, students receive BIO training in the Clinic Practicum III Lab, which consists of five instructor-led lab sessions during which students practice examining dilated classmates for up to one hour at a time. In addition, students may attend weekly two-hour ‘open-lab’ sessions throughout the semester to practice any basic clinical skills they desire, including BIO on classmates with dilated pupils. These extra sessions are not instructor-led; thus, documentation regarding who attended or how many hours were devoted to BIO during these open-lab sessions is not available.

Incorporation of simulation training

In the summer of 2014, five Eyesi Indirect and five Eyesi Direct platforms were installed at UHCO as part of the new clinical skills simulation lab (simlab) for retinal examination and diagnosis. The Eyesi platforms include sequential courseware designed to educate students in: 1) obtaining fundus views and documenting findings (geometric shapes), 2) recognizing normal anatomical structures in the retina, 3) recognizing and managing common retinal pathologies, and 4) recognizing and managing pathology from advanced clinical cases. For the purposes of this study, only the students’ basic understanding of obtaining systematic views of the retina with BIO after completion of Eyesi Indirect Tier A (device handling and documentation) is assessed. This study does not investigate the clinical training benefits of the Eyesi Direct platform or of the pathology portions of the software on either platform.

At the beginning of the 2014 fall semester, all enrolled second-year optometry students (n = 100) were assigned individual simlab accounts and received two hours of hands-on training from representatives of VRmagic to orient them to the proper use of the equipment. Students were then required to complete and earn passing scores on all cases in Tier A: Examination Skills of the software (version 1.4). At the time of assignment, Tier A consisted of nine subsections: A1 Device Handling (easy), A2 Device Handling (medium), A3 Device Handling (difficult), A4 Device Handling (small pupil), A5 Retina Screening, A6 Retina Documentation (easy), A7 Retina Documentation (medium), A8 Retina Documentation (difficult), and A9 Retina Documentation (small pupil). In each subsection, the task required examining the virtual patient’s fundus and either ‘detecting’ geometric shapes by obtaining a stable view with a virtual cross-hair positioned on top of the shape, or viewing, remembering, and documenting observed geometric shapes by placing the appropriate shape with the proper size, orientation and location on a touchscreen fundus map. Case difficulty increased with the inclusion of more shapes of smaller size and more peripheral retinal location. On subsection A5: Retina Screening, users were scored on whether a thorough evaluation of the entire fundus was completed. Criteria for passing each case were set by the manufacturer. In total, the assignment included 57 unique cases.

Each student user’s performance scores, date of completion and time spent per case were stored locally on the simulator in use and then synchronized across all simulators, via a web-based cloud system created by the manufacturer, each time the student logged out of the software. This cloud system allowed students to continue their assignment on any available simulator throughout the semester. The data presented in this manuscript were extracted from the instructor web portal, which provides access to all synchronized data. A network cable failure early in the fall semester resulted in the loss of some data regarding time spent by users, which is acknowledged in the results; however, student completion status was not lost (i.e., students did not have to repeat completed assignments).

Students were given the simulation assignment the first week of class in August 2014 as part of the course requirements for their Clinic Practicum III Lab and were required to complete the assignment by Dec. 4, 2014. Simlab access was restricted to students who were enrolled in the associated course and was available 24 hours a day, 7 days a week via swipe card. A student representative was assigned by the coursemaster to create an online scheduler for the weekday hours between 9 a.m. and 6 p.m., with each student being assigned three separate 1.5-hour blocks of time in the simlab over the course of the semester. However, all students had the option to come in on their own time on evenings or weekends to finish the assignment sooner. Although the assignment was not due until December, students were strongly encouraged to complete it earlier with the expectation that it would likely assist in their preparation for the clinical skills exams administered in November. In addition to the new simulation assignment, in the fall of 2014, second-year students completed the same traditional BIO laboratory instruction that had been given in previous years and had the same opportunities for open-lab practice sessions as in past years.

BIO skills assessment

Student BIO skills were evaluated in a skills assessment administered in the Clinic Practicum III Lab during the weeks of November 17 and December 1. The timing of test administration and the content of the exam were the same as previously administered in the fall semesters of 2011, 2012 and 2013. The live patient exam was graded by one of the four instructors assigned to each lab section (four lab sections total with some overlap in instructors). The assessment had well-defined criteria (Appendix 1) and was scored as a percentage of points earned out of 26 total points. While there were numerous instructors participating in grading over the years, it should be noted that one of the authors of this manuscript (AG) was the coursemaster for the lab and graded approximately 25% of the student class each year, and a second author of this manuscript (DB) was an instructor in one lab section and graded approximately 6% of the student class each year.

We hypothesized that test scores would not differ in the years prior to implementation of the simlab, but would improve with the addition of simulation training in 2014. Given that test scores were not normally distributed (all years were skewed towards high performance), we used the non-parametric Kruskal-Wallis test (the non-parametric equivalent of a one-way ANOVA) to compare test scores first for the years prior to implementation of simulation (2011-2013), and then for all years, including the year of simlab implementation (2011-2014). The percentage of perfect scores for each year was compared using chi-squared tests.
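For illustration, the following is a minimal sketch (in Python, using SciPy) of the two comparisons described above. The score lists are hypothetical placeholders rather than the study data, and the contingency table of perfect versus non-perfect scores is one reasonable way to set up the chi-squared comparison.

from scipy.stats import kruskal, chi2_contingency

# Percentage scores on the 26-point BIO skills assessment, one list per cohort
# (hypothetical values for illustration only)
scores_2011 = [88.5, 92.3, 100.0, 96.2, 84.6]
scores_2012 = [84.6, 96.2, 100.0, 92.3, 88.5]
scores_2013 = [92.3, 88.5, 96.2, 100.0, 96.2]
scores_2014 = [100.0, 96.2, 100.0, 100.0, 92.3]

# Kruskal-Wallis: pre-simulation years only, then all four years
stat_pre, p_pre = kruskal(scores_2011, scores_2012, scores_2013)
stat_all, p_all = kruskal(scores_2011, scores_2012, scores_2013, scores_2014)

# Chi-squared on counts of perfect (100%) vs. non-perfect scores per year
cohorts = [scores_2011, scores_2012, scores_2013, scores_2014]
perfect = [sum(s == 100.0 for s in c) for c in cohorts]
imperfect = [len(c) - p for c, p in zip(cohorts, perfect)]
chi2, p_perfect, dof, expected = chi2_contingency([perfect, imperfect])

print(f"Kruskal-Wallis 2011-2013: p = {p_pre:.3f}")
print(f"Kruskal-Wallis 2011-2014: p = {p_all:.3f}")
print(f"Chi-squared, perfect scores: p = {p_perfect:.3f}")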

Assessment of perceptions: second-year student survey

On Nov. 7, 2014, an online survey link was sent to the second-year student class by the Clinic Practicum III Lab coursemaster, soliciting anonymous feedback regarding the new simlab. In the text of the e-mail, students were asked to complete the survey if they had finished at least half of the simulation assignment as of the date the survey was distributed. The survey consisted of three questions with five response options (strongly agree, agree, neutral, disagree, strongly disagree):

  1. Working through the BIO simlab curriculum has increased my understanding of how to systematically examine the retina and document my findings
  2. The opportunity to use the simlab has been an overall positive experience and a benefit to my optometric education
  3. Working through the BIO simlab curriculum has improved my ability to obtain BIO retinal views on actual student patients

Second-year clinic faculty survey

At the end of February 2015 (two months after second-year students transitioned to patient care in the University Eye Institute at UHCO), faculty members who had served as clinical attending doctors in the second-year clinic for at least the past three years were sent an invitation by the Clinic Practicum III Lab coursemaster to complete an anonymous online survey providing feedback about student performance and preparedness for second-year clinic. No mention was made in the survey invitation of the recent incorporation of simulation technology and/or its evaluation. Faculty responded to a series of questions, including two questions relevant to simulation technology:

  1. When thinking about your current OPT II students in the FPS (Family Practice Service) clinic, how do their BIO skills compare to prior OPT II students you’ve had AT THIS POINT IN THE SEMESTER? (better, the same, worse)
  2. When thinking about your current OPT II students in the FPS (Family Practice Service), how comfortable are they in performing BIO on patients compared to prior OPT II students you’ve had AT THIS POINT IN THE SEMESTER? (more comfortable, the same, less comfortable)

Incoming student survey

In February 2015, students who had already accepted admission to UHCO as part of the class of 2019 were invited to complete an online survey about their decision to attend UHCO. The survey was distributed via e-mail by a staff member in the Office of Optometry Relations. Students responded to a series of questions, including two questions relevant to simulation technology:

  1. What factors played into your decision to attend UHCO? (select as many as are applicable: UHCO’s facilities, UHCO’s Vision Source Surgery Center, Procedures Lab with video slit lamps, Clinical Skills Simulation Lab, College Location, Faculty Expertise, Interview Day Experience, or Other)
  2. How much did the clinical skills simulation lab influence your decision? (a lot, some, a little, none)

Responses to the three surveys were summarized as percentages of respondents to characterize perceptions of the simlab. No formal statistical analysis was conducted on the survey data.

Results

Simulation completion times

Student completion times for the components of Tier A in the Eyesi Indirect courseware are summarized in Table 1. Although all 100 students completed the entire simulation assignment, network connectivity issues early in the deployment of the technology resulted in the loss of stored completion-time data for some students. Therefore, total completion times are not available for every individual student. Based on the median completion times for each subsection from the available data, the total median completion time for subsections A1-A9 was approximately 8.5 hours. However, completion times varied widely across students, as can be seen in the minimum and maximum columns in Table 1.
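To make the derivation of this estimate concrete, the following is a minimal sketch (in Python) of summing per-subsection median times, an approach that still yields a total estimate when some students’ time records are missing. The data structure and values below are hypothetical placeholders, not the study data.

from statistics import median

# times[subsection] -> completion times in minutes for students with
# surviving records (hypothetical values for illustration only)
times = {
    "A1": [45, 50, 55], "A2": [48, 60, 52], "A3": [70, 65, 80],
    "A4": [40, 38, 50], "A5": [30, 45, 50], "A6": [55, 60, 58],
    "A7": [62, 70, 66], "A8": [75, 80, 72], "A9": [50, 55, 58],
}

# Sum of per-subsection medians: usable even when no single student has a
# complete set of time records (unlike a median of per-student totals)
total_minutes = sum(median(t) for t in times.values())
print(f"Estimated total median completion time: {total_minutes / 60:.1f} hours")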

Student performance

Inspection of student completion dates for the simulation assignment demonstrated that 88% of the class in fall 2014 completed the entire assignment (subsections A1-A9) by the time of their BIO skills assessment. The remaining 12% had completed, on average, 78% of the assignment by the time of their skills assessment (completion ranged from subsection A5 to A8 for these 12 students). Class performance on the fall semester second-year BIO skills assessment was not normally distributed and was consistently skewed towards high performance (Table 1, Figure 1), creating a potential ceiling effect for the detection of improvements in test scores following the introduction of simulation technology. Even in light of this constraint, the distributions of test scores differed significantly when comparing years 2011-2014 (Kruskal-Wallis, p = 0.032), whereas the comparison of years 2011-2013 (before simlab implementation) showed no significant difference (Kruskal-Wallis, p = 0.153), supporting our hypothesis that performance was stable in the years prior to simulation but improved with the addition of simulation training in 2014. Additionally, the number of perfect scores (100%) earned in 2014 was significantly greater than in the three years prior (χ2, p = 0.001), whereas no difference in the number of perfect scores was observed across the three years prior to simulation technology (χ2, p = 0.274) (Table 2).

Student perceptions

Student responses to an anonymous web-based survey distributed to the 100 second-year students during the fall 2014 semester are shown in Figure 2. Students were asked to complete the survey if they had finished at least half of the simulation assignment as of the date the survey was distributed (Nov. 7, 2014). Web portal information indicates that 93 students had completed at least subsections A1-A5 by that date; thus, if only those students responded, the response rate for the survey was 88%. Of those responding, more than 90% agreed or strongly agreed that the simlab curriculum had increased their understanding of performing BIO (Figure 2A) and that it was a positive overall experience (Figure 2B). More than three-fourths of the respondents agreed or strongly agreed that the simlab curriculum had improved their ability to obtain views on student patients (Figure 2C).

Faculty perceptions

Sixteen faculty responded to the survey about student performance in the second-year clinic during spring 2015. Of the 16 respondents, 71% felt students were better prepared on basic BIO examination skills when first entering clinic in 2015 than in past years. Also, 100% felt that students were more comfortable performing BIO on patients in 2015 than in past years.

Incoming student perceptions

Sixty-one students responded to the survey of incoming students regarding their decision to attend UHCO. When asked to select as many factors as applicable that influenced their decision to choose UHCO, 80% of respondents indicated that the clinical skills simulation lab was a factor. Regarding the magnitude of the influence of the simlab on their decision to attend UHCO, 22% rated its influence as a lot, 51% some, 20% a little, and 4% none.

Discussion

This study provides evidence that incorporation of simulation training with the Eyesi Indirect positively influenced the performance of students in the second-year class on their lab BIO skills assessment, as indicated by improved test scores. The study also indicates that incorporation of the simulation training positively influenced their performance in the clinic when seeing patients for the first time, as reported anonymously by second-year clinical faculty. The creation of a clinical skills simulation lab was also viewed positively by students currently utilizing the lab as well as by incoming students who had yet to utilize the technology.

BIO is often a challenging technique to master, both mechanically and mentally, due to the need to comprehend the image reversal created by the fundus lens. That being said, BIO is a technique that was already being taught with high success at UHCO, as evidenced by the test scores in Figure 1. Despite a potential ceiling effect of already strong scores limiting the room for improvement, students in fall 2014 still performed statistically better than students in preceding years.

The authors acknowledge that a second year of test scores from 2015 demonstrating continued stronger performance with simulation would be compelling; however, these data are unavailable due to changes in both the timing of simulation assignments in the curriculum and resultant changes to the BIO skills assessment, as follows. In the spring of 2015, to reduce the simlab load for future second-year students, first-year students were assigned portions of Tier A (A1, A2, A6, A7) to complete during March and April. This change in curriculum meant that students entered the second-year curriculum with greater exposure to BIO than the second-year students reported in this study. Early in the fall semester of 2015, the Clinic Practicum III coursemaster recognized that the new second-year students were much better prepared for BIO examination than in years past, and deemed the traditional BIO skills assessment (reported in this study) ‘too easy’. As a result, the assessment was altered to include full views in nine peripheral locations to better challenge and further assess students’ ability to perform BIO, leaving us unable to compare performance between fall 2014 and fall 2015.

Incorporation of simulation technology in the fall of 2014 resulted in a median simulation usage time of 8.5 hours, in addition to the traditional instructor-led practice time of approximately 5 hours. With a class of 100 students, that increase would have represented an additional 850 hours of participation from dilated patients had the practice been conducted on classmates rather than on the simulators. Not only did the incorporation of simulation technology offer more practice time for students without the concern of fatiguing their classmates, it also redirected the emphasis of BIO education away from a set number of hours of instruction per student and toward a set endpoint of competency, irrespective of the time it took to reach that goal. Because the endpoint was standardized and all students were required to pass and complete Tier A of the simulation courseware, students were able to work at their own pace rather than achieving the best outcome they could within a fixed number of hours. This shift in educational strategy is reflected in the fact that more students in the fall of 2014 attained perfect scores on their skills assessment.

While this study offers encouraging findings in support of simulation technology, it is not without its limitations. First, because this study was a retrospective evaluation across years rather than a randomized study of simulation versus non-simulation training, the training experiences of students could have varied from year to year. This possibility is limited, however, given that the same instructor served as coursemaster across all four years tested. In addition, student performance on the BIO skills assessment did not differ significantly across 2011-2013, suggesting that student instruction was likely uniform for the years prior to implementation of simulation.

A second limitation is the fact that two authors of this paper served as instructors scoring the BIO skills assessments each year. The potential for bias toward an improvement with simulation training exists given that the authors led the initiative to implement the simlab. However, the grading criteria of the BIO skills assessment were well-defined with binary outcomes rather than a subjective scoring scale (Appendix 1), and the majority of the skills assessments were scored by instructors who had no connection to the simlab, which would limit the potential for bias.

Lastly, the results of this study are limited to a discussion of the potential benefits of completion of Tier A in the Eyesi Indirect courseware. Tier A is designed to train basic examination skills (obtaining views and documenting findings) and does not address the anatomical structures of the retina or pathological findings. Further studies are needed to evaluate the potential training benefits of completion of the other portions of the courseware and its impact on clinical performance or standardized license examinations.

Conclusions

The incorporation of simulation technology into the BIO training curriculum at the University of Houston College of Optometry had a positive impact on the preparedness of second-year students to systematically obtain views of the retina of live patients in both the classroom and the clinic.

Disclosure

The authors of this manuscript have no conflicts of interest related to the Eyesi Indirect simulator.

References

  1. Koch F, Koss MJ, Singh P, Naser H. Virtuelle Realität in der Ophthalmologie [Virtual reality in ophthalmology]. Klin Monatsbl Augenheilkd. 2009;226:672-6.
  2. McCannel CA, Reed DC, Goldman DR. Ophthalmic surgery simulator training improves resident performance of capsulorhexis in the operating room. Ophthalmology. 2013;120:2456-61.
  3. Pokroy R, Du E, Alzaga A, Khodadadeh S, Steen D, Bachynski B, Edwards P. Impact of simulator training on resident cataract surgery. Graefes Arch Clin Exp Ophthalmol. 2013;251:777-81.
  4. Deuchler S, Wagner C, Singh P, et al. Clinical efficacy of simulated vitreoretinal surgery to prepare surgeons for the upcoming intervention in the operating room. PLoS One. 2016;11:e0150690.
  5. Thomsen AS, Kiilgaard JF, Kjaerbo H, la Cour M, Konge L. Simulation-based certification for cataract surgery. Acta Ophthalmol. 2015;93(5):416-21.
  6. Schuppe O, Wagner C, Koch F, Manner R. EYESi ophthalmoscope – a simulator for indirect ophthalmoscopic examinations. Stud Health Technol Inform. 2009;142:295-300.
  7. Singh P, Deuchler S, Schaefer H, Fassbender S, Kohnen T, Koch FHJ. Indirect ophthalmoscopy: training with conventional hardware versus the Eyesi indirect ophthalmoscopy simulator. Invest Ophthalmol Vis Sci. 2014;55:E-abstract 279.
Appendix 1

 

Dr. Anderson [handerson@central.uh.edu] is an Associate Professor at the University of Houston College of Optometry and Co-Coursemaster of the Clinic Practicum I & II Laboratory Sequence.

Dr. Gaume Giannoni is a Clinical Professor at the University of Houston College of Optometry and serves as Coursemaster of the Clinic Practicum III Laboratory and Co-Coursemaster of the Opt II Clinic.

Dr. Berntsen is an Associate Professor at the University of Houston College of Optometry and Co-Coursemaster of the Clinic Practicum I & II Laboratory Sequence.
