
Optometric Education

The Journal of the Association of Schools and Colleges of Optometry

Optometric Education: Volume 44 Number 3 (Summer 2019)

PEER REVIEWED

Interactive Multimedia Learning vs. Traditional Learning in Optometry: A Randomized Trial, B-scan Example

Elena Z. Biffi, OD, MS, FAAO, and Misty Woodbury, MA

Abstract

Interactive multimedia learning can potentially enhance procedural training in optometric education. This study compared the effectiveness of an interactive learning module vs. traditional learning in preparation for in-person B-scan procedural training. Investigators randomized 132 third-year optometry students at a single institution to receive either the newly developed multimedia module (intervention group, n = 65) or traditional preparation material (control group, n = 67). Results demonstrated higher learner satisfaction, greater self-reported knowledge and better performance on standardized multiple-choice testing (both before and after in-person training) with interactive multimedia learning. The findings highlight the need to apply learning science to the development of optometric instruction.

Key Words: optometry, education, B-scan ultrasound, interactive multimedia learning

Introduction

With the advancement of technology and the increased use of electronic devices, interactive multimedia learning has become a point of interest in medical education.1,2 Interactive multimedia learning is defined as online instruction that combines multimedia formats (text, video, audio, images) with activities that help learners apply, and receive feedback on, their understanding.3,4 Traditional learning, for the purposes of this study, is defined as content delivered in the format of a textbook and/or PowerPoint slides without an interactive assessment component.

Figure 1. Adapted from Multimedia Learning, 2nd edition, Mayer (2009).

Several studies have associated interactive multimedia online lectures with greater learning efficiency, problem-solving ability and satisfaction.5,6,7 Schneider et al. reported test scores that averaged 20% higher for students who used computer-based patient cases vs. students who used textbooks to prepare for testing on diagnoses in urology.8 Another study showed test scores improved as much as 46% for students who used a multimedia e-book vs. students who used a traditional PowerPoint lecture format for learning blood-cell morphology.9 Similarly, researchers at the University of Manchester found “a significant increase (p ≤ 0.01) in the mean examination scores” after multimedia online modules were introduced in an anatomy course for first-year optometry students.10 In a study by Issa et al., third-year medical students who used a multimedia lecture to study the topic of shock significantly outperformed students who used traditional PowerPoint lectures on measures of both immediate and delayed knowledge retention and transfer.11 Moreover, research shows that “blended learning environments” (those with an online component) not only improve learning outcomes but also are preferred by students.12

To improve information comprehension, Effective Use of Educational Technology in Medical Education (2007) from the Association of American Medical Colleges recommends the use of Mayer’s multimedia principles in designing effective educational material for students.13,14 Mayer’s cognitive theory of multimedia learning explains how learners process multimedia information and how to design learning materials to maximize learning gains. His theory builds on cognitive load theory, working memory theory and dual-processing theory (Table 1) to explain how learners process, select, organize and integrate words and images with prior knowledge (Figure 1).3,16 According to Mayer, when learners process multiple sources of visual information (e.g., text, photos, illustrations, video), their attention becomes “split,” adding to cognitive load and making it more difficult to process the information.17 Therefore, Mayer developed a framework for creating multimedia learning materials that helps learners process information more effectively. The framework integrates 12 key empirically based design principles (Table 2).3

The combination of multimedia learning and interactivity in the form of self-assessment has the potential to enhance the effectiveness of course preparation practices in optometry. Research shows that the incorporation of an interactive component further improves student learning and retention, and that students have better learning outcomes and course satisfaction when immediate feedback is given.4 Interactivity should include more than a learner navigating through a learning module or pointing and clicking hyperlinks and images. To be effective, interactivity must engage the learner in applying knowledge and offer ongoing feedback.4

Many colleges now incorporate online components in their traditional, face-to-face courses. Although a significant body of research on the effectiveness of interactive multimedia learning has been published, little has been reported with regard to clinical procedures, specifically instruction on optometric procedures. The purpose of this study was to design an interactive multimedia teaching module following Mayer’s multimedia principles (Tables 1 and 2) and test its effectiveness with a procedural optometric topic, B-scan ultrasound. The investigators proposed that using interactive multimedia modules to prepare for the procedural B-scan ultrasound topic, as opposed to traditional approaches such as reading a text or viewing a slide show, would result in:

  1. Better preparedness for the hands-on in-person training
  2. Improved pre- and post-lab session testing scores
  3. Improved self-estimation of fund of knowledge and competence after a procedural teaching course
  4. Higher learner satisfaction rates after completion of a procedural training course

Methods

Study duration and subject visits

This study was conducted in the summer 2015 semester at the New England College of Optometry (NECO). The subjects were NECO OD3 students (OD2017 graduating class) taking the Advanced Diagnostic Techniques (ADT) 1 course at study inception. Students in the first lab training session (summer 1) served as the control group; students in the second lab training session (summer 2) served as the intervention group. Randomization relied exclusively on session assignment; no other randomization parameters (e.g., prior knowledge of B-scan) were used. Because the summer 2 session takes place after the summer 1 session, this scheme ensured that the intervention group could not provide information about test questions or answers to the control group.

The only exclusion criterion for participation in the study was self-reported inability to use the interactive multimedia module due to physical or mental disability, as assessed by question 1 of an initial 10-question online survey (Appendix I). Prior familiarity with the B-scan topic was captured via testing, as described below, and adjusted for. Summary (not individual-level) information on the prevalence of attention/learning disorders and disabilities was obtained for comparison purposes, to confirm the comparability of the intervention and control groups.

Initially, all participants took the 10-question online survey (Appendix I), which was designed to quantify familiarity with the B-scan topic prior to exposure to any course material. All surveys were completed at least one week prior to study inception, as set in the study timeline (Appendix II). One week prior to the start of the B-scan lab training, the course preparation material was provided online, either as traditional reading material from a textbook chapter18 (control group) or as an online interactive multimedia module (intervention group). No additional equipment was necessary. Students completed the course preparation on their own schedule and in a location of their choosing (no dedicated facilities were required). Automated reports from the course management software platform used at NECO confirmed that all students in the study completed the lab preparation material independently. A pre-lab quiz and questionnaire were administered to each training group at the beginning of the B-scan lab training session. After the initial testing was complete, subjects underwent the in-person hands-on B-scan ultrasound lab portion of the ADT 1 course, conducted by the same instructor. A post-lab quiz and questionnaire were administered one week after the B-scan lab training, in multiple sessions, to accommodate students’ class schedules and other exams.

Testing materials

Online survey

All study participants completed a 10-question online survey (Appendix I), which included one self-screen question about ability to use the interactive multimedia module; one self-report question on familiarity with B-scan; and eight questions for quantifying B-scan topic familiarity (physics, indications, contraindications, types of B-scans, image interpretation) prior to exposure to any course material.

Pre-lab quiz/self-report questionnaire

Students took an initial pre-lab quiz and completed a self-report fund of knowledge/competence questionnaire at the beginning of their B-scan lab training session. The pre-lab quiz consisted of 25 multiple-choice questions (Appendix III). The tested topics included B-scan physics (two questions), indications (three questions), contraindications (one question), procedural knowledge (five questions) and image interpretation (14 questions). The questionnaire consisted of five self-report questions about knowledge/competence and satisfaction (Appendix IV).

Post-lab quiz/self-report questionnaire

One week after the B-scan lab, all students took a post-lab quiz and filled out a post-lab self-report fund of knowledge/competence questionnaire. The post-lab quiz consisted of the same 25 multiple-choice questions as the pre-lab quiz. To prevent confounding by test memorization, the question sequence and the answer coding were randomized in the post-lab quiz. The self-report fund of knowledge/competence and satisfaction questionnaire remained the same but was tailored to evaluation of the overall course (Appendix V).
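
As an illustration of this reshuffling step, the short Python sketch below randomizes both the question sequence and the answer positions of a quiz. It is a hypothetical example (the study does not specify its quiz platform), and the field names are invented for the illustration.

```python
import random

def reshuffle_quiz(questions, seed=None):
    """Return a quiz with randomized question order and answer coding.

    questions: list of dicts with hypothetical keys 'stem',
    'options' (list of strings) and 'answer' (the correct option's text).
    """
    rng = random.Random(seed)
    shuffled = []
    for q in rng.sample(questions, k=len(questions)):  # randomize question sequence
        options = q["options"][:]
        rng.shuffle(options)  # randomize which position holds the correct answer
        shuffled.append({"stem": q["stem"],
                         "options": options,
                         "answer_index": options.index(q["answer"])})
    return shuffled
```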

Interactive multimedia module

Traditional course preparation material (control group) consisted of reading material from a book chapter, scanned and made available online. The interactive multimedia module (intervention group) followed the design principles of Mayer’s cognitive theory of multimedia learning (Table 2).3 The module was created with an application that delivered content in a mix of media: large-font text with unrelated material removed; images with cues (highlights, arrows, etc.); a conversational-style voice presentation with superimposed contiguous graphs and images; and videos of the B-scan procedure with narration highlighting the most important points (Figure 2). Brief interactive assessments were included at the end of each topic (every 5-7 minutes of projected presentation time).

Figure 2. Screenshot from the interactive multimedia module showing content delivered in a mix of media: large-font text, a B-scan image with cues (retinal location) and narration. A conversational-sounding voice accompanied superimposed contiguous text and images.

All content was presented in learner-paced segments, and interactivity was incorporated in the form of 2- to 3-question assessments that tested the learner’s ability to apply the new knowledge by answering various types of questions. For incorrect answers, remediation (links to the related module content for review) was offered. Students had two opportunities to answer each question correctly. After a second incorrect answer, in-depth explanations of correct and incorrect answers were provided. Students answering the questions correctly also had access to all in-depth explanations for further review.
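
To make this two-attempt assessment flow concrete, the sketch below models the behavior in Python. It is a hypothetical illustration of the logic, not code from the module itself (which was built with an application, not hand-coded), and the field names are invented for the example.

```python
def grade_attempt(question, chosen_index, attempt):
    """Illustrative sketch of the module's two-attempt feedback behavior.

    question: dict with hypothetical keys 'answer_index' (int),
    'review_link' (str) and 'explanation' (str).
    """
    if chosen_index == question["answer_index"]:
        # Correct answers still unlock the in-depth explanations for review.
        return {"correct": True, "explanation": question["explanation"]}
    if attempt == 1:
        # First incorrect answer: remediation link back to the related content.
        return {"correct": False, "remediation": question["review_link"]}
    # Second incorrect answer: in-depth explanation of correct and incorrect options.
    return {"correct": False, "explanation": question["explanation"]}
```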

Statistical methods

Data collection was completed by the end of August 2015. Data were entered into Excel and analyzed with univariable testing procedures (chi-square test, Fisher’s exact test, Student’s t-test and Wilcoxon rank-sum test). Multivariable analyses utilized linear regression models to identify associations between group assignment and the outcomes of interest. Study endpoints were: 1) the difference between the intervention and control groups in pre- and post-lab training test scores; 2) the difference between the groups in pre- and post-lab self-perceived knowledge and competence; and 3) the difference between the groups in satisfaction with the lab teaching process.
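
For readers who want to reproduce this style of analysis, the sketch below shows how the named tests and the adjusted regression could be run in Python with scipy and statsmodels. The file and column names are hypothetical; the study’s own analysis used Excel for data entry, and its exact modeling choices are not specified beyond what is described above.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Hypothetical data file: one row per student with group assignment,
# covariates and quiz scores (column names are illustrative, not the study's).
df = pd.read_csv("bscan_study.csv")  # group, gender, gpa, familiarity, pre_score, post_score

ctrl = df[df["group"] == "control"]
intv = df[df["group"] == "intervention"]

# Univariable comparisons: Wilcoxon rank-sum for quiz scores,
# chi-square for categorical characteristics such as gender.
print(stats.ranksums(intv["pre_score"], ctrl["pre_score"]))
print(stats.chi2_contingency(pd.crosstab(df["group"], df["gender"])))

# Fisher's exact test is the 2x2 alternative when expected counts are small.
print(stats.fisher_exact(pd.crosstab(df["group"], df["gender"])))

# Multivariable analysis: linear regression of post-lab score on group,
# adjusting for gender, GPA and self-reported topic familiarity.
model = smf.ols("post_score ~ group + gender + gpa + familiarity", data=df).fit()
print(model.summary())
```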

The NECO Institutional Review Board approved the study.

Results

A total of 132 students were enrolled in the study. They were randomized by session membership, as previously described, into an intervention group (n = 65, 49.2%) that received the interactive multimedia module and a control group (n = 67, 50.8%) that received traditional review materials. No students were excluded from the study on the basis of self-reported inability to utilize the interactive multimedia module. Table 3 shows the baseline demographic and academic data. Intervention group students had a lower average GPA than control group students (2.68 vs. 3.00, p = 0.039) but did not differ in self-reported topic familiarity (p = 0.30).

In univariable analyses, exposure to the interactive multimedia module resulted in significantly improved performance on standardized testing prior to the in-person lab training. The pre-lab quiz median score was 17/25 (68%) in the intervention group vs. 12/25 (48%) in the control group (p < 0.001), a 20-percentage-point advantage in material transfer for the intervention group. The gap between the two groups narrowed notably after the in-person lab training: the post-lab quiz median score was 20/25 (80%) in the intervention group vs. 18/25 (72%) in the control group (p = 0.001). Even after the in-person lab training, however, the intervention group outperformed the control group, with an 8-percentage-point advantage in knowledge retention (Figure 3 and Table 4). Multivariable analyses adjusted for student characteristics (gender, GPA, self-reported B-scan topic familiarity) confirmed statistically significant differences in testing scores both before and after in-person lab training, as shown in Table 4.

According to univariable analyses, students in the intervention group were more satisfied with the course preparation material both before and after the lab (both p < 0.001) and felt that both the preparation material and the in-person training covered the topic adequately (both p < 0.01). The largest difference between the groups, seen on the pre-lab questionnaire, was on the question of whether the B-scan preparation material covered all course-required topics adequately and in depth: the control group reported an average satisfaction rating of 3/5, while the intervention group average was 5/5 (p < 0.001). Also notable among the univariable comparisons, before the in-person lab the intervention group reported a higher knowledge level and greater confidence in its ability to perform the B-scan independently (both p < 0.025). Detailed results of multivariable analyses adjusted for student characteristics (gender, GPA, self-reported B-scan topic familiarity) are presented in Table 4. After adjustment, students exposed to the interactive multimedia module reported greater overall satisfaction with the course (both before and after in-person training) and were more confident in their self-reported knowledge and their ability to perform the procedure independently prior to the in-person training.

Discussion

This study compared two learning methods used in preparation for in-person hands-on lab training in B-scan ultrasound, a clinical procedure commonly used by optometrists and ophthalmologists. As course preparation material for the in-person lab training, the control group received conventional preparation material consisting of reading material from a textbook chapter.18 The intervention group received an online interactive multimedia module that followed Mayer’s multimedia principles (Table 1 and Table 2)3 and was based on the same information from the same book chapter given to the control group. The main study outcomes focused on comparisons of the groups’ standardized test performance, self-reported topic proficiency/familiarity and learner satisfaction. The results revealed that use of the interactive multimedia module produced a 20-percentage-point improvement in material transfer. In addition, knowledge retention was 8 percentage points higher for the intervention group than for the control group (80% vs. 72%).

Results also showed that hands-on in-person lab training significantly improved material understanding for both the intervention and control groups. Furthermore, analyses identified a narrowing of the gap in standardized testing performance between the control and intervention groups after in-person lab training. However, students in the intervention group continued to outperform students in the control group to a statistically significant degree. The two student groups enrolled in the study were well-matched for total number of participants, gender and initial self-reported B-scan topic familiarity. However, the intervention group had a lower average GPA than the control group. While this represents a limitation of the matching procedures, it does not increase the risk of the study generating false-positive results; if anything, a lower-GPA intervention group would be expected to bias the comparison against the intervention.

Because the intervention group’s lower average GPA likely led to underestimation of the benefit of the interactive multimedia module, the study findings underscore the potential long-term benefits of the multimedia approach. Furthermore, the data demonstrated that intervention group students were more satisfied with material coverage than control group students even though both groups received the same information prior to the hands-on training. These additional benefits (besides improved transfer of teaching material) represent a notable strength of the interactive multimedia approach, as they may have shaped learners’ approach to the in-person lab training. It is worth noting that intervention group students were more satisfied with the material provided while also demonstrating superior self-perception of knowledge both before and after the hands-on in-person training. It may therefore be concluded that the interactive multimedia learning strategy gave students a clearer assessment of their familiarity with the topic without generating undue distress or anxiety about future teaching activities (as reflected by greater satisfaction).

The study findings support Mayer’s theory that interactive multimedia learning materials enhance the learning process by providing multiple representations of content as well as consistent, formative feedback.3,4 Additionally, the findings are consistent with a previous report of improved test scores for students who used computer-based patient cases rather than textbooks for diagnoses in urology.8 Moreover, the interactive component of the module, which allowed learners to test their understanding without negative consequences, supports the idea of a learner-centered approach for improving learning outcomes and learner satisfaction.4

Many instructors are comfortable creating learning materials that incorporate audio and visual components. However, combining many different elements can prove ineffective, and even detrimental to learning, when designs focus solely on incorporating the technology rather than being grounded in learning research and theory. As such, the development of effective online learning material that includes interactive multimedia may require a multidisciplinary team and can be time- and resource-intensive.19,20

The study reported here has some limitations. For example, it was not possible to measure the time each student spent preparing for the B-scan hands-on lab training, or whether time spent on preparation influenced overall knowledge acquisition. The relatively large number of participants likely outweighed individual deviations from average preparation practices, lessening the impact of this limitation. Moreover, the randomized nature of the study design implies that unaccounted-for effects in the intervention group (e.g., greater time spent in preparation) are likely to be related to the intervention itself (e.g., more time spent in preparation due to more accurate self-perception of familiarity) rather than to chance alone. Better capture of the time allocated to self-preparation would have allowed a better comparison of the intervention and control groups.

Another potential limitation of the study is the timing of study procedures. The control group underwent preparation and in-person training first (session 1 in the summer period), followed by the intervention group (session 2 in the summer period). It is therefore conceivable that test questions could be verbally shared by students in the control group with students in the intervention group, resulting in systematic bias in testing outcomes. Therefore, simultaneous test preparation and in-person training for the two groups would be preferable in future studies. Additional studies looking at material retention and competence maintenance long-term (weeks to months) after in-person training are warranted to fully explore the potential benefits of interactive multimedia learning in optometric education.

Conclusion

This study demonstrates the effectiveness of using an interactive multimedia module to prepare for in-person training in a clinical procedure commonly used by optometrists and ophthalmologists, B-scan ultrasound. The use of the interactive multimedia module resulted in a 20-percentage-point improvement in material transfer, an 8-percentage-point improvement in knowledge retention and significantly higher satisfaction with the course preparation.

Further, this study provides a clear method for creating tangible learning gains through the integration of educational technology, and it supports current learning theory. Design practices based on the work of Mayer are recommended in order to avoid cognitive overload,3 as is the incorporation of an interactive component that helps learners apply, and receive feedback on, their understanding.

References

  1. George PP, Papachristou N, Belisario JM, et al. Online eLearning for undergraduates in health professions: a systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health. 2014 Jun;4(1):010406.
  2. Choules AP. The use of elearning in medical education: a review of the current situation. Postgrad Med J. 2007 Apr;83(978):212-6.
  3. Mayer RE. Multimedia Learning. 2nd ed. Cambridge University Press; 2009.
  4. Cairncross S, Mannion M. Interactive multimedia and learning: realizing the benefits. Innovations in Education and Teaching International. 2001;38(2):156-164.
  5. Davids MR, Chikte UM, Halperin ML. Development and evaluation of a multimedia e-learning resource for electrolyte and acid-base disorders. Adv Physiol Educ. 2011 Sep;35(3):295-306.
  6. Duncan SF, Hendawi TK, Sperling J, Kakinoki R, Hartsock L. iPhone and iPad use in orthopedic surgery. Ochsner J. 2015 Spring;15(1):52-7.
  7. Lee LA, Chao YP, Huang CG, et al. Cognitive style and mobile e-learning in emergent otorhinolaryngology-head and neck surgery disorders for millennial undergraduate medical students: randomized controlled trial. J Med Internet Res. 2018 Feb 13;20(2):e56.
  8. Schneider AT, Albers P, Müller-Mattheis V. E-learning in urology: implementation of the learning and teaching platform CASUS® – Do virtual patients lead to improved learning outcomes? A randomized study among students. Urol Int. 2015;94(4):412-8.
  9. Hsiao CC, Tiao M, Chen CC. Using interactive multimedia e-books for learning blood cell morphology in pediatric hematology. BMC Med Educ. 2016 Nov 14;16(1):290.
  10. Choudhury B, Gouldsborough I, Gabriel S. Use of interactive sessions and e-learning in teaching anatomy to first-year optometry students. Anat Sci Educ. 2010 Jan-Feb;3(1):39-45.
  11. Issa N, Mayer RE, Schuller M, Wang E, Shapiro M, DaRosa DA. Teaching for understanding in medical classrooms using multimedia design principles. Med Educ. 2013;47(4):388-96.
  12. Kiviniemi MT. Effects of a blended learning approach on student outcomes in a graduate-level public health course. BMC Med Educ. 2014 March 11;14:47.
  13. DiGiacinto D. Using multimedia effectively in the teaching-learning process. J Allied Health. 2007 Fall;36(3):176-9.
  14. Effective Use of Educational Technology in Medical Education. Colloquium on Educational Technology: Recommendations and Guidelines for Medical Educators. Association of American Medical Colleges. March 2007; 9.
  15. Clark JM, Paivio A. Dual coding theory and education. Educ Psychol Rev. 1991 Sept;3(3):149-210.
  16. Chandler P, Sweller J. Cognitive load theory and the format of instruction. Cogn Instr. 1991;8(4):293-332.
  17. Mayer RE, Moreno R. A split-attention effect in multimedia learning: evidence for dual processing systems in working memory. Journal of Educational Psychology. 1998;90(2):312-320.
  18. Casser L, Fingeret M, Woodcome HT. Atlas of Primary Eyecare Procedures. Stamford: Appleton & Lange, 1997: Section 66 B-Scan Ultrasound, 268-271.
  19. Akl EA, Mustafa R, Slomka T, Alawneh A, Vedavalli A, Schünemann HJ. An educational game for teaching clinical practice guidelines to internal medicine residents: development, feasibility and acceptability. BMC Med Educ. 2008 Nov 18;8:50.
  20. Topaz M, Rao A, Masterson Creber R, Bowles KH. Educating clinicians on new elements incorporated into the electronic health record: theories, evidence, and one educational project. Comput Inform Nurs. 2013;31(8):375-9.

Appendix I.

Appendix II.

Appendix III.

Appendix IV.

Appendix V.

Dr. Biffi [BiffiE@neco.edu] is an Assistant Professor of optometry at New England College of Optometry (NECO) and an Attending Optometrist at NECO Center for Eye Care/South Boston Community Health Center Eye Clinic. She is the instructor of record for the Clinical Ocular Imaging Topics and Advanced Procedures (formerly known as Advanced Diagnostic Techniques) courses. Both courses are offered in the third year of optometry education at NECO.

Misty Woodbury is the Director of the Academic Technology and Innovation Group at Emmanuel College (Boston). Her work includes creating and facilitating faculty development workshops, developing and reviewing online courses, and implementing innovative uses of academic technology and pedagogical practices throughout the college.
