
Optometric Education

The Journal of the Association of Schools and Colleges of Optometry

Optometric Education: Volume 42 Number 3 (Summer 2017)

Communicating Educational Objectives
in an Optometry Course

Lawrence R. Stark, B.App.Sc. (Optom.)(Hons), PhD

 

Abstract

This study investigates how to communicate written course objectives effectively within an optometry course. An action research approach was used to study how students use and interpret behavioral objectives, and how they use course materials. Students use objectives in a wide variety of ways, some of which are consistent with past studies and with the cognitive mediation paradigm. Objectives and self-tutorials made learning easier and more efficient and provided appropriate expectations of examination questions.

Key Words: behavioral objectives, course objectives, cognitive processes, cognitive mediation paradigm, formative evaluation

 

Background

Educational context

The purpose of this study is to investigate how to communicate written course objectives within the course Visual Optics so students can be informed effectively about what they should learn. This course is offered in the first year of the four-year optometry program at the Southern California College of Optometry at Marshall B. Ketchum University (SCCO). Resources include a course handbook and self-tutorial exercises. The course handbook contains lecture and laboratory class notes set out in chapters. Each handbook chapter begins with an advance organizer, which is a type of introduction that aims to bridge the gap between what the students may already know and what they will learn.1, 2 This is followed by the behavioral objectives3, 4 listed in a nested form, which consists of a small number of general objectives, each with a subset of specific learning outcomes constructed using Mager’s3 behavioral-objective format (Gronlund,5 ch. 2). Each self-tutorial consists of the behavioral objectives set out as headings, with relevant questions listed under each heading, and a list of answers to provide feedback to the student. My reflections on the current implementation led to a set of 10 inquiries, six of which are presented in the current paper.6

Inquiries

Inquiry 1. How do students actually use objectives?

Despite the large volume of papers published on the topic of behavioral objectives, surprisingly few authors have considered whether students actually use the supplied objectives.7–11 Unfortunately, the conclusions of those studies differ widely. Several authors have studied the specific strategies of those students who do make some use of supplied objectives (Table 1).9, 12, 13 Of these, only Mast et al.’s study involved a genuine educational context; that is, they studied actual student behaviors in a real educational program rather than having subjects (or students) participate in an experimental situation designed by the investigators.13 In addition to these observational studies, Jiang and Elen11 hypothesized a three-part cognitive mediation paradigm, in which the student (1) interprets the objective, (2) uses the objective for goal-directed learning, and (3) self-tests to the objective. In relation to the current course, it would be useful to know how students use the objectives (if at all), and whether the students’ strategies match those of previous studies (Table 1) and the cognitive mediation paradigm.11

Inquiry 2. Do students have previous experience in the use of objectives?

My literature search uncovered no direct studies of the effect of prior experience with objectives on students’ current use of objectives. In relation to the current course, students who have used behavioral objectives in the past might perform better with objectives than students who lack this experience.

Inquiry 3. Are there ways to present objectives to enhance learning?

The current handbook has the behavioral objectives placed near the start of each chapter. It is natural to question whether other placements could be more helpful for students. The literature indicates that interspersing objectives within the passage before each paragraph leads to significantly higher test performance than other single placement alternatives.14, 15 In addition, Kaplan found greater learning when objectives were placed before and after a passage than when they were placed in either location alone.16 Contrary to expectations, a larger number of objectives is not a deterrent to students’ use of objectives,13 nor does it have a significant effect on learning.17

Inquiry 4. Does completion of questions promote learning?

Several studies show an important effect of practice with feedback on test performance in objectives-based curricula,11, 18–22 consistent with the self-testing component of the cognitive mediation paradigm.11 These beneficial effects of practice, unfortunately, are reduced greatly when students are required to transfer knowledge to unfamiliar situations.22 In relation to the current course, it would be helpful to know if completion of the self-tutorials provides appropriate opportunities for practice and feedback.

Inquiry 5. Do objectives increase the ease and efficiency of discovering what should be learned?

Rushin and Baller calculated study efficiency as the ratio of test performance to study time, in points per hour.23 Using these measures, undergraduate students provided with objectives were significantly more efficient than those without objectives. In Mast et al.’s study,13 medical students stated that shortage of time was a reason for using objectives, and that objectives improved the efficiency of their study time. In contrast, two other studies did not find a replicable effect of objectives on students’ reports of knowing what they should learn.18, 19

Inquiry 6. Do exam questions meet students’ prior expectations of learning?

Little is known as to whether objectives provide students with accurate expectations of test content. Medical students reported using objectives less when they found that the objectives were not being tested.13 Since providing students with appropriate expectations is cited as an important reason for using behavioral objectives,24 it was important to know if the objectives in the current course were assisting students in this respect.

Methods

A survey was designed to address the six inquiries of this study. The aims of this survey were:

  1. To determine how students use objectives in the course, and whether published strategies are representative of actual uses (inquiry 1). Eighteen strategies from three papers9, 12, 13 were presented as Likert items (Table 1). Participants were also asked to contribute their own strategies in an open-ended question.
  2. To document the level of previous experience with objectives (inquiry 2). Participants were asked to estimate the percentage of previous courses containing overt behavioral and non-behavioral objectives. They did this for: the optometry program to date; their undergraduate program; their time at high school; and other degree or certificate programs, if applicable. Participants were provided with definitions and examples of behavioral and non-behavioral objectives. An objective was considered behavioral if it included an observable behavior describing what the student should be able to do, and included the particular content on which the student was to act (e.g., to do something with facts, concepts, procedures or instruments).3
  3. To determine students’ attitudes to the placement of objectives within each chapter (inquiry 3). Participants’ preferences for objectives placed before, within or after the text were assessed with a multiple-choice question.
  4. To determine how students use the self-tutorials, and to elicit their opinions on the quality of feedback in those tutorials (inquiry 4). Participants were asked to rate their level of use of self-tutorials on a Likert scale. Two open-ended questions asked participants to describe how they used the self-tutorials, and to comment on the quality of the feedback.
  5. To determine whether objectives increased the ease and efficiency of discovering what should be learned, and whether students’ prior expectations of what to learn were consistent with the tests (inquiries 5 and 6). The ease and efficiency of discovering what should be learned were assessed with Likert scales. Participants were asked in open-ended questions if any objectives had hindered their study, if test content agreed with their prior expectations, and to provide examples of test questions of an unanticipated type.

The students were asked to take part in this survey after completion of the course. Two research assistants made brief recruiting presentations to the class, sent e-mail invitations to participate in the study, and personally approached students. They mailed individually addressed survey copies to students who expressed interest in the survey. Participants were allowed to take as much time as needed to complete the survey, and they returned the completed surveys by internal institutional mail at no cost. The research assistants sent reminders to participants to return completed surveys.

Where Likert scales were used, they were of the form strongly agree (SA), agree (A), neither agree nor disagree (N), disagree (D), and strongly disagree (SD). The study was designed to meet ethical considerations in educational action research.25, 26 Informed consent was obtained from each participant. For anonymity, the class year was not included in this report. The SCCO Institutional Review Board determined the study to be “exempt” from review.

Results

Twenty-two participants completed the survey, a response rate of 21%. The replies to all open-ended questions were coded to a smaller number of concepts using content analysis.6

How do students actually use objectives?

Participants used a Likert scale to rate their level of agreement with each of 18 published strategies for the use of objectives9, 12, 13 (Table 1). The binomial test was used to discover significant levels of agreement or disagreement with each strategy. Post-hoc power estimates were made.27 For a two-tailed test with α = 0.05, these tests had 80% power to detect a change of ± 31% away from the null hypothesis of 50% of participants agreeing and 50% disagreeing. Thus, the binomial test was well-powered to detect strong levels of participant agreement and disagreement. The Bonferroni-corrected significance level (α) of 0.002778 (that is, 0.05/18) was also used to control the family-wise error rate.
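The testing procedure described above can be sketched as follows. Only the two-tailed exact binomial test against a null of 50% agreement and the Bonferroni threshold of 0.05/18 come from the text; the strategy count in the example is hypothetical, not a result from this study.

```python
from math import comb

def binomial_two_sided_p(k, n, p0=0.5):
    """Exact two-tailed binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k under H0."""
    pmf = [comb(n, i) * p0**i * (1 - p0) ** (n - i) for i in range(n + 1)]
    return sum(pr for pr in pmf if pr <= pmf[k] + 1e-12)

# Bonferroni-corrected threshold for 18 simultaneous strategy tests
alpha = 0.05 / 18  # ≈ 0.002778

# Hypothetical example: 20 of 22 participants agree with a strategy
p = binomial_two_sided_p(20, 22)
print(f"p = {p:.6f}, significant: {p < alpha}")
```

Agreement this strong falls well below the corrected threshold, which is consistent with the paper's point that the test is powered only to detect pronounced departures from a 50/50 split.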

The participating cohort reported significant levels of agreement and disagreement with various strategies, and these results are summarized in Table 1. Participants were then asked, “How often did you use the objectives in Visual Optics?” Nineteen of 22 used the objectives often or very often, and the remaining three used them occasionally.

Twenty-one participants described how they used objectives in the course. Twelve participants provided reasonably detailed linear accounts of the steps they usually followed, eight provided possibly incomplete narratives that did not contain multiple steps, and one described a non-linear approach in which the objectives and self-tutorial were used to create a personal study guide.

In the group of 12 participants who described linear processes, only four described the three parts of the cognitive mediation paradigm in order.11 For example, here is participant 4’s response (with the three steps annotated): “I would read the objectives [Step 1, interpretation of objectives], then read all of the material (text) and highlight the information pertaining to the objectives [Step 2, goal-directed learning]. After I would read group study notes & questions that quizzed me on my study/reading [Step 3, self-test to the objective].” A diversity of linear, stepped approaches was found with the remaining eight participants.

The responses of all 21 participants were then read to identify the presence or absence of parts of the cognitive mediation paradigm,11 regardless of order. Ten participants described activities consistent with interpretation of the objectives, 12 described goal-directed learning, and 11 reported self-testing to the objectives.

Responses were read to identify other characteristics of students’ uses of objectives. First, with regard to when the objectives were used, five participants commenced their descriptions as they were approaching a test. Second, some participants gave explicit reasons for their use of objectives. These reasons were classified as orientating oneself to the topic (two participants), goal-directed learning (one), self-testing (three), reinforcement (one), and test preparation (two). Finally, the relationships between the participants and other students in the class were assessed. Most participants (19 of 21) did not mention anyone else present with them. Only one participant related how he or she used the objectives to quiz classmates during group study, and another noted that the instructor covered the objectives during the lecture.

Do students have previous experience in the use of objectives?

Participants generally had extensive experience with behavioral and non-behavioral course objectives (Table 2).

What are students’ attitudes to the placement of objectives within each chapter?

Twenty-two participants completed this question and, of these, two expressed split preferences, which were distributed to the respective categories as half-scores. Fifteen preferred objectives placed only at the start of the chapter. Objectives placed at the end of the chapter and objectives placed within the chapter each received a score of two and a half. Two participants preferred objectives placed at multiple locations.

How do students use self-tutorials and what are their opinions on the quality of feedback?

When participants were asked whether they completed the self-tutorials, the Likert scale responses (SA, A, N, D, SD) were (14, 5, 2, 1, 0). Twenty-one participants described how they used the self-tutorials. Six read or studied the self-tutorials, one read the self-tutorials to provide a focus for studying, 10 answered the questions, two checked their answers against the text, five used the self-tutorials in self-testing, one memorized the questions and answers, four shared answers, and two read others’ answers.

Three participants emphasized that they strived to produce detailed answers and even extra annotations so that the product would be a comprehensive study tool. Four participants used the self-tutorials as study guides or as the basis for creating study guides, outlines and flashcards. These approaches are interesting for their creativity. Participant 3 did not create a study guide but noted, “the self-tutorials were the bulk of my studying for the course.”

When participants were asked whether the answers in the self-tutorials provided an appropriate amount of feedback, the Likert scale responses (SA, A, N, D, SD) were (10, 9, 1, 2, 0). Participants were then asked to comment on the quality of feedback, and 21 answered the question. All but one participant did try to make use of supplied feedback. The most common comment was that the feedback in the self-tutorial answers was helpful or good (10 participants). Other positive comments were that feedback was extremely helpful (one), just right (one), without extraneous details (one), sufficient (one), and especially helpful for questions not directly answered by a fact located in the text (one). Negative comments were that the answers were sometimes too brief (two), did not explain how the answer was obtained (two), or were too general (three), with one participant requesting more direct answers or clues. Four participants mentioned the balance between the provision of answers and the need for practice, with one participant stating, “The answers were good because sometimes they were straightforward and others led you to the correct answer so you would understand it better.”

Do objectives increase the ease and efficiency of study and provide correct test expectations?

Twenty-two participants completed this survey section. When they were asked whether the objectives made it easier to know what they should learn, the Likert scale responses (SA, A, N, D, SD) were (9, 11, 2, 0, 0). When asked whether objectives made their study time more efficient, the Likert scale responses (SA, A, N, D, SD) were (13, 7, 2, 0, 0). When asked whether the test questions were consistent with their expectations from the course objectives, the Likert scale responses (SA, A, N, D, SD) were (13, 8, 1, 0, 0). Finally, when asked whether the test questions were consistent with their expectations from the self-tutorial exercises, the Likert scale responses (SA, A, N, D, SD) were (14, 6, 2, 0, 0). Participants were asked if aspects of objectives in the course had hindered their study. All 20 who answered this question replied ‘no’ (expressed in various ways).

Finally, when asked if any test questions were unanticipated, of the 20 participants who completed this question, five noted unexpected questions from the chapter on ocular aberrations, and one stated that the questions on laboratory class topics were unexpected and recommended more guidance to prepare for these.

Discussion

Inquiry 1. How do students actually use objectives?

Attention to objectives

Participants in this study were paying attention to the objectives, as evidenced by the strong disagreement with strategy 8 in Table 1. Most used them often or very often. Other estimates of students’ actual use of objectives vary widely.7–9, 11 Mast et al. identified several factors in the use of objectives by medical students,13 and many of these factors were probably favorable in the current course. Nevertheless, the wide variety in students’ attention to objectives7–11 suggests that the cognitive mediation paradigm11 should be modified to include attention to the objective as one of its components.

Student use of objectives

A wide variety of strategies and patterns were found in this study. Some of these are consistent with published strategies9, 12, 13 (Table 1), and some are consistent with the cognitive mediation paradigm.11 Participants in the current study expressed statistically significant agreement with only eight of 18 published strategies for the use of objectives (Table 1). There are no strong a priori reasons to expect that all students everywhere should use the same set of strategies, since some strategies may be more effective than others in particular courses, programs or educational settings. As an example of the diversity in educational settings, Duchastel studied female students of a Swiss college (equivalent to U.S. grades 11 and 12),9 and Bassett and Kibler12 and Mast et al.13 studied, respectively, communications students and medical students, at U.S. universities.

Past studies demonstrate a focusing effect of behavioral objectives: They increase instructor-specified learning, while suppressing incidental and self-directed learning.13, 16, 17, 28–32 Consistent with these findings, participants in the current study did not generally formulate their own objectives (Table 1, strategy 17). Nevertheless, the focusing effect was incomplete: They also studied parts of the course materials not directly covered by the objectives (Table 1, strategies 9 and 15). This suppression of self-directed learning may be a concern to some instructors. If so, it is possible to counter the effect through course activities such as goal-setting exercises,33–35 practical scenarios36 and assignments, where students are encouraged to set their own learning goals.

Consistent with earlier studies,9, 12, 13 participants in this study rarely mentioned others in their descriptions of how they used the objectives or self-tutorials. However, few researchers have studied how course objectives might influence students’ learning relationships and the ways in which students seek to help their peers.21, 37

Only a small proportion of participants reported a step-wise process for their use of objectives consistent with the order of the parts of the cognitive mediation paradigm.11 Nevertheless, about half of the participants reported one or more of the parts somewhere in their responses. These results are somewhat supportive of the cognitive mediation paradigm. Furthermore, the current content analysis suggests that the components of the cognitive mediation paradigm11 should be considered as parts rather than as ordered steps.

Inquiry 2. Do students have previous experience in the use of objectives?

This group of optometry students had extensive experience with both types of objectives by self-report (Table 2). Of concern in interpreting these data are the large standard deviations for the optometry program, where all students actually take the same courses (apart from a few students who have the option to add one or more elective courses). This suggests that participants’ interpretations of the supplied definitions for behavioral and non-behavioral objectives may have varied considerably. An alternative could be for the investigator to perform a detailed analysis of the text of actual course materials. In this way, the investigator could apply the definitions for behavioral and non-behavioral objectives carefully and precisely, rather than relying on students’ memories of their course syllabi.

Inquiry 3. Are there ways to present objectives to enhance learning?

A majority of participants preferred the current placement of objectives at the start of each chapter. In contrast, previous studies found best performance for objectives interspersed within the paragraphs of the text.14, 15 Perhaps student preferences are not optimal for performance. Possibly, students in this course preferred objectives at the start of the chapter because they had not been offered alternatives. A third possibility is that the objectives in this course required the student to do more than simply look for a fact in a nearby paragraph, which would otherwise have favored in-text placement. A fourth possibility is that the advance organizer at the start of each chapter made the list of objectives more accessible38 and possibly more useful.

Inquiry 4. Does completion of questions promote learning, irrespective of the ability to articulate an objective?

Practice and feedback are important components of learning39 and of objectives-based curricula.11, 18–22 It was satisfying to find that most participants were using the self-tutorials as a way to practice and to self-test, and that they tended not to rely simply on reading study group answers to objectives (Table 1, strategy 16). In addition, most students found the exam questions to be consistent with their anticipations from completing the self-tutorials.

Inquiry 5. Do objectives increase the ease and efficiency of discovering what should be learned?

Most participants stated that objectives made their study time more efficient, consistent with previous research.13, 23 They also found that the objectives made it easier for them to know what to learn, in contrast to two previous studies.18, 19 Participants could not recall any way in which objectives had hindered their study, and this is consistent with previous studies.13, 17, 40, 41

Inquiry 6. Do exam questions meet students’ prior expectations of learning?

Most participants found that the exam questions were consistent with their expectations from reading the course objectives and from completing the self-tutorials. This is a useful finding because providing correct expectations has been proposed as an important rationale for the use of objectives,24 and only one previous study of a genuine educational setting has asked students to rate whether exams met their expectations.13

Strengths and limitations of the study

Overall this study has added to the small amount of information on how students actually9, 12, 13 use supplied objectives in a genuine13 educational context, and it is the first to report students’ prior experience with behavioral and non-behavioral objectives.

Additionally, this study’s extensive content analysis of student uses of objectives suggests new directions for future research to address criticisms of earlier approaches.11, 40–42 One new direction would be to make detailed studies of individual students within particular educational settings. An ethnographic approach would be well-suited to this purpose.43, 44 This is important because with the exception of Mast et al.13 and of the current study, researchers have wholly ignored genuine courses within genuine educational settings. A second direction for new research, suggested by Duchastel and Merrill,40 would be to make factorial experimental studies of the complex interactions between objectives and other characteristics of the educational settings, with a view to theory development.

One limitation of this study is that although multiple, non-coercive recruitment methods were used, the resulting sample is subject to potential selection bias arising from non-response.45 This in turn limits generalization from the current findings to the whole class.

A limitation of the paper survey is that the written descriptions provided by the participants may have been incomplete, or the wording they used may have been difficult to interpret without further questioning. (For example, contrary to the cognitive mediation paradigm, no participants specifically mentioned ‘interpreting’ the objectives. Instead, they used words such as ‘reading’ and ‘reviewing’.) More complete accounts could be elicited using interview or observational methodologies.

Although the current findings from a visual optics course in an optometry program are not formally generalizable to other courses and programs, some instructors may nevertheless wish to make use of the findings in their courses. Mast et al.’s study of medical student education may also be helpful for its analysis of several factors in the use of objectives across a healthcare curriculum.13 For example, students in that medical program reported that objectives were more useful in the basic science track than in pre-clinical and clinical tracks.

A new model of students’ use of behavioral objectives

The descriptions of real students’ actual uses of objectives in this study are quite complicated. Therefore the results of the current study were combined with those from three other studies of students’ natural use of objectives,9, 12, 13 and with the cognitive mediation paradigm11 to provide a model that can be tested in future studies. The components of the new model are:

  1. Instructor, Course and Curriculum. When students enter a course that has behavioral objectives, various factors such as instructor emphasis,13 the quality (clarity) of the written objectives,13 and alignment of testing to the objectives13 can lead the students to adopt a:
  2. “Doing” Orientation. The behavioral objectives lead the student to ask, “What must I be able to do?” (as opposed, say, to “What must I remember?”). With this orientation, students demonstrate:
  3. Attention to the Objectives. This attentional focus is not complete, as students still engage separately in incidental learning (e.g., reading the handbook without objectives in mind). Once students attend to the objectives, the following four components of learning may be found (not necessarily in this order):
  4. Interpretation of the Objectives11–13
  5. Goal-Directed Learning11–13
  6. Self-Testing to the Objectives.11–13 This includes practice (e.g., completing self-tutorial exercises)
  7. Constructive Learning. An example is the creation of original, personal study guides.

Conclusions

Students used behavioral objectives in a wide variety of ways, some of which are consistent with past studies and with Jiang and Elen’s cognitive mediation paradigm.11 It is suggested that the cognitive mediation paradigm be expanded to a seven-component model that captures the main themes of students’ natural uses of objectives noted in the current study and in previous studies. Clearly written objectives that covered material emphasized in the course, along with self-tutorials, made learning easier and more efficient and provided appropriate expectations of examination questions.

Acknowledgments

This study was supported by an Educational Starter Grant to LRS from the Association of Schools and Colleges of Optometry and The Vision Care Institute, LLC, an affiliate of Johnson & Johnson Vision Care, Inc. Part of this study’s findings was presented at the Annual Meeting of the American Academy of Optometry, Denver, Nov. 12-15, 2014.

References

  1. Ausubel DP. Educational psychology: A cognitive view. New York: Holt, Rinehart and Winston; 1968.
  2. Ausubel DP. In defense of advance organizers: A reply to the critics. Rev Educ Res [Internet]. 1978 [cited 2013 Apr 27];48(2):251–7. Available from: https://www.jstor.org
  3. Mager RF. Preparing instructional objectives. Belmont: Fearon Publishers; 1962.
  4. Anderson LW, Krathwohl DR, Airasian PW, Cruikshank KA, Mayer RE, Pintrich PR, Raths J, Wittrock MC, editors. A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. Abridged ed. New York: Longman; 2001.
  5. Gronlund NE. Writing instructional objectives for teaching and assessment. 7th ed. Upper Saddle River: Pearson; 2004.
  6. McKernan J. Curriculum action research: A handbook of methods and resources for the reflective practitioner. London: Kogan Page; 1991.
  7. Duell OK. Effect of type of objective, level of test questions, and the judged importance of tested materials upon posttest performance. J Educ Psychol. 1974;66(2):225–32.
  8. Raghubir KP. The effects of prior knowledge of learning outcomes on student achievement and retention in science instruction. J Res Sci Teach [Internet]. 1979 [cited 2012 Apr 23];16(4):301–4. Available from: https://onlinelibrary.wiley.com/.
  9. Duchastel P. Learning objectives and the organization of prose. J Educ Psychol [Internet]. 1979 [cited 2013 Apr 23];71(1):100–6. Available from: https://www.ebscohost.com/.
  10. Tobias S, Duchastel PC. Behavioral objectives, sequence, and anxiety in CAI. Instr Sci [Internet]. 1974 [cited 2012 Apr 23];3(3):231–42. Available from: https://link.springer.com/.
  11. Jiang L, Elen J. Why do learning goals (not) work: A reexamination of the hypothesized effectiveness of learning goals based on students’ behaviour and cognitive processes. Educ Technol Res Dev [Internet]. 2011 [cited 2012 Apr 23];59(4):553–73. Available from: https://link.springer.com/.
  12. Bassett RE, Kibler RJ. Effect of training in the use of behavioral objectives on student achievement. J Exp Educ [Internet]. 1975 [cited 2012 Apr 23];44(2):12–6. Available from: https://www.jstor.org
  13. Mast TA, Silber DL, Williams RG, Evans GP. Medical student use of objectives in basic science and clinical instruction. J Med Educ [Internet]. 1980 [cited 2012 Apr 13];55(9):765–72. Available from: https://www.lww.com/.
  14. Aboderin AO, Thomas M. An evaluation of the influence of behavioral objectives on Nigerian students’ cognitive achievement in biology. Res Sci Technol Educ. 1996;14(2):193–204.
  15. Kaplan R. Effects of learning prose with part versus whole presentations of instructional objectives. J Educ Psychol [Internet]. 1974 [cited 2013 Apr 23];66(5):787–92. Available from: https://www.ebscohost.com/.
  16. Kaplan R. Effect of experience and subjects’ use of directions upon learning from prose. J Educ Psychol [Internet]. 1976 [cited 2013 Apr 23];68(6):717–24. Available from: https://www.ebscohost.com/.
  17. Klauer KJ. Intentional and incidental learning with instructional texts: A meta-analysis for 1970–1980. Am Educ Res J [Internet]. 1984 [cited 2012 Apr 23];21(2):323–39. Available from: https://www.sagepub.com
  18. Martin F, Klein JD, Sullivan H. The impact of instructional elements in computer-based instruction. Br J Educ Technol. 2007;38(4):623–36.
  19. Martin F, Klein J. Effects of objectives, practice, and review in multimedia instruction. J Educ Multimed Hypermedia [Internet]. 2008 [cited 2012 Apr 13];17(2):171–89. Available from: https://www.proquest.com/.
  20. Hannafin MJ. The effects of orienting activities, cueing, and practice on learning of computer-based instruction. J Educ Res (Wash DC) [Internet]. 1987 [cited 2012 May 7];81(1):48–53. Available from: https://www.jstor.org.
  21. Klein JD, Pridemore DR. Effects of orienting activities and practice on achievement, continuing motivation, and student behaviors in a cooperative learning environment. Educ Technol Res Dev [Internet]. 1994 [cited 2012 Apr 23];42(4):41–54. Available from: https://link.springer.com/.
  22. Phillips TL, Hannafin MJ, Tripp SD. The effects of practice and orienting activities on learning from interactive video. Educ Commun Technol. 1988;36(1):93–102.
  23. Rushin JW, Baller W. The effect of general objectives defined by behavioral objectives on achievement in a college zoology course. Coll Stud J. 1981;15(2):156–61.
  24. Deterline WA. The secrets we keep from students. In: Kapfer MB, editor. Behavioral objectives in curriculum development: Selected readings and bibliography. Englewood Cliffs: Educational Technology Publications; 1971. pp. 3–8.
  25. Noddings N. Fidelity in teaching, teacher education, and research for teaching. Harv Educ Rev. 1986;56(4):496–510.
  26. Zeni J. A guide to ethical issues and action research. Educ Action Res [Internet]. 1998 [cited 2012 May 9];6(1):9–19. Available from: https://taylorandfrancisgroup.com/.
  27. Kraemer HC, Thiemann S. How many subjects? Statistical power analysis in research. Newbury Park: Sage Publications; 1987.
  28. Rothkopf EZ, Kaplan R. Exploration of the effect of density and specificity of instructional objectives on learning from text. J Educ Psychol. 1972;63(4):295–302.
  29. Duchastel PC, Brown BR. Incidental and relevant learning with instructional objectives. J Educ Psychol. 1974;66(4):481–5.
  30. Jones MB. The effect of reading purposes on children’s reading achievement. J Read Behav [Internet]. 1976 [cited 2012 Apr 16];8(4):405–13. Available from: https://www.sagepub.com.
  31. Barker D, Hapkiewicz WG. The effects of behavioral objectives on relevant and incidental learning at two levels of Bloom’s taxonomy. J Educ Res (Wash DC). 1979;72(6):334–9.
  32. Petersen C, Glover JA, Ronning RR. An examination of three prose learning strategies on reading comprehension. J Gen Psychol [Internet]. 1980 [cited 2013 Apr 30];102(1):39–52. Available from: https://www.ebscohost.com/.
  33. Morgan M. Self-derived objectives in private study. J Educ Res (Wash DC). 1981;74(5):327–32.
  34. Dolcourt JL, Zuckerman G. Unanticipated learning outcomes associated with commitment to change in continuing medical education. J Contin Educ Health Prof. 2003;23(3):173–81.
  35. Manlove S, Lazonder AW, de Jong T. Trends and issues of regulative support use during inquiry learning: Patterns from three studies. Comput Hum Behav [Internet]. 2009 [cited 2013 Apr 17];25:795–803. Available from: https://www.sciencedirect.com/.
  36. Zumbach J, Reimann P. Enhancing learning from hypertext by inducing a goal orientation: Comparing different approaches. Instr Sci [Internet]. 2002 [cited 2013 Apr 17];30:243–67. Available from: https://link.springer.com/.
  37. Civikly JM. A case for humanizing behavioral objectives. Commun Educ. 1976;25(3):231–6.
  38. MacDonald-Ross M. Behavioral objectives—a critical review. Instr Sci [Internet]. 1973 [cited 2012 Apr 30];2:1–52. Available from: https://link.springer.com/.
  39. Popham WJ. Transformative assessment. Alexandria: Association for Supervision and Curriculum Development; 2008.
  40. Duchastel PC, Merrill PF. The effects of behavioral objectives on learning: A review of empirical studies. Rev Educ Res [Internet]. 1973 [cited 2012 Apr 23];43(1):53–69. Available from: https://www.sagepub.com.
  41. Melton RF. Resolution of conflicting claims concerning the effect of behavioral objectives on student learning. Rev Educ Res [Internet]. 1978 [cited 2012 Apr 23];48(2):291–302. Available from: https://www.sagepub.com.
  42. Biesta G. Why “what works” won’t work: Evidence-based practice and the democratic deficit in educational research. Educ Theory [Internet]. 2007 [cited 2013 Apr 30];57(1):1–22. Available from: https://onlinelibrary.wiley.com/.
  43. Noblit GW, Engel JD. The holistic injunction: An ideal and a moral imperative for qualitative research. In: Noblit GW, editor. Particularities: Collected essays on ethnography and education. New York: Peter Lang; 1999. pp. 53–60. (Counterpoints: Studies in the postmodern theory of education; vol. 44).
  44. Fetterman DM. Ethnography: Step by step. 2nd ed. Thousand Oaks: Sage Publications; 1998. (Applied social research methods series; vol. 17).
  45. Smith TMF. On the validity of inferences from non-random samples. J R Stat Soc A [Internet]. 1983 [cited 2015 Aug 6];146(4):394–403. Available from: https://www.jstor.org.

Dr. Stark [lstark@ketchum.edu] is an Associate Professor at the Southern California College of Optometry at Marshall B. Ketchum University. His educational research centers on aims in higher education and on course evaluation. His vision research centers on human ocular accommodation and on visual optics.