
University of Groningen

Knowledge and skills acquisition in medical students

Cecilio Fernandes, Dario

IMPORTANT NOTE: You are advised to consult the publisher's version (publisher's PDF) if you wish to cite from it. Please check the document version below.

Document Version

Publisher's PDF, also known as Version of record

Publication date: 2018

Link to publication in University of Groningen/UMCG research database

Citation for published version (APA):

Cecilio Fernandes, D. (2018). Knowledge and skills acquisition in medical students: exploring aspects of the curriculum. University of Groningen.

Copyright

Other than for strictly personal use, it is not permitted to download or to forward/distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license (like Creative Commons).

Take-down policy

If you believe that this document breaches copyright please contact us providing details, and we will remove access to the work immediately and investigate your claim.

Downloaded from the University of Groningen/UMCG research database (Pure): http://www.rug.nl/research/portal. For technical reasons the number of authors shown on this cover page is limited to 10 maximum.


Knowledge and skills acquisition in medical students: Exploring aspects of the curriculum


Dario Cecilio Fernandes

Knowledge and skills acquisition in medical students: exploring aspects of the curriculum

ISBN (print) 978-94-034-0892-7 ISBN (digital) 978-94-034-0891-0

Dissertation, University of Groningen, the Netherlands.

The research in this dissertation was partly funded by CAPES – Brazilian Federal Agency for Support and Evaluation of Graduate Education (grant 9568-13-1) awarded to Dario Cecilio-Fernandes and University Medical Center Groningen (UMCG). The studies presented in this thesis were carried out in the context of research institute SHARE.

Financial support for the publication of this dissertation was kindly provided by research institute SHARE, University Medical Center Groningen, and University of Groningen.

Cover, layout and print production: Off Page, Amsterdam

© 2018 Dario Cecilio Fernandes

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means without written permission of the author or, when appropriate, of the copyright-owning journals for previously published chapters.


Knowledge and skills acquisition in medical students: Exploring aspects of the curriculum

PhD thesis

to obtain the degree of PhD at the University of Groningen

on the authority of the Rector Magnificus Prof. E. Sterken

and in accordance with the decision by the College of Deans.

This thesis will be defended in public on Wednesday 19 September 2018 at 09.00 hours

by 

Dario Cecilio Fernandes

born on 31 May 1984 in São Carlos/SP, Brazil


supervisor
Prof. A.D.C. Jaarsma

co-supervisors
Dr. F. Cnossen
Dr. R.A. Tio

assessment committee
Prof. A. Freeman
Prof. N.A. Bos
Prof. J.P.E.N. Pierie


table of contents

Chapter 1   General Introduction   7
Chapter 2   The Impact of Curriculum Design in the Acquisition of Knowledge of Oncology: Comparison Among Four Medical Schools   25
Chapter 3   The Impact of Massed and Spaced-Out Curriculum in Oncology Knowledge Acquisition   35
Chapter 4   Development of cognitive processing and judgments of knowledge in medical students: analysis of progress test results   45
Chapter 5   Comparison of formula and number-right scoring in undergraduate medical training: a Rasch model analysis   59
Chapter 6   Avoiding surgical skill decay: a systematic review on the spacing of training sessions   79
Chapter 7   The effect of expert and augmented feedback on the acquisition and retention of a complex medical skill   97
Chapter 8   General Discussion   115
Chapter 9   Summary   129
Samenvatting   134
Acknowledgements   137
Research Institute SHARE   140


GENERAL INTRODUCTION


The aim of medical curricula is to encourage and support students to develop their knowledge and skills until they are competent to practice medicine. Many choices need to be made when designing a medical curriculum to ensure that students acquire knowledge and skills to the expected level of proficiency at graduation. These choices should reflect aspects of the curriculum that stimulate or facilitate students’ learning processes. In this thesis, we investigated the relation between various aspects of the curriculum and students’ knowledge and technical skills acquisition. In addition, we explored the use of a progress test to evaluate the relation between aspects of the curriculum and students’ knowledge development.

curriculum and students’ knowledge development

There are two main approaches to investigate the effects of curriculum design and interventions on students’ knowledge development: laboratory studies and curriculum-level studies. Laboratory studies are controlled experimental studies in which a control group receives the usual education, and curriculum-level studies are studies comparing different curricula or tracks at a curriculum level. Unfortunately, there seems to be a mismatch between the outcomes of both types of research. Laboratory studies, for example, have shown that a problem-based curriculum leads to better learning outcomes,1-3 whereas curriculum-level studies revealed little or no difference in learning outcomes.4-7 Although students who followed an active-learning curriculum were more engaged in their learning, graduated faster and more often than their peers in a more lecture-based curriculum,8-10 they seemed to have similar knowledge levels.4-6 The differences in results of laboratory and curriculum-level research show how difficult it is to translate laboratory research findings to practice.

In the medical education literature, there is disagreement about the cause of the mismatch between the results of both kinds of curriculum effect studies. In a recent editorial, Geoff Norman discussed the mismatch between laboratory studies and the real world of the medical curriculum.11 On the one hand, the mismatch may be due to difficulties in conducting experimental studies in medical education,12 but on the other hand, Albanese and Mitchell argue that curriculum-level research is often weak in its design.4

There are arguments for and against the use of laboratory and curriculum-level studies when investigating curriculum effects on medical students’ learning outcomes. Norman and Schmidt criticized the use of laboratory studies to investigate curriculum interventions, arguing that (1) randomization, especially blinding the participants to their condition, is almost impossible in educational research; (2) controlling variables is very difficult because of the many variables involved in a curriculum; and (3) it is difficult to choose adequate student outcome measurements because these may vary drastically across medical curricula.13 As a corollary, translating laboratory findings to a real curriculum is also difficult, because important factors that may determine the success or failure of the intervention in the real curriculum may have been omitted.12 Controlling students’ access to the study material, for example, may decrease the replicability of the study in a real curriculum where students’ access to the study material is not controlled.

Studies at curriculum level are often hampered by the lack of a proper control group. Often, historical control data, another track within the same school, or a track from a different school have been used for the comparison.4,14-16 Schmidt pointed out that research on students’ learning outcomes over time may be influenced by many confounders, such as the use of volunteers as study participants, difficulties in establishing comparable groups, and differential exposure to the instrument that is used to measure curriculum effects.17 These confounders may explain the difficulty of finding differences in students’ knowledge levels when comparing different curricula. The confounders, however, could be understood as aspects of each individual curriculum. The answer may lie in identifying aspects of the curriculum in different curricula that may explain students’ knowledge development. In this thesis, aimed at identifying such aspects of the curriculum, students’ knowledge development was compared between different curricula. Additionally, aspects of the curriculum that may have influenced students’ knowledge development were investigated.

curriculum and technical skills acquisition

Medical curricula include technical and non-technical skills training. Technical skills are often considered motor or psychomotor skills that medical doctors need to perform. For example, suturing is a technical skill that all medical doctors have to learn. Non-technical skills are the cognitive and interpersonal skills that complement knowledge and technical skills.18 The teaching of non-technical skills has been integrated in the medical curricula because of concerns that poor performance of non-technical skills could lead to errors or disciplinary actions.19,20 Nowadays, technical as well as non-technical skills are explicitly taught.

Technical and non-technical skills are part of both undergraduate and postgraduate medical curricula. Although different types of technical skills are relevant for each type of curriculum, non-technical skills training is very similar in both curricula. For example, some technical skills are only taught to residents because of the specificity of the specialty in question, whereas teamwork and communication skills are taught in under- and postgraduate training.

In the past decades, there has been an increase in research regarding non-technical skills in undergraduate medical training. The introduction of competency-based education was one of the reasons for this increase. Competency-based curricula not only focus on medical expertise (knowledge and technical skills), but also on attitude and behavior.


Competency frameworks (e.g. CanMEDS and Tomorrow’s Doctors) have been implemented widely in both undergraduate and postgraduate medical training.21,22

Technical skills have mostly been taught in the clinical setting, under the assumption that students gain competence over time simply by being exposed to patients and clinical experiences.23 McGaghie et al. argued that this kind of training lacks structured learning objectives, skills practice and objective assessment with proper feedback.23 Over the past decades, skills training in medical curricula has been shifting towards simulation training. In a simulated setting, students are offered the opportunity to practice their skills without time restrictions and with structured feedback. Aspects of the curriculum may affect students’ skills acquisition either deliberately or unintentionally.

In addition to the relation between various aspects of the curriculum and students’ knowledge development, this thesis investigates how various aspects of the curriculum affect students’ technical skills acquisition.

curriculum design

Aspects of the curriculum reflect the embodiment of curriculum design, relating to the teaching methods, assessment programme or the rationale for the design. First, a short overview regarding curriculum design is given, after which the specific aspects of the curriculum investigated in this thesis are described.

According to Prideaux, at least four elements should be taken into account when designing a curriculum: content, teaching and learning strategies, assessment processes and evaluation processes.24 In curriculum design, these elements are linked and should not be considered as separate steps. Since this thesis focuses on students’ knowledge development and technical skills acquisition, the description of the four elements is given from the students’ perspective. The content of a curriculum focuses on the acquisition of knowledge, skills and attitudes (or competencies). Teaching and learning strategies refer to the didactics and pedagogy, for example whether the learning activities are student oriented, the teaching methods used and the opportunity to learn in real-life settings.24 Assessment processes focus on evaluating students’ knowledge, skills and attitudes, which requires a clear test blueprint and different stakes. Evaluation processes refer to students’ feedback regarding the curriculum.24 All these elements are chosen in a given context, which will influence the curriculum design (Figure 1). In this thesis, the focus is on aspects of the curriculum that either belong to the teaching and learning strategies or to the assessment processes.


teaching and learning strategies

Knowledge development

Problem-based learning (PBL) has increasingly been used as a teaching and learning strategy in medical education. According to Schmidt, Cohen-Schotanus, and Arends,9 PBL has six defining aspects: (1) problems as the starting point for learning, (2) small group collaboration, (3) flexible guidance by a tutor, (4) fewer lecture hours, because (5) learning should be initiated by the students and (6) students need time for self-study. Although these six aspects are the core of problem-based learning, their intensity may vary between different problem-based curricula. In this thesis, we explored several aspects of the curriculum, such as the degree of problem-based learning, the number of contact hours and whether a discipline is taught in a single, concentrated semester or spaced over a period of two or more semesters.

Only defining a curriculum as problem-based learning may be misleading, since many aspects of problem-based curricula may vary across different curricula. Severiens and Schmidt28 found that students in an active-learning curriculum have higher levels of social and academic integration than students in a mixed curriculum (a combination of lectures and some active-learning methods) and a lecture-based curriculum. Barrows developed a taxonomy of problem-based learning methods based on four important objectives in medical education: (1) structuring knowledge in clinical cases, (2) developing a clinical reasoning process, (3) developing self-directed learning skills and (4) motivation for learning.29 This taxonomy intends to identify the degree to which these objectives are addressed in problem-based learning curricula.

Figure 1. Adapted from Prideaux (2007).25 The figure links learning objectives/goal, content, teaching and learning strategies, and assessment processes (knowledge, skills, attitudes, competency), all set within a given context.

With the implementation of PBL curricula, the number of contact hours was drastically decreased.9 Research demonstrated that end-level students had acquired the same amount of knowledge, regardless of the number of contact hours they had.9,30 Novice students who had more contact hours, however, had acquired more knowledge than students with fewer contact hours.30 Usually, a comparison of contact hours is made after a curriculum change, for example, when a previous cohort is compared with the new cohort. In this thesis, however, we compare the number of contact hours across different medical schools, which allows us to investigate whether contact hours are important in different contexts.

Spacing a discipline over more than one semester is not a core aspect of PBL. However, studies from cognitive psychology have shown that spacing the study material over time, instead of letting students study all material at a single moment (a massed session), improves long-term knowledge retention.26,27 This so-called spacing effect is well established for knowledge, but less so for knowledge development in a naturalistic setting. To unravel knowledge transfer in a naturalistic setting, there is a need for studies exploring the effects of spaced learning sessions on knowledge development.

Skill acquisition

Medical students usually have to repeat a medical skill many times before they are able to perform it proficiently.23 Technical skills training varies across curricula, with large variations in training duration depending on the complexity of the skill and the availability of instructors, such as clinicians or other health care professionals. It is often not possible for curriculum developers to guarantee enough practice time for their trainees due to time and resource constraints. Finding an optimal training schedule, however, may reduce the amount of resources required as well as the number of training sessions needed.

Feedback provides information to the students on how well they perform the actions of a task.31 Furthermore, feedback can draw students’ attention to important aspects of the task.32,33 Interestingly, several studies have shown that it is more effective to focus on external information than on the actual movement.34-36 Although feedback is essential for learning skills, there are still many aspects of feedback that are not clear, especially when the complexity of many medical skills is taken into account.

Traditionally, experts guide medical students’ acquisition of a skill by telling and showing them what they have to do. With the use of a simulator, however, experts may take advantage of other formats of feedback. Many simulators have feedback features, such as help screens or haptic feedback. Combining different sources of feedback may reduce students’ cognitive load during skill acquisition;33,37,38 however, studies on the effects of combining different sources of feedback on medical skills retention are lacking.37


assessment

Assessment is key to ensuring that students possess the necessary knowledge, and it can be designed in a way that influences students’ study strategies.39 One way of steering students’ study behaviour is by varying the type of questions in a test, which will influence what and how students study. Traditionally, types of questions are categorized as multiple-choice questions, open-ended questions, short answer questions, et cetera. Questions can also be categorized on the basis of the levels of Bloom’s taxonomy, which would provide information about students’ cognitive processing.40-42 Another way of steering students’ study behaviour is by letting them decide and show whether they know the answer to a question.43 This allows students to make a judgment about their knowledge and understanding, promoting students’ ability to reflect on their own knowledge.

Bloom’s taxonomy

Students’ cognitive processing is a cumulative hierarchy of lower and higher levels of acquired knowledge. Mastering the lower levels of the hierarchy is required before the higher levels can be achieved.42,44,45 The two lower levels refer to remembering and a minimal understanding of facts and basic concepts.46 The third level refers to applying the knowledge in a new situation. Whereas some researchers consider the third level as a transitional level,46 others consider it as a higher level of cognitive processing.47 The top three levels refer to drawing connections between ideas (analyzing), justifying a decision (evaluating) and creating new knowledge. They are considered higher levels of cognitive processing,48 at which a deeper understanding of the knowledge is required. However, the higher levels of cognitive processing are not necessarily hierarchically structured,46 which implies that students may be able to evaluate knowledge without necessarily having to analyze their own knowledge. Since medical students are expected to use higher levels of cognitive processing when facing a patient, it is important to investigate how students develop their ability to apply knowledge.

Judgement of knowledge

Since medical doctors have to make high-stakes decisions regarding their patients, it is crucial that medical students recognize what they know and what they do not know.43 Assessment may support students to develop their awareness of knowledge43 by asking them to provide a judgment on their knowledge, which is known as metacognitive knowledge.45 In designing a test, one can choose to make students aware of their knowledge gaps by offering an “I don’t know” option to multiple-choice questions. Several studies have investigated students’ judgments of knowledge in regular knowledge tests by adding such an “I don’t know” option.43,49-51 Besides increasing the reliability coefficients of the knowledge tests,43,49 students’ judgment of knowledge has a positive correlation with test scores. This indicates that the higher students score on their judgment of knowledge, the better their performance is.52,53 It is not clear, however, whether the “I don’t know” option stimulates medical students to develop their judgment of knowledge, or what the effect of such aspects of the assessment is on students’ test scores.

research context

In the Netherlands, the medical curricula are designed in accordance with the Bologna principles, consisting of a three-year pre-clinical bachelor and a three-year clinical master. Five out of the six studies in this thesis were performed at the University Medical Center Groningen, the Netherlands.

The medical curriculum of the University of Groningen is a competency-based curriculum, in which basic and clinical sciences are taught in an integrated way using small-group active-learning sessions that start with a problem. During the clinical phase (years 4-6), students complete 15 to 17 clinical rotations in total. In their first master’s year, five-week periods of skills training (technical and non-technical) at the training centre are alternated with clerkship rotations at the hospital. In their second master’s year, students complete clerkship rotations of four weeks each. In their third master’s year, students have to choose a specialty for a clerkship that they will attend for 20 weeks. Students’ knowledge, skills and competencies are assessed throughout the pre-clinical and clinical phase. All students sit the progress tests, but their knowledge is also assessed by block assessments, mainly in the preclinical phase.

Knowledge development is explored in the first four studies of this thesis, using naturalistic data from the Dutch interuniversity progress test. The progress test is based on the Dutch National Blueprint for the Medical curriculum and is administered four times a year. Each progress test consists of 200 multiple-choice questions covering the whole domain of medical knowledge at end level. The questions differ for each test. All questions have an “I don’t know” alternative, hereafter called question mark option, allowing students not to answer a question. To calculate students’ test scores, a formula scoring method is used as follows: a correct answer is awarded with one point; an incorrect answer with a negative mark, which varies according to the number of answering alternatives per question; and choosing the question mark option results in a score of zero points. For more information about the Dutch progress test, see Wrigley et al.,54 Schuwirth & van der Vleuten,55 and Tio et al.56
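To make the scoring rule described above concrete, the sketch below contrasts formula scoring with number-right scoring for a handful of responses. It is a minimal illustration only: the penalty of -1/(k-1) for an incorrect answer to a question with k alternatives is a common formula-scoring choice and is assumed here, since the exact penalty values used in the Dutch progress test are not specified in this chapter.

```python
def formula_score(answers):
    """Score a list of (response, n_alternatives) tuples.

    response: 'correct', 'incorrect', or 'question_mark' (the "I don't know" option).
    Assumes a penalty of -1/(k-1) for an incorrect answer to a k-option question,
    a common formula-scoring convention; the actual progress-test penalty may differ.
    """
    total = 0.0
    for response, k in answers:
        if response == 'correct':
            total += 1.0                 # one point for a correct answer
        elif response == 'incorrect':
            total -= 1.0 / (k - 1)       # negative mark depends on the number of alternatives
        # 'question_mark': zero points, neither gained nor lost
    return total


def number_right_score(answers):
    """Number-right scoring: only correct answers count; no penalty, no bonus."""
    return sum(1.0 for response, _ in answers if response == 'correct')


if __name__ == "__main__":
    # A student answers three 4-option questions: one correct, one wrong, one "I don't know".
    responses = [('correct', 4), ('incorrect', 4), ('question_mark', 4)]
    print(formula_score(responses))       # 1 - 1/3 + 0 = 0.667
    print(number_right_score(responses))  # 1.0
```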

Progress test data from 2007 to 2013 were retrieved from the universities of Leiden, Groningen, Nijmegen, and Maastricht. These data were used to explore the relation between medical students’ growth of oncology knowledge and aspects of the curriculum. To reduce the number of confounding factors, growth of oncology knowledge was further explored in the bachelor phase (years 1-3) of the University of Groningen. At that time, there were two parallel bachelor programmes with the same learning objectives: a Dutch and an international bachelor. The international students took the progress test in English.


In the last two studies of this thesis, the focus shifts from knowledge acquisition to medical skills acquisition of undergraduate students. The last study of this thesis was conducted at the Wenckebach Skills Center of the University Medical Center Groningen.

central research question

Medical curricula aim, among other things, to support students’ learning. Evaluation of the curriculum often relies on students’ feedback regarding their satisfaction with the curriculum or the logistics. Adding students’ performance on knowledge tests to the curriculum evaluation process may give information about the relation between aspects of the curriculum and students’ knowledge. Kirkpatrick developed and revised his framework to evaluate the effectiveness of training.57 This framework consists of four levels. The first level, Reaction, measures how participants react to the training by measuring their engagement and satisfaction with the training. The second level, Learning, assesses whether the participants acquired knowledge, skills, attitudes and competencies. The third level, Behavior, analyzes how trainees apply the information they received during the training and whether the training had an effect on the participants’ work behavior. Finally, the fourth level, Results, assesses the outcomes of the training from an organizational perspective, i.e. whether the training had a positive effect on the organization. Looking at Kirkpatrick’s levels of evaluation, performance on a single knowledge test would allow us to measure the learning outcomes (level 2), and performance on multiple knowledge tests over time would allow us to identify changes in students’ behavior (level 3).57

Since assessment is an aspect of the curriculum which influences students’ study behaviour,61 it is imperative to investigate assessment itself. Progress tests are valid and reliable assessment tools for measuring students’ knowledge development over time,54 while exerting a positive influence on student learning.55 Furthermore, progress tests can be curriculum independent, for example, when the progress test is administered by a consortium.56 Progress tests have also been used as a benchmark between medical schools, which allows the schools to verify that their students attain the same level of knowledge across different curricula.54,58-60 Therefore, using the progress test to assess how aspects of the curriculum affect students’ knowledge development seems logical. Next to knowledge development, it is also important to understand how aspects of the curriculum affect students’ technical skills acquisition. Two specific aspects of the curriculum were investigated with respect to skills training: (1) feedback and (2) the spacing of training sessions.

Overall, this thesis sought to answer the following research question:

How do aspects of the curriculum relate to students’ knowledge development and skill acquisition?


chapter overview

This thesis investigates how aspects of the curriculum affect students’ knowledge and skill acquisition. All studies in this thesis have been published or accepted for publication, except for Chapter 7, which has been submitted. Since the chapters of this thesis are based on articles written to be read on their own, repetition and overlap across the thesis are inevitable.

In Chapter 2, the relation was investigated between students’ knowledge development of oncology and various aspects of the curriculum, such as the degree of problem-based learning within a curriculum, the number of contact hours, the addition of a pre-internship training programme (a refresher course before entering the clinical rotations) and the spacing of a discipline. This explorative study, using naturalistic data of four different curricula, could of course not control for many confounding factors. For example, the number of patient encounters for each student during the clinical phase, learning objectives, context and teachers could not be controlled. Therefore, in the next study, we analyzed two parallel cohorts within the same curriculum. In contrast to Chapter 2, in Chapter 3 the context, the teachers, the teaching methods and the assignments were similar in both cohorts. In Chapter 3, a comparison in curriculum design was made between teaching a discipline in one semester and spacing it over the entire preclinical training phase within the Groningen curriculum. The main difference between these two tracks was the timing of the exposure to oncology: one track had a block semester and the other track had discipline-specific training spaced over the preclinical phase. The aim of this study was to compare knowledge growth of students taught in a concentrated semester with that of students taught in a spaced format.

Whereas in the first two chapters the focus was on knowledge acquisition, the next step was to explore knowledge application. At the end of their undergraduate training, medical students are not only expected to remember basic factual knowledge (lower order), but also to apply it (higher order). They also should be aware of their current knowledge as well as their lack of knowledge. The question mark option is an important aspect of the assessment that curriculum designers can use to support students to develop their judgment of knowledge. In Chapter 4, students’ scores on lower and higher order progress test questions and their judgment of knowledge were compared. The question mark option in successive progress tests was used to measure students’ judgment of knowledge. Besides the educational aspect of judgment of knowledge investigated in Chapter 4, it is important to verify the effect of the question mark option on students’ scores. In Chapter 5, we investigated the effect of formula scoring by comparing students’ scores on questions with the question mark option (formula scoring) to their scores on questions without it (number-right scoring). Different from previous studies, we used Rasch analysis to compare formula scoring with number-right scoring. In contrast to the classical test theory, Rasch analysis is more sensitive to construct-irrelevant sources of variance.
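For readers unfamiliar with Rasch analysis, the following minimal sketch shows the dichotomous Rasch model's item response function, in which the probability of answering an item correctly depends only on the difference between a student's ability and the item's difficulty. The numeric values are illustrative assumptions, not data from this thesis.

```python
# Illustrative sketch of the dichotomous Rasch model's item response function:
# P(correct) depends only on the difference between ability (theta) and item
# difficulty (b). All values below are made up for illustration.
import math

def rasch_probability(theta: float, b: float) -> float:
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

if __name__ == "__main__":
    # A student of average ability (theta = 0) facing items of varying difficulty.
    for difficulty in (-1.0, 0.0, 1.0):
        print(difficulty, round(rasch_probability(0.0, difficulty), 3))
```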


In the first four chapters, the focus is on knowledge acquisition and application, but in Chapters 6 and 7 the focus shifts towards skill acquisition and retention. Whereas in Chapters 2 and 3 the effect of spacing a discipline was investigated with respect to knowledge development, in Chapter 6 a systematic review was conducted to explore what is known about the spacing effect on medical skill retention. Finally, Chapter 7 describes a randomized experiment to investigate the effects of different feedback formats on skills acquisition and retention. More specifically, we investigated whether students would acquire and retain their skills better after expert feedback, simulator feedback or a combination of both types of feedback.


references

1. Capon N, Kuhn D. What’s so good about problem-based learning? Cogn Instr. 2004;22(1):61-79.
2. De Grave WS, Schmidt HG, Boshuizen HP. Effects of problem-based discussion on studying a subsequent text: A randomized trial among first year medical students. Instr Sci. 2001;29(1):33-44.
3. Schmidt HG, De Volder ML, De Grave WS, Moust JH, Patel VL. Explanatory models in the processing of science text: The role of prior knowledge activation through small-group discussion. J Educ Psychol. 1989;81(4):610.
4. Albanese MA, Mitchell S. Problem-based learning: a review of literature on its outcomes and implementation issues. Acad Med. 1993;68(1):52-81.
5. Vernon DT, Blake RL. Does problem-based learning work? A meta-analysis of evaluative research. Acad Med. 1993;68(7):550-563.
6. Dochy F, Segers M, Van den Bossche P, Gijbels D. Effects of problem-based learning: A meta-analysis. Learn Instr. 2003;13(5):533-568.
7. Schmidt H, Dauphinee WD, Patel VL. Comparing the effects of problem-based and conventional curricula in an international sample. Acad Med. 1987;62:305-315.
8. Van den Berg M, Hofman W. Student success in university education: A multi-measurement study of the impact of student and faculty factors on study progress. High Educ. 2005;50(3):413-446.
9. Schmidt HG, Cohen-Schotanus J, Arends LR. Impact of problem-based, active learning on graduation rates for 10 generations of Dutch medical students. Med Educ. 2009;43(3):211-218.
10. Schmidt HG, Cohen-Schotanus J, Van der Molen, et al. Learning more by being taught less: a “time-for-self-study” theory explaining curricular effects on graduation rate and study duration. High Educ. 2010;60(3):287-300.
11. Norman G. The birth and death of curricula. Adv Health Sci Educ. 2017;22:797.
12. Prideaux D. Researching the outcomes of educational interventions: a matter of design. RCTs have important limitations in evaluating educational interventions. BMJ. 2002;324(7330):126-127.
13. Norman GR, Schmidt HG. Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Med Educ. 2000;34(9):721-728.
14. Kaufman DM, Mann KV. Achievement of students in a conventional and problem-based learning (PBL) curriculum. Adv Health Sci Educ. 1999;4(3):245-260.
15. Mennin SP, Friedman M, Skipper B, Kalishman S, Snyder J. Performances on the NBME I, II, and III by medical students in the problem-based learning and conventional tracks at the University of New Mexico. Acad Med. 1993;68(8):616-624.
16. Kaufman DM, Mann KV. Comparing students’ attitudes in problem-based and conventional curricula. Acad Med. 1996;71(10):1096-1099.
17. Schmidt H. Innovative and conventional curricula compared: What can be said about their effects? In: Nooman ZH, Schmidt HG, Ezzat ES (Eds.). Innovation in medical education: An evaluation of its present status. New York: Springer; 1990. p. 1-7.
18. Gordon M, Darbyshire D, Baker P. Non-technical skills training to enhance patient safety: a systematic review. Med Educ.
19. Odell M. Human factors and patient safety: changing roles in critical care. Aust Crit Care. 2011;24(4):215-217.
20. Papadakis MA, Teherani A, Banach MA, Knettler TR, Rattner SL, Stern DT, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353(25):2673-2682.
21. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29(7):642-647.
22. Ellaway R, Evans P, McKillop J, Cameron H, Morrison J, McKenzie H, et al. Cross-referencing the Scottish Doctor and Tomorrow’s Doctors learning outcome frameworks. Med Teach. 2007;29(7):630-635.
23. McGaghie WC. Mastery learning: It is time for medical education to join the 21st century. Acad Med. 2015;90(11):1438-1441.
24. Prideaux D. ABC of learning and teaching in medicine. Curriculum design. BMJ. 2003;326(7383):268-270.
25. Prideaux D. Curriculum development in medical education: from acronyms to dynamism. Teach Teach Educ. 2007;23(3):294-302.
26. Carpenter SK, Cepeda NJ, Rohrer D, Kang SHK, Pashler H. Using spacing to enhance diverse forms of learning: Review of recent research and implications for instruction. Educ Psychol Rev. 2012;24(3):369-378.
27. Carpenter SK. Spacing and interleaving of study and practice. In: Benassi VA, Overson CE, Hakala CM (Eds.). Washington, DC, US: Society for the Teaching of Psychology; 2014. p. 131-141.
28. Severiens SE, Schmidt HG. Academic and social integration and study progress in problem based learning. High Educ. 2009;58(1):59.
29. Barrows HS. A taxonomy of problem-based learning methods. Med Educ. 1986;20(6):481-486.
30. Kerdijk W, Snoek JW, van Hell EA, Cohen-Schotanus J. The effect of implementing undergraduate competency-based medical education on students’ knowledge acquisition, clinical performance and perceived preparedness for practice: a comparative study. BMC Med Educ. 2013;13(1):76.
31. Salmoni AW, Schmidt RA, Walter CB. Knowledge of results and motor learning: a review and critical reappraisal. Psychol Bull. 1984;95(3):355.
32. Wulf G, Shea C, Lewthwaite R. Motor skill learning and performance: a review of influential factors. Med Educ. 2010;44(1):75-84.
33. Wulf G. Attentional focus and motor learning: A review of 10 years of research. E-journal Bewegung und Training. 2007;1(2-3):1-11.
34. Wulf G, Su J. An external focus of attention enhances golf shot accuracy in beginners and experts. Res Q Exerc Sport. 2007;78(4):384-389.
35. Wulf G, McConnel N, Gärtner M, Schwarz A. Enhancing the learning of sport skills through external-focus feedback. J Mot Behav. 2002;34(2):171-182.
36. Shea CH, Wulf G. Enhancing motor learning through external-focus instructions and feedback. Hum Mov Sci. 1999;18(4):553-571.
37. Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: a meta-analysis and critical narrative synthesis. Adv Health Sci Educ. 2014;19(2):251-272.
38. Wulf G, Shea CH. Principles derived from the study of simple skills do not generalize to complex skill learning. Psychon Bull Rev. 2002;9(2):185-211.
39. Wood T. Assessment not only drives learning, it may also help learning. Med Educ. 2009;43(1):5-6.
40. Redfield DL, Rousseau EW. A meta-analysis of experimental research on teacher questioning behavior. Rev Educ Res. 1981;51(2):237-245.
41. Jensen JL, McDaniel MA, Woodard SM, Kummer TA. Teaching to the test… or testing to teach: exams requiring higher order thinking skills encourage greater conceptual understanding. Educ Psychol Rev. 2014;26(2):307-329.
42. Anderson LW, Krathwohl DR, Bloom BS. A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman; 2001.
43. Muijtjens AMM, Van Mameren H, Hoogenboom RJI, Evers JLH, Van der Vleuten CPM. The effect of a “don’t know” option on test scores: number-right and formula scoring compared. Med Educ. 1999;33:267-275.
44. Bloom BS. Taxonomy of Educational Objectives: The Classification of Educational Goals. Cognitive Domain. Handbook 1. New York: Longman; 1956.
45. Krathwohl DR. A revision of Bloom’s taxonomy: An overview. Theor Prac. 2002;41(4):212-218.
46. Crowe A, Dirks C, Wenderoth MP. Biology in bloom: implementing Bloom’s Taxonomy to enhance student learning in biology. CBE Life Sci Educ. 2008;7(4):368-381.
47. Bissell AN, Lemons PP. A new method for assessing critical thinking in the classroom. BioScience. 2006;56(1):66-72.
48. Zoller U. Are lecture and learning compatible? Maybe for LOCS: Unlikely for HOCS. J Chem Educ. 1993;70(3):195.
49. Keislar ER. Test instructions and scoring method in true-false tests. J Exp Educ. 1953;21(3):243-249.
50. Traub RE, Hambleton RK, Singh B. Effects of promised reward and threatened penalty on performance of a multiple-choice vocabulary test. Educ Psychol Meas. 1969;29(4):847-861.
51. Bree K, Roman B. Teaching students to say “I don’t know”: Potential methods and implications. Acad Psychiatry. 2017;41:561.
52. Koriat A, Sheffer L, Ma’ayan H. Comparing objective and subjective learning curves: judgments of learning exhibit increased underconfidence with practice. J Exp Psychol. 2002;131(2):147.
53. Schleifer LL, Dull RB. Metacognition and performance in the accounting classroom. Issues in Account Educ. 2009;24(3):339-367.
54. Wrigley W, Van der Vleuten CPM, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE guide no. 71. Med Teach. 2012;31:683-697.
55. Schuwirth LWT, Van der Vleuten CPM. The use of progress testing. Perspect Med Educ. 2012;1(1):24-30.
56. Tio RA, Schutte B, Meiboom AA, Greidanus J, Dubois EA, Bremers AJ. The progress test of medicine: the Dutch experience. Perspect Med Educ. 2016;5(1):51-55.
57. Kirkpatrick D. Great ideas revisited: revisiting Kirkpatrick’s four-level model. Training and Development. 1996;50(1):54-59.
58. Muijtjens AMM, Schuwirth LW, Cohen-Schotanus J, Thoben AJ, Van der Vleuten CPM. Benchmarking by cross-institutional comparison of student achievement in a progress test. Med Educ. 2008;42(1):82-88.
59. Muijtjens AM, Schuwirth LW, Cohen-Schotanus J, Van der Vleuten CPM. Differences in knowledge development exposed by multi-curricular progress test data. Adv Health Sci Educ. 2008;13(5):593-605.
60. Brunk I, Schauber S, Georg W. Do they know too little? An inter-institutional study on the anatomical knowledge of upper-year medical students based on multiple choice questions of a progress test. Ann Anat. 2017;209:93-100.
61. Norman G, Neville A, Blake JM, Mueller B. Assessment steers learning down the right road: impact of progress testing on licensing examination performance.


The impact of curriculum design in the acquisition of knowledge of oncology: comparison among four medical schools

Dario Cecilio-Fernandes*, Wytze S. Aalders*, André J.A. Bremers, René A. Tio, Jakob de Vries

*both authors contributed equally


abstract

Introduction

Over the past five years, cancer has replaced coronary heart disease as the leading cause of death in the Netherlands. It is thus paramount that medical doctors acquire a knowledge of cancer, since most of them will face many patients with cancer. Studies, however, have indicated that there is a deficit in knowledge of oncology among medical students, which may be due not only to the content but also to the structure of the curriculum. In this study, we compared students’ knowledge acquisition in four different undergraduate medical programs. Further, we investigated possible factors that might influence students’ knowledge growth as related to oncology.

Methods

The participants comprised 1440 medical students distributed over four universities in the Netherlands. To measure students’ knowledge of oncology, we used their progress test results from 2007 to 2013. The progress test consists of 200 multiple-choice questions; this test is taken simultaneously four times a year by all students. All questions regarding oncology were selected. We first compared the growth of knowledge of oncology using mixed models. Then, we interviewed the oncology coordinator of each university to arrive at a better insight of each curriculum.

Results

Two schools showed similar patterns of knowledge growth, with a slight decrease in the growth rate for one of them in year 6. The third school had a faster initial growth with a faster decrease over time compared to other medical schools. The fourth school showed a steep decrease in knowledge growth during years 5 and 6. The interviews showed that the two higher-scoring schools had a more focused semester on oncology, whereas in the others oncology was scattered throughout the curriculum. Furthermore, the absence of a pre-internship training program seemed to hinder knowledge growth in one school.

Discussion

Our findings suggest that curricula have an influence on students’ knowledge acquisition. A focused semester on oncology and a pre-internship preparatory training program are likely to have a positive impact on students’ progress in terms of knowledge of oncology.


introduction

Cancer has been the leading cause of death in the Netherlands since 2009. In 2014, 105,000 Dutch people were diagnosed, and approximately 44,808 died because of cancer.1 The prevalence of cancer in the Netherlands shows an annual increase of 1-3%.2 Most physicians will encounter many oncology cases throughout their careers. Thus, physicians should be well prepared for the complex diagnosis, long-term treatment, and consequences of cancer.

Nowadays, all medical schools in the Netherlands have a partially or fully problem-based teaching curriculum. In a problem-based curriculum, students gain experience in approaching (clinical) problems in an integrated way, taking into account prevention, diagnostics, and the competency to use scientific literature in their treatment strategy. This problem-oriented style of teaching could work very well for oncology, due to its often very complex diagnoses and rapidly changing treatment options.3 However, several authors found a knowledge deficiency in oncology among first- and second-year students at problem-based teaching universities in the USA and Australia.4-8

A deficit in knowledge could eventually cause harm to a patient. This knowledge deficit may have various causes, one of which is the way oncology is taught and how it is distributed throughout the curriculum. To better understand the influence of curriculum design, we investigated its effects on the growth of students’ knowledge of cancer in different curricula. Additionally, we compared the growth of students’ knowledge of oncology to that of other medical subjects. Subsequently, we investigated factors that might influence students’ knowledge growth by consulting the curriculum coordinators.

methods

To answer our research question, we analyzed progress test data from four different schools using a mixed-methods approach, more specifically an explanatory sequential design. We used the progress test results of all the eligible students from the universities of Leiden, Groningen, Nijmegen, and Maastricht who started medical school in September 2007. Subsequently, we interviewed the oncology course directors of each school and examined the universities’ syllabi for each curriculum in order to better understand the differences among the curricula and to explain our quantitative findings.

The progress test

The progress test (PT) consists of four quarterly tests of 200 multiple-choice questions that measure students’ knowledge at graduate level. The questions are based on the Dutch National Blueprint for Medical Education. Each consecutive test includes 200 new items. All schools adhering to the Dutch National Blueprint for Medical Education contribute to the provision of test items in all categories, which are then reviewed by a committee. Thus, none of the students benefit disproportionately, because the writers of the items are equally distributed over the individual schools. The progress test is taken by all medical students of the four Dutch medical schools at the same time.9

Course information

All four medical schools were approached to provide us with their syllabi for curricula implemented in 2007-2013. We examined the syllabi of all programs for the presence and degree of problem-based teaching using the following characteristics described by Hmelo-Silver (2004):10 (a) the use of problems as the starting point for learning, (b) small-group collaboration with flexible guidance of a tutor, and (c) student-initiated learning. Furthermore, we interviewed the course directors in order to estimate the number of contact hours for pathology and oncology, and how they were positioned in the curriculum.

Data Analysis

As a specific oncology category was not defined within the progress test, all of the 4800 questions used (over 24 test iterations) were reviewed and classified by WA, based upon the presence or absence of oncology content. A random sample of 50 questions, containing items both with and without oncology content, was then independently reviewed by RT and JdV. Since there was agreement on all questions about the presence or absence of oncology content, it was not deemed necessary to further investigate the inter-rater reliability.

Next, we collected all individual test scores and used these to calculate the average number of correct answers on oncology items for each of the 24 test moments over time. Because such single-point values were shown to be unreliable indicators of knowledge level in previous research,11 we analyzed the data using mixed-effect models. This method calculates the mathematical function that best explains the data.12 Additionally, growth curves for oncology items were compared to the curve of the remaining items to assess whether knowledge of oncology exceeds overall knowledge or not.

results

Data from all 1440 medical students were retrieved. Of those, 321 were from Leiden, 313 from Maastricht, 485 from Groningen, and 321 from Nijmegen, all of whom had been admitted through the same, centralized admission procedure. On average, 7.2% of the progress test questions concerned oncology. At the start of their studies, students, on average, answered 8% of the oncology questions correctly, whereas at the end of the program they correctly answered 54% of the oncology questions, on average. At the end point, the percentage of correctly answered questions ranged from 51% to 59%.

When comparing the growth curves of knowledge of oncology among universities (Figure 1), we found similarities in both linear and quadratic growth models in schools A, B, and D. At the beginning, students had the same knowledge level, and they acquired knowledge at the same pace. In the last year, schools A and B showed a faster decrease in the growth rate of knowledge of oncology compared to schools C and D. Although school B had the fastest initial growth, this rate of growth decreased in the last practical phase of the program, resulting in the lowest end point. School C had a faster initial growth, with a slower decrease in years 3 and 4, and faster growth in years 5 and 6, when compared to the other schools, thus resulting in the highest final level of knowledge of all schools.

Differences found in the four curricula studied, based on their syllabi, are displayed in Table 1. All schools used small tutor groups to support problem-based teaching. Whereas schools B and C showed a teacher-initiated approach, schools A and D showed a problem-based, student-initiated approach. In school C, most lectures did not have a clinical problem as a starting point, but this school had the highest number of contact hours in oncology and pathology. Pathology was handled in years 1 and 2 in all cases, and was followed by oncology in years 2 and 3. School B had training in oncology solely in year 2. Two schools (C and D) concentrated oncology in one semester, whereas schools A and B combined oncology with other medical subjects in one semester. In three schools, oncology was included in the subjects taught during the fourth year, a pre-internship training program. In school B, we found an absence of pre-internship training-program weeks, meaning that students commenced their internships directly in the fourth year. Schools A and B had the fewest contact hours, followed by school D. School C had more contact hours than the other three medical schools.


Figure 2 depicts the growth curves of knowledge of oncology compared to overall knowledge, calculated within medical schools. In all groups, there was a significant difference in linear, quadratic, and cubic growth between knowledge of oncology and overall knowledge. At the end points, schools C and D scored higher on oncology, whereas schools A and B scored lower on oncology compared to overall knowledge. However, all schools scored lower on oncology throughout most of the curricula.

Table 1. Characteristics of four Dutch medical curricula.

          Degree of        Oncology          Pre-internship   Estimated contact   Estimated contact
          problem-based    concentrated in   training in      hours for           hours for
          teaching         one semester      oncology         pathology           oncology
School A  ++               -                 +                24                  48
School B  +                -                 -                32                  40
School C  +/-              +                 +                48                  56
School D  ++               +                 +                32                  48


discussion

In this study, we investigated the influence of four different curricula on students’ acquisition of knowledge of oncology. As expected, the curriculum has an impact on how students acquire their knowledge of oncology. It seems that concentrating the teaching of oncology in one semester is a more important factor in terms of knowledge acquisition than the type of teaching method. Similar results were found in a previous study, in which students who had concentrated oncology block-training performed better than students who had oncology training throughout the curriculum.13

Two major curriculum design differences might have influenced students’ knowledge growth. First, the type of teaching style: schools A and D used a problem-based learning approach, whereas schools B and C used a mix of traditional and problem-based learning approaches. One would imagine that being required to solve cases would facilitate students’ acquisition of knowledge of oncology. It seems, however, that the type of teaching style plays no significant role, since our results demonstrate no clear difference in terms of teaching styles. Second, schools C and D had more contact hours in pathology and oncology than schools A and B did. A closer look, however, shows that the number of contact hours for teaching oncology in schools A and D was the same, while school A had more contact hours for pathology than school D. This would suggest that the number of contact hours involving basic knowledge (pathology) might lead to a better acquisition of more complex knowledge (oncology). Based on our findings, the number of contact hours may have contributed to students’ growth in knowledge about oncology.

Schools A and B scored lower on oncology questions at the end point compared to schools C and D, and also when compared to overall knowledge. In addition to the absence of concentrated teaching of oncology, schools A and B also taught oncology integrated with other medical subjects. Furthermore, school A had a pre-internship training program, whereas school B did not. This may explain why school B showed a faster decrease in knowledge of oncology. A pre-internship training program would allow students to refresh their knowledge, which would lead to better retention. Alternatively, an explanation could be that some students have less contact with oncology cases during internships than others.14

Schools C and D scored higher on oncology questions at end point compared to the other two schools and also compared to overall knowledge. Although their teaching styles differed considerably, schools C and D both taught oncology in a concentrated block of four weeks, instead of dispersing it among other medical subjects. Whereas students who have oncology spread out may not yet have acquired the knowledge necessary to fully understand more complex cases, students who have a block of oncology teaching may have already acquired the necessary knowledge. This is in line with previous research showing that medical students need to acquire basic factual knowledge before they can apply it.15 For instance, a concentrated oncology block could focus more on disease


to other medical subjects, students must learn the importance of the staging system and its implications for the prognosis. Focusing on an “oncological way of thinking,” involving basic knowledge of oncology and its application in a concentrated oncology block, might be more effective in teaching oncology.

Our study has a few limitations. The retrospective character of this study does not allow us to control for variables that might have influenced our findings. However, it offers a unique opportunity to look closely at the real-life situation, which is often different from laboratory studies. The use of the Dutch progress test results may be a limitation since it measures students’ knowledge at end level. However, the use of the Dutch progress test, being a summative test, eliminates the bias of students’ willingness to participate, and it is possible to investigate students’ knowledge growth throughout their undergraduate training with a minimum of confounding through variation between tests.

conclusions

Based on our findings, we conclude that more contact hours, a focused semester on oncology, and a pre-internship preparatory training program are likely to have a positive impact on students’ progression in knowledge of oncology.


references

1. Centraal Bureau voor de Statistiek. Deceased; cause of death, age, sex. Available at: http://statline.cbs.nl/statweb/publication/?dm=slnl&pa=7233&d1=173&d2=0&d3=0&d4=16-18&hdr=g2,g1,g3&stb=t&vw=t.
2. The Netherlands Cancer Registry. Available at: www.cijfersoverkanker.nl.
3. Abacioglu U, Sarikaya O, Iskit S, Sengoz M. Integration of a problem-based multidisciplinary clinical cancer management course into undergraduate education. J Cancer Educ. 2004;19(3):144-148.
4. Boehler M, Advani V, Schwind CJ, et al. Knowledge and attitudes regarding colorectal cancer screening among medical students: A tale of two schools. J Cancer Educ. 2011;26(1):147-152.
5. Mohyuddin N, Langerman A, LeHew C, Kaste L, Pytynia K. Knowledge of head and neck cancer among medical students at 2 Chicago universities. Arch Otolaryngol Head Neck Surg. 2008;134(12):1294-1298.
6. Villarreal-Garza C, García-Aceituno L, Villa AR, Perfecto-Arroyo M, Rojas-Flores M, León-Rodríguez E. Knowledge about cancer screening among medical students and internal medicine residents in Mexico City. J Cancer Educ. 2010;25(4):624-631.
7. Geller AC, Prout M, Sun T, Lew RA, Culbert AL, Koh HK. Medical students’ knowledge, attitudes, skills, and practices of cancer prevention and detection. J Cancer Educ. 1999;14(2):72-77.
8. McRae RJ. Oncology education in medical schools: Towards an approach that reflects Australia’s health care needs. J Cancer Educ. 2016;31(4):621-625.
9. Tio RA, Schutte B, Meiboom AA, Greidanus J, Dubois EA, Bremers AJ. The progress test of medicine: The Dutch experience. Perspect Med Educ. 2016;5(1):51-55.
10. Hmelo-Silver CE. Problem-based learning: What and how do students learn? Educ Psychol Rev. 2004;16(3):235-266.
11. Verhoeven BH, Verwijnen GM, Scherpbier AJJA, Van der Vleuten CPM. Growth of medical knowledge. Med Educ. 2002;36(8):711-717.
12. Shek DT, Ma C. Longitudinal data analyses using linear mixed models in SPSS: Concepts, procedures and illustrations. ScientificWorldJournal. 2011;11:42-76.
13. Cecilio-Fernandes D, Aalders W, de Vries J, Tio RA. The impact of massed and spaced-out curriculum in oncology knowledge acquisition. J Cancer Educ. 2017:1-4.
14. Raghoebar-Krieger HM, Sleijfer DT, Hofstee WK, Kreeftenberg HG, Bender W. The availability of diseases for medical students in a university hospital. Med Teach. 2001;23(3):258-262.
15. Cecilio-Fernandes D, Kerdijk W, Jaarsma D, Tio RA. Development of cognitive processing and judgments of knowledge in medical students: Analysis of progress test results.


The Impact of Massed and Spaced-Out Curriculum in Oncology Knowledge Acquisition

Dario Cecilio-Fernandes, Wytze S. Aalders, René A. Tio, Jakob de Vries


abstract

Introduction

Since 2009, cancer has been the leading cause of death in the Netherlands. Oncology is therefore an important part of the undergraduate medical curriculum. It is crucial that medical students know about cancer, since doctors will encounter many oncological cases in practice. We compared the influence of teaching oncology spread over a three-year curriculum versus concentrated in one semester.

Methods

The participants comprised 525 medical students from one medical school with comprehensive integrated curricula. Of those, 436 followed the massed curriculum, with oncology concentrated in one semester. The remaining 89 students followed a spaced-out curriculum, in which oncology was spread out over three years. To measure students’ knowledge, we used their progress test results from 2009 to 2012. All questions about oncology were categorized and selected. Because of our unbalanced sample and missing data, and to reduce the chance of a Type II error, we compared growth in scores on the oncology questions using mixed-effects models.
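The analyses described here rely on linear mixed-effects (growth) models. Purely as a non-authoritative sketch of the general approach, the example below fits a polynomial growth model to longitudinal progress-test scores in Python with statsmodels; the file name, column names, and the simple random-intercept-and-slope structure are assumptions made for this illustration and do not reproduce the exact model or covariance structure used in this chapter.

# Illustrative sketch only: fitting a polynomial growth curve to longitudinal
# progress-test scores with a linear mixed-effects model (Python, statsmodels).
# The file name, column names, and random-effects structure are assumptions for
# this example and do not reproduce the exact model reported in this chapter.
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per student per test moment.
df = pd.read_csv("progress_test_oncology.csv")  # hypothetical file
# Assumed columns: student_id, curriculum ("massed" or "spaced"),
# test_moment (0, 1, 2, ...), oncology_score (percentage correct).

model = smf.mixedlm(
    # Fixed effects: curriculum, linear/quadratic/cubic time, and their interactions.
    "oncology_score ~ curriculum * (test_moment + I(test_moment**2) + I(test_moment**3))",
    data=df,
    groups=df["student_id"],    # observations are nested within students
    re_formula="~test_moment",  # random intercept and random linear slope per student
)
result = model.fit(reml=True)
print(result.summary())

Competing growth shapes (linear, quadratic, cubic) and covariance structures can then be compared with likelihood-ratio tests or information criteria to select the best-fitting model.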

Results

A cubic growth model with an unstructured covariance matrix fitted our data best. At the start, students in the spaced-out curriculum scored higher on oncology questions. The initial growth was faster for the spaced-out curriculum students, whereas the acceleration over time was slower compared to the massed curriculum students. At the end of the growth curve, the knowledge of the massed curriculum students increased faster. In the last test, the massed curriculum students outperformed those in the spaced-out curriculum.
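For readers less familiar with growth modelling, a cubic growth model of this general form can be written as below; the notation is illustrative only and omits the specific unstructured covariance matrix estimated here.

\[
\text{score}_{ij} = \beta_0 + \beta_1 t_{ij} + \beta_2 t_{ij}^{2} + \beta_3 t_{ij}^{3} + u_{0i} + u_{1i} t_{ij} + \varepsilon_{ij}
\]

where \(t_{ij}\) denotes test moment \(j\) for student \(i\), the \(\beta\) coefficients describe the average growth curve, the \(u\) terms are student-specific random effects, and \(\varepsilon_{ij}\) is the residual error. The signs and relative sizes of \(\beta_1\), \(\beta_2\), and \(\beta_3\) determine how fast knowledge initially grows and how that growth accelerates or decelerates over time, which is how the curve differences between the two curricula are read.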

Discussion

The way students acquired and applied their knowledge was similar in both curricula. It seems, however, that students benefitted more from massed than spaced-out education, which may be due to the comprehensive integrated teaching involved.


introduction

Ensuring that medical students acquire and retain knowledge of oncology during their medical education is essential for excellent oncological care later on. Most doctors will face many oncological patients during their practice, regardless of their specialization. Potosky and colleagues (2014) conducted a survey of US physicians showing that many physicians, without a specialization in oncology, lack critical knowledge and education on this topic.1 Moreover, these physicians lacked confidence in their knowledge of cancer. Similar results were found in undergraduate medical students.2 To address this problem, several curricula for residency education were developed to provide structure, content, and guidance in teaching oncology.3 Furthermore, integrated holistic approaches are increasingly being implemented in undergraduate oncology education.4 Despite the number of curricula that have emerged, the efficacy of oncology education remains unclear.3 In the Netherlands, cancer has been the leading cause of death since 2009. Since then, oncology has played an important role in the medical curriculum in undergraduate education. It is crucial that medical students acquire and retain knowledge of oncology during their preclinical phase, so they can apply that knowledge when face to face with a patient.

One way of improving students’ knowledge retention is by spacing and repeating the learning material throughout medical school. Spacing the study sessions improves long-term retention as compared to massed practice.5-6 In this study, we compared the influence of teaching oncology spread over a three-year Bachelor’s phase with what is known as massed presentation (with most of the study material concentrated in one semester). Both curricula covered the same content, and both were used in the same medical school. We hypothesized that the spaced-out curriculum would prove more beneficial for students’ learning and retention, since students would re-study the learned topic throughout the Bachelor’s phase.

methods

Setting

Since 2009, the University Medical Center Groningen (UMCG) has offered an international Bachelor’s degree program in medicine in parallel with a national Bachelor’s program. Both programs have the same learning goals, content, material, and teaching methods (PBL), but the order of the disciplines differs, as does the language of instruction: in the national track the program is taught in Dutch, whereas in the international track it is taught in English. At this stage, that is, the Bachelor’s phase, students do not have individual contact with real patients. During the patient lectures, patients are carefully selected. Most of these patients speak basic English, although in some cases the lecturer translates the answers given by the patient.


Both groups have the same admission requirements. All international students take an English proficiency test (IELTS) to ensure that they are proficient in English, unless they are native English speakers. Moreover, students need to show proof of their level of science education; if they fail to do so, they are allowed to attend one year of pre-university education. Students in both tracks are comparable in terms of their knowledge, and both tracks are regulated by the same rules and cut-off scores.

Oncology Education

Both tracks follow a comprehensive integrated curriculum, in which the integration is structured and part of the curriculum design. The content, learning objectives, material, number of contact hours, and teachers are the same for the national and the international track. Apart from the language, the only difference is that in the national track oncology is condensed into one semester at the beginning of the third year, whereas in the international track it is spread out over the three Bachelor’s years.

Throughout the Bachelor’s phase, medical students in both curricula, in addition to being exposed to patient cases during patient lectures, encounter patient problems in the form of assignments on paper. These patient cases might contain topics related to oncology, and students from both curricula encounter similar cases. Since students in the spaced-out curriculum have the various oncological topics spread out over the entire Bachelor’s phase, they may have acquired more knowledge about oncology than students in the massed curriculum.

Progress Test

To measure students’ knowledge of oncology, we used their progress test results from 2009 to 2012. The Dutch progress test is based on the Dutch National Blueprint for the Medical Curriculum and aims to assess knowledge at the level expected at the end of the curriculum. The test is administered four times a year, and each test contains 200 multiple-choice questions on all subjects, including oncology. The questions can be divided into vignette questions, in which a patient case is presented, and knowledge questions without a patient case. The questions differ for each test; the level of difficulty, however, is the same for all progress tests.

The progress test has both a formative and a summative function. Students cannot fail because of a single test or a single subject. After completing the progress test, students receive feedback per discipline, which compares their mean score with the overall mean of their reference group and thus gives an indication of their performance. In addition, they receive a fail, pass, or good grade. Over a one-year period, students are required to pass three of the four progress tests; otherwise they have to repeat the progress test as a whole, and not just the block that they failed (for more information about the Dutch progress test, see Tio et al.7).
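As a purely hypothetical illustration of the yearly requirement described above, and of feedback that compares a student’s discipline mean with that of the reference group, the following sketch uses invented function names and numbers; it is not the official progress-test scoring algorithm.

# Toy illustration of the rules described above; function names and numbers are
# invented for this example and are not the official progress-test algorithm.

def meets_yearly_requirement(grades):
    """grades: the four results in one academic year, each 'fail', 'pass', or 'good'.
    Students must obtain at least three passing results ('pass' or 'good')."""
    passed = sum(1 for g in grades if g in ("pass", "good"))
    return passed >= 3

def discipline_feedback(student_means, reference_means):
    """Per-discipline difference between a student's mean score and the mean of
    the reference group (positive values indicate above-average performance)."""
    return {d: student_means[d] - reference_means[d] for d in student_means}

# Example usage with invented numbers:
print(meets_yearly_requirement(["pass", "fail", "good", "pass"]))    # True
print(discipline_feedback({"oncology": 62.0}, {"oncology": 58.5}))   # {'oncology': 3.5}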

The national track takes the test in Dutch, whereas the international track takes the test in English. The English test is translated by an official certified translator (native
