

Title:
Paired basic science and clinical problem-based learning faculty teaching side by side: do students evaluate them differently?
Source:
Medical education (Oxford. Print). 39(2):194-201
Publisher Information:
Oxford: Blackwell, 2005.
Publication Year:
2005
Physical Description:
print, 15 ref
Original Material:
INIST-CNRS
Document Type:
Journal Article
File Description:
text
Language:
English
Author Affiliations:
Department of Medicine, University of California, Davis, California, United States
Department of Neurology, University of California, Davis, California, United States
Department of Pathology, University of California, Davis, California, United States
Department of Cell Biology and Human Anatomy, University of California, Davis, California, United States
ISSN:
0308-0110
Rights:
Copyright 2005 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Sciences of education

FRANCIS
Accession Number:
edscal.16633092
Database:
PASCAL Archive


Paired basic science and clinical problem-based learning faculty teaching side by side: do students evaluate them differently? 

Introduction: Many studies have evaluated the desirability of expert versus non-expert facilitators in problem-based learning (PBL), but performance differences between basic science and clinical facilitators have been less studied. In a PBL course at our university, pairs of faculty facilitators (1 clinician, 1 basic scientist) were assigned to student groups to maximise integration of basic science with clinical science.

Aims: This study set out to establish whether students evaluate basic science and clinical faculty members differently when they teach side by side.

Methods: Online questionnaires were used to survey 188 students about their faculty facilitators immediately after they completed each of 3 serial PBL cases. Overall satisfaction was measured using a scale of 1-7, and yes/no responses were gathered from closed questions describing faculty performance.

Results: Year 1 students rated basic science and clinical facilitators the same, but Year 2 students rated the clinicians higher overall. Year 1 students rated basic scientists higher in their ability to understand the limits of their own knowledge. Year 2 students rated the clinicians higher in several content expertise-linked areas: preparedness, promotion of in-depth understanding, and ability to focus the group, and down-rated the basic scientists for demonstrating overspecialised knowledge. Students' overall ratings of individual faculty best correlated with the qualities of stimulation, focus and preparedness, but not with overspecialisation, excessive interjection of the faculty member's own opinions, and encouragement of psychosocial issue discussion.

Conclusion: When taught by paired basic science and clinical PBL facilitators, students in Year 1 rated basic science and clinical PBL faculty equally, while Year 2 students rated clinicians more highly overall. The Year 2 difference may be explained by perceived differences in content expertise.

Keywords: medical; undergraduate/*methods; problem-based learning/methods; science/education; questionnaires; computer-based instruction; students/education; faculty/*standards; education

Problem-based learning (PBL) is now used in many medical schools for a variety of educational goals. These include encouraging lifelong learning, building team communication, enhancing problem-solving ability and hypothesis generation, and forming connections between the basic, clinical and social sciences integral to medicine. The proliferation of this method has led to many variations in its execution. Cases vary widely: they may occupy 1 to 4 sessions, emphasise 1 or many clinical problems, be developed by centralised PBL experts or by non-expert course directors, and occur weekly to quarterly. Some cases now include computerised clinical decision trees, videos or standardised patient interviews. Clearly, the method has expanded in both its definition and objectives from the original reports, leading some to criticise the use of the term 'PBL' at many schools and to urge better definition of its use and specific goals.[1]

Despite these differences, most institutions using PBL do hold to common precepts, including student‐driven sessions with faculty acting as facilitators rather than lecturers or group leaders. The best selection and training methods to support faculty in this non‐traditional teaching role have been controversial, and selection of faculty remains highly variable from school to school.

A thorough and lively debate has occurred regarding the use of expert versus non-expert facilitators, as recently reviewed.[3] Most studies have distinguished 'expert faculty' (faculty with content expertise in at least 1 aspect of the case) from 'non-expert faculty' (faculty without such expertise). Even this distinction is problematic, firstly because a given case may contain multiple areas of inquiry, and secondly because 'expertise' is a relative term: for example, all faculty may be more expert than all Year 1 students in a given area of medicine. This debate remains unsettled, and viewpoints often depend on the specific purpose of PBL in a given curriculum or the outcome measured (specific knowledge gained versus attitudinal or process-oriented outcomes).[4]

Our institution began using PBL within a lecture‐based curriculum in 1998, and fully implemented it in 2001–02. Unlike some schools, we chose to use it as an integrating supplement to a traditional curriculum rather than as the primary learning modality of the students. The standard curriculum at our university emphasises normal physiology, anatomy and biochemistry in Year 1, followed by pathophysiology and pharmacology in Year 2. The PBL cases are designed to complement and integrate traditional courses, such as Year 1 physiology, anatomy, neurosciences and biochemistry and Year 2 oncology, orthopaedics and microbiology.

One of our primary goals for these cases was to emphasise the interface between 'basic' and 'clinical' sciences for Year 1 and 2 students, for whom this concept was not obvious. To do this, we used paired faculty facilitators in all Year 1 and 2 PBL groups. One faculty member was a 'basic scientist', defined here as PhD faculty or a pathologist, and one a 'clinician', defined as all remaining MD faculty. No attempt was made to match faculty to case by area of content expertise, or to avoid such matches. All Year 1 and 2 students (92–94 per year) were enrolled. In 2001–02, students completed 2 (Year 2) or 3 (Year 1) PBL cases, each facilitated by a different set of paired faculty members. Group size varied from 7 to 9 students. Cases involved 3–4 sessions, each of which was 1.5–2 hours in length. Each case unfolded in 7–8 total handouts over the 3–4 sessions. Each session began with a review of the prior material and a discussion of the 'learning issues' generated during the previous session. Cases were written to reinforce or introduce new concepts tied to the ongoing curriculum that the students were taking, and integrated basic sciences, clinical sciences, behavioural sciences, ethics, economics and population medicine.

Given the unusual paired facilitator design of our PBL groups, we analysed student evaluations of their faculty in order to answer the following questions:

• Do students evaluate basic science and clinical PBL faculty differently when they teach side by side?
• Do any perceived differences change as the students progress in their clinical training?
• Do overall student evaluations of faculty correlate with specific characteristics of the teachers?

Methods

Study design and faculty selection

This study was carried out during the 2001–02 academic year and received an exemption from individual student informed consent from the Human Studies IRB Committee. The exemption was granted because the study performed a retrospective analysis of data from standardised anonymous course evaluations that all students complete each quarter as a routine part of our course evaluation process.

All facilitators were regular faculty members (no student tutors were used). They were trained in PBL technique in a single 3-hour session involving mock sessions with undergraduate students, followed by direct performance feedback. The principles of PBL teaching were modified from materials used in PBL training workshops at the University of New Mexico. After initial training, these principles were reviewed with faculty before each session began.

All faculty participants facilitated a single case during the year, except for 1 basic scientist, who facilitated 2 cases; his reported evaluation scores were the mean across both cases. No group had the same pair of faculty for more than a single case.

Survey instrument

Students were required to complete a web‐based evaluation at the end of each PBL case in order to receive a passing grade. This requirement is used routinely for all courses at our university. Students based their responses on the 3–4 contacts with the faculty member that occurred during each case. They were first asked for their overall evaluation of each faculty member: faculty were evaluated on a 1–7 Likert scale for overall performance, with labelled descriptors at each point of the scale ('outstanding' to 'inadequate'). Students were then asked a series of yes/no questions about each faculty member based on the specific expectations of faculty that had been published previously and described to both faculty and students during faculty training and student orientation (Table 1). The questions included both positive and negative wording, but no attempt was made to equalise these. Whilst all the questions reflected expectations for good faculty PBL technique and skill in the PBL process, questions 1–6 also evaluated faculty qualities related to their expertise or familiarity with specific content.

Table 1. Students' responses to yes/no questions about basic science versus clinical PBL faculty facilitators. Percentages of student evaluations indicating 'yes' to each item

<table>
<thead>
<tr><th colspan="2" rowspan="2">Question</th><th colspan="2">Year 1</th><th colspan="2">Year 2</th></tr>
<tr><th>Basic scientists (n = 564)</th><th>Clinicians (n = 564)</th><th>Basic scientists (n = 384)</th><th>Clinicians (n = 384)</th></tr>
</thead>
<tbody>
<tr><td>1</td><td>Came well prepared for the discussions</td><td>75.9</td><td>83.8</td><td>72.3*</td><td>88.4</td></tr>
<tr><td>2</td><td>Promoted in-depth understanding of specific topics</td><td>53.2</td><td>62.8</td><td>48.7*</td><td>68.0</td></tr>
<tr><td>3</td><td>Assisted in keeping the discussion properly focused</td><td>71.3</td><td>79.5</td><td>54.0*</td><td>75.6</td></tr>
<tr><td>4</td><td>Was overly specialised in his or her knowledge and approach</td><td>9.5</td><td>9.5</td><td>15.6*</td><td>6.3</td></tr>
<tr><td>5</td><td>Was honest about the limitations of his or her knowledge</td><td>77.0*</td><td>60.4</td><td>72.6</td><td>65.2</td></tr>
<tr><td>6</td><td>Encouraged discussion of psychosocial/humanistic aspects of case</td><td>62.2</td><td>68.7</td><td>57.6</td><td>54.8</td></tr>
<tr><td>7</td><td>Encouraged student participation through his or her questions</td><td>82.8</td><td>87.8</td><td>78.4</td><td>83.8</td></tr>
<tr><td>8</td><td>Stimulated group-directed discussion and did not dominate</td><td>73.0</td><td>69.4</td><td>63.5</td><td>68.1</td></tr>
<tr><td>9</td><td>Interjected his or her opinion more than is ideal</td><td>12.5</td><td>17.5</td><td>14.2</td><td>17.6</td></tr>
<tr><td>10</td><td>Listened attentively and followed discussion closely</td><td>88.1</td><td>87.2</td><td>79.4</td><td>80.0</td></tr>
<tr><td>11</td><td>Was patient, sensitive, and not excessively critical</td><td>85.1</td><td>79.0</td><td>75.7</td><td>67.9</td></tr>
<tr><td>12</td><td>Promoted open discussion of differing opinions</td><td>62.2</td><td>68.0</td><td>58.1</td><td>63.7</td></tr>
<tr><td>13</td><td>Worked collaboratively with his or her facilitator</td><td>54.4</td><td>58.8</td><td>58.9</td><td>60.7</td></tr>
<tr><td>14</td><td>Provided constructive comments on individual student performance</td><td>42.0</td><td>43.8</td><td>46.8</td><td>46.8</td></tr>
<tr><td>15</td><td>Provided constructive comments on the group's progress</td><td>65.3</td><td>66.1</td><td>71.1</td><td>71.9</td></tr>
</tbody>
</table>

* P < 0.05 versus clinicians in the same year.

n = number of evaluations.

Data analysis

Faculty demographic characteristics were compared using contingency tables and Fisher's exact test. Student survey data comparing basic scientists versus clinicians were analysed as follows. Comparisons between paired basic science and clinical faculty in the same year were made by paired t‐tests. For the overall evaluations, mean numerical ratings (1–7 scale) were compared. For yes/no questions evaluating faculty characteristics, the mean percentages of 'yes' responses were compared for each question. Statistical significance was reported when P < 0.05. Data are reported as means ± 95% confidence intervals.
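As an illustration of the analysis just described, the following Python sketch mirrors the paired comparison and the Fisher's exact test on faculty demographics. It is not the authors' workflow (they used GraphPad Prism); the rating arrays are hypothetical, and only the demographic counts are taken from Table 2.

```python
# Minimal sketch of the comparisons described above (illustrative only).
# Each index i is one hypothetical student evaluation that pairs the basic
# science and the clinical facilitator of the same group and case.
import numpy as np
from scipy import stats

basic_ratings = np.array([6, 5, 7, 6, 5, 6, 7, 4], dtype=float)     # hypothetical 1-7 overall ratings
clinical_ratings = np.array([6, 6, 7, 7, 5, 6, 7, 5], dtype=float)  # hypothetical paired ratings

# Paired t-test on the overall 1-7 ratings (basic science vs clinical facilitator)
t_stat, p_value = stats.ttest_rel(basic_ratings, clinical_ratings)

# Means +/- 95% confidence intervals, the format used in the paper
for name, x in [("basic science", basic_ratings), ("clinical", clinical_ratings)]:
    half_width = stats.t.ppf(0.975, df=len(x) - 1) * stats.sem(x)
    print(f"{name}: {x.mean():.2f} +/- {half_width:.2f}")

print(f"paired t-test: t = {t_stat:.2f}, P = {p_value:.3f}")  # significant if P < 0.05

# Faculty demographics were compared with Fisher's exact test on contingency
# tables; e.g. women among Year 1 basic scientists (12/37) vs clinicians (7/37),
# counts taken from Table 2.
odds_ratio, p_fisher = stats.fisher_exact([[12, 37 - 12], [7, 37 - 7]])
print(f"Fisher's exact test: P = {p_fisher:.2f}")
```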

For correlation analysis, the mean percentage of 'yes' responses on each question for each faculty member was correlated with that faculty member's overall numerical quality rating. After determining that the faculty 'yes' percentage distributions were Gaussian, correlation analysis was performed and correlations reported as r² values, with significance reported for P < 0.05. All calculations were performed using GraphPad Prism software (GraphPad, Inc., San Diego, CA, USA).
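A corresponding sketch of the correlation step, again purely illustrative: the per-faculty values below are hypothetical, and the normality check shown (Shapiro-Wilk) is an assumption, since the paper does not name the specific test used.

```python
# Illustrative sketch of the correlation analysis; each point is one faculty member.
import numpy as np
from scipy import stats

# Percentage of 'yes' responses each faculty member received on one question (hypothetical)
pct_yes = np.array([75.0, 82.0, 60.0, 90.0, 70.0, 85.0, 55.0, 78.0])
# The same faculty members' mean overall rating on the 1-7 scale (hypothetical)
overall = np.array([5.8, 6.1, 5.2, 6.5, 5.6, 6.2, 4.9, 5.9])

# The paper states the 'yes' percentage distributions were checked for normality
# before correlating; Shapiro-Wilk is one common way to do this.
shapiro_stat, shapiro_p = stats.shapiro(pct_yes)

# Pearson correlation, reported as r squared with significance at P < 0.05
r, p = stats.pearsonr(pct_yes, overall)
print(f"normality check: P = {shapiro_p:.2f}")
print(f"r^2 = {r ** 2:.2f}, P = {p:.4f}")
```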

Results

A total of 939/948 surveys were completed online, giving a response rate of 99%. Year 1 comprised 94 students, each with 3 cases and 2 faculty per case, making a total of 564 evaluations. Year 2 comprised 96 students, each with 2 cases and 2 faculty per case, giving a total of 384 evaluations. Year 1 students evaluated 3 sets of facilitators (faculty n = 74 for all students), while Year 2 students evaluated 2 sets (faculty n = 46). Faculty characteristics are shown in Table 2. There were no statistically significant differences between any group in the characteristics listed, although there was a trend towards more women in each basic science group (reflecting the demographics of the faculty as a whole).
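For reference, the evaluation counts above follow directly from the cohort sizes; a short arithmetic check:

```python
# Arithmetic check of the evaluation counts reported above
year1 = 94 * 3 * 2        # 94 students x 3 cases x 2 facilitators per case = 564
year2 = 96 * 2 * 2        # 96 students x 2 cases x 2 facilitators per case = 384
total = year1 + year2     # 948 possible evaluations
response_rate = 939 / total
print(year1, year2, total, f"{response_rate:.1%}")  # 564 384 948 99.1%
```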

Table 2. Characteristics of the 60 PBL faculty teaching in the 2001–02 academic year

<table>
<thead>
<tr><th rowspan="2"></th><th colspan="2">Year 1</th><th colspan="2">Year 2</th></tr>
<tr><th>Basic scientists</th><th>Clinicians</th><th>Basic scientists</th><th>Clinicians</th></tr>
</thead>
<tbody>
<tr><td>Previous PBL experience</td><td>7/37 (19%)</td><td>9/37 (24%)</td><td>5/23 (22%)</td><td>4/23 (17%)</td></tr>
<tr><td>Women</td><td>12/37 (32%)</td><td>7/37 (19%)</td><td>6/23 (26%)</td><td>4/23 (17%)</td></tr>
<tr><td>MD pathologists*</td><td>4/37 (11%)</td><td></td><td>1/23 (4%)</td><td></td></tr>
</tbody>
</table>

All comparisons between basic science and clinical faculty in a given year were non-significant (P > 0.05).

* As a percentage of basic scientists.

After completing the course, all faculty were rated by students using an online evaluation system. For overall ratings of PBL faculty quality, there was no statistical difference in Year 1 ratings of basic science versus clinical faculty, while Year 2 students rated the clinical faculty higher (Year 1: basic science 5.89 ± 0.10, clinical 6.04 ± 0.11; Year 2: basic science 5.69 ± 0.16, clinical 6.06 ± 0.14; P = 0.024) (Table 3).

Table 3. Students' overall rating of PBL faculty

<table>
<thead>
<tr><th></th><th>Year 1 students (n = 564 evaluations)</th><th>Year 2 students (n = 384 evaluations)</th></tr>
</thead>
<tbody>
<tr><td>Basic science faculty</td><td>5.89 ± 0.10 (range 4.17–6.63)</td><td>5.69 ± 0.16 (range 3.71–6.86)</td></tr>
<tr><td>Clinical faculty</td><td>6.04 ± 0.11 (range 4.13–6.00)</td><td>6.06 ± 0.14* (range 4.67–6.86)</td></tr>
</tbody>
</table>

* P < 0.05 versus basic science faculty in Year 2.

To explore this difference between Year 1 and Year 2 student perceptions further, we analysed their responses to a set of 15 yes/no questions about the faculty (Table 1). These questions were taken from course guidelines for PBL faculty given to the students and faculty facilitators before the course began. All questions evaluated expected PBL facilitation technique, but questions 1–6 also incorporated elements of faculty content familiarity and expertise. In Year 1 only 1 of the 15 questions showed a statistical difference between paired basic science and clinical faculty, mirroring the lack of difference in the overall ratings ('Was honest about the limitations of his or her knowledge': 77.0% yes for basic science faculty, 60.4% yes for clinical; P < 0.05). In contrast, Year 2 students responded differently about basic science faculty versus clinical faculty on 4 items (Table 1, questions 1–4). The Year 2 clinical faculty were seen as being better prepared and as promoting both more focused discussion and in-depth understanding. Basic science faculty were also perceived as being more 'overspecialised' in their knowledge. Note that the Year 2 students found significant differences between basic science and clinical faculty in 4 of the 6 questions that assessed faculty content familiarity and expertise. In neither year did students note a difference between basic science and clinical faculty in other group process skills (listening, stimulation, patience, feedback).

In order to assess which of the specific evaluated faculty characteristics correlated most strongly with the students' overall ratings of the facilitators, correlation analysis was performed (Table 4). In this analysis, a high r² indicated that a specific question response from students correlated well with the students' overall evaluation of that teacher. Student ratings in both years correlated well with questions assessing faculty preparation, stimulation/encouragement and promotion of group focus. Year 1 ratings also correlated well with promotion of 'in-depth understanding', while Year 2 student ratings correlated well with 'Promoted open discussion of differing opinions'. In contrast, responses to 'Interjected his or her opinion more than is ideal' and 'Was overly specialised in his or her knowledge and approach' were not statistically related to overall perceptions of faculty quality, and 'Encouraged discussion of psychosocial/humanistic aspects' correlated poorly with overall faculty ratings.

Table 4. Correlation of students' overall rating of a PBL faculty member with specific faculty characteristics (percentage responding 'yes' to question)

<table>
<thead>
<tr><th colspan="2" rowspan="2">Question</th><th colspan="2">r²</th></tr>
<tr><th>Year 1 (n = 564)</th><th>Year 2 (n = 384)</th></tr>
</thead>
<tbody>
<tr><td>1</td><td>Came well prepared for the discussions</td><td>0.44*</td><td>0.50*</td></tr>
<tr><td>2</td><td>Promoted in-depth understanding of specific topics</td><td>0.40*</td><td>0.27*</td></tr>
<tr><td>3</td><td>Assisted in keeping the discussion properly focused</td><td>0.34*</td><td>0.33*</td></tr>
<tr><td>4</td><td>Was <i>overly</i> specialised in his or her knowledge and approach</td><td>0.01</td><td>0.04</td></tr>
<tr><td>5</td><td>Was honest about the limitations of his or her knowledge</td><td>0.07*</td><td>0.23*</td></tr>
<tr><td>6</td><td>Encouraged discussion of psychosocial/humanistic aspects of case</td><td>0.10*</td><td>0.05</td></tr>
<tr><td>7</td><td>Encouraged student participation through his or her questions</td><td>0.51*</td><td>0.34*</td></tr>
<tr><td>8</td><td>Stimulated group-directed discussion and did not dominate</td><td>0.32*</td><td>0.34*</td></tr>
<tr><td>9</td><td>Interjected his or her opinion more than is ideal</td><td>0.02</td><td>0.03</td></tr>
<tr><td>10</td><td>Listened attentively and followed discussion closely</td><td>0.09*</td><td>0.29*</td></tr>
<tr><td>11</td><td>Was patient, sensitive and not excessively critical</td><td>0.21*</td><td>0.16*</td></tr>
<tr><td>12</td><td>Promoted open discussion of differing opinions</td><td>0.25*</td><td>0.40*</td></tr>
<tr><td>13</td><td>Worked collaboratively with his or her facilitator</td><td>0.33*</td><td>0.25*</td></tr>
<tr><td>14</td><td>Provided constructive comments on individual student performance</td><td>0.26*</td><td>0.22*</td></tr>
<tr><td>15</td><td>Provided constructive comments on the group's progress</td><td>0.33*</td><td>0.22*</td></tr>
</tbody>
</table>

* Coefficients significant at P < 0.05. Correlations were positive (Pearson r > 0) for all questions except 4 and 9, which showed negative correlations.

n = number of evaluations.

Discussion

The PBL cases at our university use paired faculty facilitators, 1 from basic science and 1 from clinical medicine. We surveyed all Year 1 and 2 students after each PBL case in order to assess their opinions of their faculty facilitators, asking them a variety of questions about faculty skills and characteristics. The analysis of the student feedback was intended to assess the students' relative perceptions of basic science and clinical facilitators when they taught side by side. The results show that, when presented with paired PBL facilitators (1 clinical, 1 basic science), Year 1 students rated them equally, while Year 2 students rated clinical faculty more highly (Table 3). When analysed by specific faculty qualities, Year 2 students rated clinicians more highly for questions related to preparation, breadth of knowledge, and facilitation of group focus and in‐depth knowledge (Table 1). All these questions contained some evaluation of faculty content expertise. Year 1 students noted no such differences. In neither year were other faculty group process skills seen as differing between the basic science and clinical facilitators. These data could be interpreted as Year 2 students developing a preference for faculty (regardless of specialty) who were more skilled at content delivery than at classical PBL facilitation. However, this hypothesis is not borne out by the correlation analysis, which showed that overall student ratings of faculty correlated with both content‐related (questions 1–6) and non content‐related (questions 7–15) PBL group dynamic skills (Table 4).

Analysis of findings

The difference in ratings of basic science versus clinical faculty by Year 2 students was not large in magnitude, but was statistically significant. Given the great range of styles and expertise among individual faculty members, the difference is likely to be a real one. Why did Year 2 students rate their clinical facilitators more highly than their basic scientist facilitators? Possible hypotheses are that:

1. clinicians were more skilled at PBL facilitation;
2. clinicians were more comfortable facilitating discussion of increasingly complex clinical issues in Year 2 cases;
3. students rated clinicians higher in Year 2 because of familiarity, as clinicians formed the bulk of Year 2 faculty;
4. by Year 2, students were uninterested in focusing on basic science topics and were more enthusiastic about clinical ones.

Were the clinicians more proficient in PBL facilitation technique? The data in Table 1 suggest that most aspects of PBL facilitation technique were similar in both groups of faculty. Year 2 students noted, however, that clinical facilitators were superior in terms of preparation and promotion of group focus and in-depth understanding. They also felt that basic science facilitators were overly specialised in their knowledge. Thus, some of the observed differences may be explained by the students' perception of the differing knowledge bases of the basic science versus clinical faculty, rather than by differences in PBL facilitation technique per se.

Were the basic science faculty actually more uncomfortable teaching in Year 2, where the curriculum and student interest become more clinical? This has been suggested as a problem in a previous study[9] in which basic science and clinical faculty attitudes towards PBL teaching were reported. While this theme emerged subjectively in several basic scientists' comments after PBL sessions, we did not test it directly. Year 2 basic science facilitators did not differ from clinicians in their responses to questions addressing their enjoyment, willingness to return as teachers, or the value of the experience for students (data not shown). These responses do not support a difference in faculty comfort level in teaching.

Did Year 2 students rate clinical faculty members more highly because of their familiarity with them? According to this hypothesis, students might rate more positively the teachers familiar to them from the rest of their curriculum: basic scientists in Year 1, clinicians in Year 2. We did not test this directly, but several pieces of data weigh against it. Year 1 students did not rate their basic science faculty more highly, despite limited contact with clinicians. In addition, there was no difference in either year in student responses addressing facilitator openness, encouragement or patience. These findings do not directly support the notion of familiar faculty receiving higher ratings.

Did the ratings difference in Year 2 reflect less student interest in focusing on basic science topics versus clinical ones? We have no data to support or reject this hypothesis from our evaluations. We also do not know whether basic science facilitators 'favoured' basic science topics, and clinical facilitators clinical ones. It remains a possible explanation for the differences observed, and is one with considerable subjective support from many experienced medical school faculty. Parenthetically, given pre-clinical students' well documented scepticism of the value of basic sciences in traditional curricula,[10] we were encouraged by the lack of student preference for clinical facilitators in Year 1, and the relatively small, if significant, difference seen in Year 2.

Limitations of the study

This study took place over 1 academic year, so the Year 1 and Year 2 students compared comprised different groups of students. However, admissions policy and leadership were the same in the 2 years, as were the male/female ratio and ethnic mix (data not shown). In addition, student evaluations of the faculty and course have remained stable over the course's history since 2001 (data not shown). Differences between the Year 1 and Year 2 groups could conceivably be explained by other factors, such as facilitator experience or self-selection of more adept facilitators into Year 2. However, none of the measured faculty characteristics (percentage of women, percentage of experienced facilitators) differed statistically among groups (Table 2). This course was new in the 2001–02 academic year, and these faculty should mostly be considered novices regarding PBL experience. The conclusions reached may be different after several years of faculty experience. However, the percentage of novice faculty did not differ between basic science and clinical facilitators (Table 2).

As each faculty member facilitated only 1 case, it could be argued that the students had inadequate opportunity to fully evaluate faculty teaching skills and attitudes. However, in this study this critique would have applied equally to both basic science and clinical faculty and therefore should not have biased the results.

These data were gathered from a mandatory course evaluation questionnaire, a process that could be criticised on methodological or ethical grounds. Mandatory response surveys avoid the biases associated with volunteer participation and non-participation (was there a reason some completed the survey and some did not?), as well as the small sample sizes typical of voluntary surveys (response rates of 30–40% were typical at our university prior to mandatory completion). However, mandatory completion may cause other biases. Positive response bias (such as belief in individual reward or retribution causing more positive responses) was avoided in this study by the guarantee of anonymity, repeatedly discussed with students and maintained in all course evaluations. Negative response bias (resentment at having to fill out surveys, consciously or unconsciously causing more negative responses) is possible in these evaluations. The likelihood of this having biased our results is lessened by the use of routine evaluation questionnaires which were part of the normal course evaluation process, the wide spread of student responses (range 1–7 on a 7-point scale for both basic science and clinical faculty in both Year 1 and Year 2 groups), the similarly wide range of faculty scores in all groups (Table 3), and the 99% response rate and large number of evaluations, which may even out any positive or negative response bias. Regarding concerns about the ethics of mandatory questionnaires, we note that the local IRB did not require individual informed consent from students, because the data were gathered from anonymous standardised course evaluation forms, and the IRB felt that the study had no impact on student evaluation or performance.

It could be argued that MD pathologists should not be considered as 'basic scientists', as they have considerable clinical experience. Even if there is a difference in facilitation style, there was no statistical difference in the percentage of pathologists among the basic science pool between Years 1 and 2 (4/37 in Year 1, 1/23 in Year 2; Table 2). In addition, after observing 10–15 pathologists teaching, we subjectively found that pathologists' facilitation style and contributions were more similar to those of basic scientists than to those of most clinicians. This assertion is based on greater attentiveness to mechanism and scientific principles and less use of clinical vignettes.

Comparison with published literature

To our knowledge, the only published study using paired basic science and clinical facilitators was reported by Henderson et al., based on a pilot study at McGill University in 2000.[13] In this very small study (2 PBL groups), facilitators were paired in 2 ways: group A paired a basic scientist with 'some' PBL experience with a 'junior' clinician with no PBL experience, and group B paired a 'senior' clinician with 'extensive' PBL tutoring experience with a basic scientist with no PBL experience. Student evaluations showed that in group A, in which the basic science tutor had PBL experience, students preferred the paired facilitator format over conventional single-faculty tutoring (75%), reported improved understanding (80%), and felt that the paired facilitators exhibited complementary knowledge (84%). In group B, where the basic science facilitator was inexperienced in PBL, students were less sanguine: only 50% preferred the co-tutors, none felt there was greater understanding, and none thought the tutors exhibited complementary knowledge. This small study suggested the possibilities of co-tutoring as a way of enhancing student learning in a PBL format. It did not address student evaluations of the basic science or clinical facilitators. The authors suggested a potential advantage of this model: basic science tutors often felt uncomfortable tutoring alone as PBL cases became more clinical in later stages of medical education, and pairing them with a clinician might have improved their contribution and comfort level.[9] Of course, in an integrated PBL case, all facilitators will likely fall outside their 'comfort zone' at some time, and this is probably healthy. While we did not survey this 'comfort' issue directly, we did not detect any significant difference between Year 2 basic science and clinical faculty in enthusiasm for the cases or overall teaching experience, despite likely differences in comfort with the material (data not shown).

Conclusions and future directions

This study demonstrates that Year 1 medical students rated basic science and clinical PBL facilitators equally when they taught side by side, while Year 2 students rated clinical faculty more highly. The reasons for this are likely to be many, and might be better addressed with further qualitative research assessing student attitudes towards faculty. As many medical schools seek to return to basic science in later stages of the curriculum,[14] careful attention will need to be paid to student attitudes towards basic science faculty, and to ensuring that the context for basic science teaching is carefully selected for the level of student training. The paired PBL facilitator model used at our university has this potential: at its best, it demonstrates the collaboration of basic science and clinical faculty in teaching and problem solving. Students will clearly accept and even relish the participation of basic scientists in their clinical education,[15] but this acceptance will be enhanced by careful design of the educational environment to emphasise clinical relevance.

Contributors: FTS contributed to study design, curricular design and writing this paper. CMB contributed to curricular design and writing this paper. RG-E contributed to curricular design. VGK contributed to curricular design and study design.

Acknowledgements: the authors thank Minh Chau Barr and John Drummer for their tireless support of this course and collection of data. They also thank the faculty and student participants in the course for their enthusiasm and suggestions.

Funding: this study received no external funding.

Conflicts of interest: none.

Ethical approval: ethical approval was obtained via a waiver from the UCD Human Subjects Committee.

Overview

What is already known on this subject

Problem‐based learning is used to enhance students' active learning and problem solving ability.

Controversy exists over whether facilitator expertise is an advantage or a disadvantage.

Most PBL schools use single basic science and clinical facilitators, often randomly assigned.

What this study adds

Paired basic science and clinical facilitators effectively role model integrative learning.

Year 1 students rate basic science and clinical faculty equally.

Year 2 students rate clinical faculty higher.

Suggestions for further research

Do paired facilitators enhance student acceptance of basic science faculty after Year 1?

References

1 Maudsley G. Do we all mean the same thing by 'problem-based learning'? A review of the concepts and a formulation of the ground rules. Acad Med 1999;74(2):178-85.

2 Barrows HS. A taxonomy of problem-based learning methods. Med Educ 1986;20(6):481-6.

3 Dolmans DH, Gijselaers WH, Moust JH, De Grave WS, Wolfhagen IH, Van Der Vleuten CP. Trends in research on the tutor in problem-based learning: conclusions and implications for educational practice and research. Med Teach 2002;24(2):173-80. DOI: 10.1080/01421590220125277

4 Eagle CJ, Harasym PH, Mandin H. Effects of tutors with case expertise on problem-based learning issues. Acad Med 1992;67(7):465-9.

5 Gilkison A. Techniques used by 'expert' and 'non-expert' tutors to facilitate problem-based learning tutorials in an undergraduate medical curriculum. Med Educ 2003;37(1):6-14. DOI: 10.1046/j.1365-2923.2003.01406.x

6 Kwizera EN, Dambisya YM, Aguirre JH. Does tutor subject matter expertise influence student achievement in the problem-based learning curriculum at UNITRA Medical School? S Afr Med J 2001;91(6):514-6.

7 Schmidt HG, Van der Arend A, Moust JH, Kokx I, Boon L. Influence of tutors' subject matter expertise on student effort and achievement in problem-based learning. Acad Med 1993;68(10):784-91.

8 Silver M, Wilkerson LA. Effects of tutors with subject expertise on the problem-based tutorial process. Acad Med 1991;66(5):298-300.

9 Maudsley G. The limits of tutors' comfort zones with four integrated knowledge themes in a problem-based undergraduate medical curriculum (interview study). Med Educ 2003;37(5):417-23. DOI: 10.1046/j.1365-2923.2003.01497.x

10 Custers EJFM, Ten Cate O. Medical students' attitudes towards and perception of the basic sciences: a comparison between students in the old and the new curriculum at the University Medical Centre Utrecht, the Netherlands. Med Educ 2002;36(12):1142-50.

11 West M, Mennin SP, Kaufman A, Galey W. Medical students' attitudes toward basic sciences: influence of a primary care curriculum. Med Educ 1982;16(4):188-91.

12 Kaufman DM, Mann KV. Basic sciences in problem-based learning and conventional curricula: students' attitudes. Med Educ 1997;31(3):177-80.

13 Henderson JE, Conochie LB, Steinert Y. Co-tutors in the basis of medicine. Clin Invest Med 2000;23(1):86-9.

14 Patel VL, Dauphinee WD. Return to basic sciences after clinical experience in undergraduate medical training. Med Educ 1984;18(4):244-8.

15 Fenton C, Loeser H, Cooke M. Intersessions: covering the bases in the clinical year. Acad Med 2002;77(11):1159. DOI: 10.1097/00001888-200211000-00024

By Frazier T Stevenson; Connie M Bowe; Regina Gandour‐Edwards and Vijaya G Kumari
