Canadian Journal of Higher Education / Revue canadienne d'enseignement supérieur
Volume 42, No. 1, 2012, pages 98-111

Putting Research Into Practice: Pedagogy Development Workshops Change the Teaching Philosophy of Graduate Students

Peter J. T. White, Michigan State University
David Syncox, McGill University
Audrey Heppleston, Natural Resources Canada
Siara Isaac, Université Claude Bernard Lyon 1
Brian Alters, Chapman University

ABSTRACT

Teaching competence is an important skill for graduate students to acquire and is often considered a precursor to an academic career. In this study, we evaluated the effects of a multi-day teaching workshop on graduate teaching philosophies by surveying 200 graduate students, 79 of whom had taken the workshops and 121 who had not. We found no difference between groups (workshop attendees versus non-attendees) in their beliefs that (a) it is important to focus on in-depth learning of core concepts when teaching and (b) "memorization" is a poor learning strategy for students. On average, however, respondents who had taken the workshop allocated more in-class time for student-to-student discussions (interactive engagement) and placed less emphasis on lecturing. These results suggest that graduate students are generally aware of the importance of conceptual learning, but workshop attendees have clearer ideas on how to teach for effective learning.

RÉSUMÉ

La capacité d'enseigner est une compétence importante pour les étudiants en formation doctorale et est souvent considérée comme un attribut nécessaire à la poursuite d'une carrière académique. Lors de cette étude, nous avons évalué les effets d'un atelier de développement pédagogique de plusieurs jours sur la philosophie d'enseignement des étudiants en thèse en interrogeant 200 sujets - 79 qui avaient assisté aux ateliers et 121 qui n'y avaient pas participé. Nous n'avons trouvé aucune différence entre les groupes (ceux qui ont participé à l'atelier par rapport aux non-participants) dans leur croyance que (a) lors de l'enseignement, il est important de se concentrer sur l'apprentissage en profondeur des concepts principaux et (b) la «mémorisation» est une mauvaise stratégie d'apprentissage pour les étudiants. Cependant, en moyenne, les répondants qui avaient participé à l'atelier ont consacré plus de temps à des discussions entre étudiants (engagement interactif) et ont accordé moins d'importance aux cours magistraux. Ces résultats suggèrent que les étudiants de niveau doctoral sont généralement conscients de l'importance de l'apprentissage conceptuel, mais que ceux qui ont participé aux ateliers ont des idées plus claires pour faciliter un tel apprentissage.

Over the last decade, North American universities and colleges have taken a new direction in graduate education by implementing professional development programs for their graduate students. In 2008, the Canadian Association for Graduate Studies (CAGS), the Society for Teaching and Learning in Higher Education, and the Canadian federal government's Tri-Council funding agencies supported this initiative by starting a discussion of priorities, necessary skills, and actions to move the dialogue forward.
This group promotes the development of programs for graduate students to ensure that, by the end of their degree, they acquire the skills to perform well in what has become known as the knowledge economy (CAGS, 2008). In a 2010 benchmarking activity (Jenkins, 2010), more than 30 institutions were surveyed to assess the support available for developing such skills. Each of the schools surveyed had initiatives that ran parallel to the academic program and aimed at improving the graduate student skills identified by CAGS. Many of the institutions offered activities to help graduate students build the teaching skills enumerated as the characteristics of a Highly Qualified Person (CAGS, 2008).

Teaching competence is increasingly recognized in academics as a complement to research skills (Brew, 2003; Prince, Felder, & Brent, 2007). The definition of teaching competence can vary from one discipline to another, but it broadly involves the facilitation of collaborative and cooperative learning among students (McKeachie, 2007). For newly graduated PhDs, an expectation of teaching competence generally emanates from teaching experiences completed during their degree. Graduate students become involved as teaching assistants, assistant/guest lecturers, lab instructors, teaching fellows, and even full course instructors. As graduate programs are often a precursor to academic careers where a high level of teaching competency is expected, graduate students who develop these skills while in their graduate program will be better prepared for a professorship when they graduate.

The characteristics of effective in-class teaching have been well established and go far beyond good presentation skills (Armbruster, Patel, Johnson, & Weiss, 2009; Crouch & Mazur, 2001; Smith et al., 2009). Graduate students typically develop oration skills for presentations at conferences, but these skills are different from good teaching (Mazur, 2009; McKeachie, 2007). Among other features, good teaching promotes peer-learning opportunities (Crouch & Mazur, 2001) and in-class student-to-student (peer) interaction (McKeachie & Svinicki, 2010). Teaching that incorporates these features has a better chance of achieving the desired learning outcomes (Davies, 2002; Ernst & Colthorpe, 2007), and research shows that peer interaction stimulates high-level cognitive activity leading to higher learning gains (Cohen, 1994; O'Donnell & King, 1999; Webb & Palincsar, 1996). In a meta-analysis comparing interactive engagement pedagogies to traditional lecture-only pedagogies, Hake (1998) showed that the former consistently outperformed the latter as reflected in students' pre- to post-course knowledge evaluations.

There are now three popular approaches pursued, often concurrently, in the development of graduate teaching skills at Canadian universities: (a) independent presentations and short workshops, (b) credit graduate courses, and (c) certificate programs in which participants engage in a substantial number of the aforementioned activities in addition to evaluated activities such as reflective writing, literature reviews, and teaching. Examples of these programs can be found at Queen's University (Queen's CTL, 2010), the University of British Columbia (UBC, 2010), and the University of Western Ontario (UWO, 2010).
It appears that the greatest number of graduate students participate in the first of these popular approaches (independent presentations and short workshops). Although this approach is well intentioned, it is unclear how effective short interventions are in helping graduate students develop effective teaching pedagogies. Moreover, similar short workshop formats have not necessarily proven effective for professors (McAlpine & Winer, 2002). A fourth and less common approach is that of multi-day intensive workshops. This approach, demonstrated to be effective for professors (Saroyan & Amundsen, 2004), is taken by the Faculty of Science at McGill University (Harris & McEwen, 2009; T-PULSE, 2010), the Faculty of Agriculture at McGill (www.mcgill.ca/miti), and by the Department of Physics and Astronomy at UBC (http://www.physics.ubc.ca/~phas_ta/).

The objective of this paper is to test the effect of multi-day pedagogical development workshops on graduate teaching practice and philosophies at a research-intensive university in eastern Canada. The aim of these workshops is to provide participants with teaching strategies they can immediately apply to their teaching situation, and to facilitate participants' development of their own teaching philosophy by engaging them in activities focused on a constructivist, learning-centred approach. These elements, combined with time for reflection on personal approaches toward teaching, are designed to help with the development of a teaching philosophy.

METHODS

At McGill University, pedagogy development workshops for science graduate students were started in 2003 by the Tomlinson Project in University-Level Science Education (T-PULSE, 2010). Over the past nine years, the workshops have evolved to include modules on learning outcomes, interactive engagement strategies, learner diversity, formative evaluation, and grading. Each multi-day workshop (in its entirety) incorporates approximately 12 hours of training. The curricula for the workshop modules were researched and created by graduate student teaching fellows working for T-PULSE, who at times collaborated with faculty members from McGill's Teaching and Learning Services and Faculty of Education. Currently, the workshops are facilitated by a group of graduate student teaching fellows from scientific disciplines under the supervision and guidance of interim director Dr. Nicholas de Takacsy and the T-PULSE office. (The former director, Dr. Brian Alters, is a co-author on this paper.) Between 2003 and 2011, the workshops were offered more than 30 times and engaged more than 750 graduate student participants. In that time, more than 25 different graduate students have served as teaching fellows on the workshop facilitation team.

We collected two datasets for this study. The first dataset was collected from a group of 40 graduate students at the January 2009 edition of the T-PULSE pedagogy development workshops. We conducted pre- and (immediate) post-workshop surveys to gauge how the teaching philosophies of participants may have changed over the course of the two-day event. These responses will be referred to as pre-workshop and post-workshop responses throughout the rest of this paper. The second dataset was collected in February 2009.
We sent an electronic online survey to approximately 850 graduate students across the departments of the Faculty of Science at McGill University. This pool of students comprised master's and PhD candidates who had or had not taken the T-PULSE pedagogy development workshops. Participation in the survey was anonymous or confidential (depending on whether respondents wanted to be entered into a prize draw). Data from these online surveys will be referred to as YWS-online and NWS-online responses throughout the rest of this paper. YWS stands for Yes WorkShop and includes answers from online respondents who had taken the multi-day T-PULSE workshops. NWS stands for No WorkShop and includes answers from online respondents who had not taken the multi-day T-PULSE workshops, or who had taken only minimal pedagogical training through independent presentations or short workshops; that is, they had participated in the first popular approach identified earlier.

The survey we used in both datasets gathered data on respondent conceptions of good teaching and the value of lecturing as an effective teaching and learning strategy. By using the same survey instrument for our January 2009 and February 2009 surveys, we sought to determine whether the workshops actually caused a change in teaching philosophy, or whether those who attended the workshops were simply predisposed to having interactive teaching philosophies already in place.

In our survey, we asked three demographic questions and six informative teaching-philosophy questions. The demographic questions gathered data on the respondents' teaching development activities and teaching experience, including a self-assessment of teaching abilities. The teaching-philosophy questions gathered data on what the respondents identified as good teaching by asking them to list the qualities of good teaching and to identify the primary responsibilities of instructors. These latter questions were designed to allow us to identify objectivist versus constructivist teaching philosophies among respondents. See Appendix I for the full survey instrument.

We used t tests and MANOVA tests to analyze the survey response data. T tests are appropriate statistical tools for testing whether the means of two sets of responses are different. We used this test for Questions 2 and 3 (separately) to compare responses among workshop participants versus non-workshop participants, scoring the response categories numerically (for example, a = 5, b = 4, c = 3, d = 2, e = 1). A MANOVA (multivariate analysis of variance) is appropriate for testing whether the means of multiple dependent variables vary with respect to a categorical independent variable. In our case, this tool allowed us to test whether the answer profiles from Questions 6, 7, 8, and 9 (ensemble) differed among workshop participants versus non-workshop participants. The advantage of this type of analysis is that it tests for differences in many questions simultaneously, thus reducing the number of statistical tests required. When a significant p value is returned, the means of the four dependent variables are examined to determine which variables are the likely causes of the significant difference. For these tests, we used a similar scoring system (that is, a = 5, b = 4, and so on) as in Questions 2 and 3. Question 9, however, was reverse-scored so that higher values corresponded to more time allotted for peer-to-peer interactions (that is, a = 1, b = 2, c = 3, d = 4, e = 5). Although the categories in Question 9 are not equal-interval categories, we opted to use this scoring so that it would match the scoring used in Questions 6 to 8. Furthermore, numerical categories matching the average minute-values of each category were not optimal because the fifth category had an unspecified upper limit (that is, more than 20 minutes).

To further explore our data, we performed three additional MANOVA tests relating to the answer profiles for Questions 6, 7, 8, and 9. First, we tested for differences between pre-workshop responses and NWS-online responses; that is, neither group had taken the T-PULSE workshops. Second, we tested for differences between post-workshop responses and YWS-online responses; that is, both groups had taken the workshops. Third, among YWS-online responses, we tested whether answer profiles for Questions 6, 7, 8, and 9 differed based on the amount of time that had passed since the workshop was taken; that is, we based our test on the data collected in Question 1(a)(i). To facilitate this analysis, we subdivided YWS-online responses into four groups corresponding to those who took the workshop in the current academic year, one year ago, two years ago, or more than two years ago.

For all of the aforementioned statistical analyses (the two t tests and the five MANOVA tests), we used a Bonferroni-adjusted alpha level of .007 (.05/7 total tests) to determine statistical significance. Statistical analyses were not run on Questions 4 and 5; instead, those results are displayed in table and graph format (respectively) for visual comparison.
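For readers who want to see the shape of this analysis concretely, the sketch below shows how comparisons of this kind could be set up in Python with scipy and statsmodels. It is a hypothetical illustration only, not the analysis code used in this study: the data are synthetic, and the column names (group, q2, q3, q6 to q9) are invented stand-ins for the coded survey responses.

```python
# Hypothetical sketch of the group comparisons described above (not the authors'
# actual analysis code). The data are synthetic stand-ins for the coded responses.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(42)
n_yws, n_nws = 79, 121  # online-survey group sizes reported in the Results

def synthetic_group(label, n):
    """Random Likert codes (1-5) for one group; q9 is assumed already reverse-scored."""
    return pd.DataFrame({
        "group": label,
        "q2": rng.integers(1, 6, n),  # coded teaching-experience category
        "q3": rng.integers(1, 6, n),  # coded self-rated teaching ability
        "q6": rng.integers(1, 6, n),
        "q7": rng.integers(1, 6, n),
        "q8": rng.integers(1, 6, n),
        "q9": rng.integers(1, 6, n),  # higher = more time for peer interaction
    })

df = pd.concat([synthetic_group("YWS", n_yws), synthetic_group("NWS", n_nws)],
               ignore_index=True)
yws, nws = df[df.group == "YWS"], df[df.group == "NWS"]

# One independent-samples t test each for Questions 2 and 3.
for q in ("q2", "q3"):
    t, p = stats.ttest_ind(yws[q], nws[q])
    print(f"{q}: t = {t:.2f}, p = {p:.3f}")

# One MANOVA on the Question 6-9 answer profiles, with group as the factor.
print(MANOVA.from_formula("q6 + q7 + q8 + q9 ~ group", data=df).mv_test())

# Bonferroni-adjusted significance threshold for the seven tests in the paper.
alpha = 0.05 / 7  # approximately .007
print(f"Bonferroni-adjusted alpha: {alpha:.3f}")
```

With the actual coded responses in place of the synthetic data, the same two calls reproduce the structure of the seven tests (two t tests and five MANOVAs) evaluated against the adjusted alpha.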
RESULTS

Of the 850 graduate students we targeted with our February 2009 online survey, 200 responded. Of these students, 79 were former T-PULSE workshop participants (YWS-online) and 121 were not (NWS-online). Of the 79 YWS-online respondents, 27 had taken the workshop in January 2009, 14 had taken it one year prior, 22 had taken it two years prior, and 16 had taken it more than two years prior. Of the 121 NWS-online respondents, 93 had never participated in formal pedagogy development training (workshops, seminars, and so on). For the sake of simplicity, we will refer to all 121 respondents in this category as NWS-online respondents.

YWS-online respondents had 2.6 years of teaching experience on average, compared to 2.0 years on average among NWS-online respondents. This difference was not significant (Question 2: t(198) = 2.26, SEM = .18, p = .026). YWS-online respondents rated their teaching ability as 4.8/7 on average, compared to 4.4/7 on average among NWS-online respondents. This difference was also not significant (Question 3: t(198) = 0.88, SEM = .15, p = .38).

In our February 2009 online survey, when we asked respondents to list qualities of good teaching (Question 4), YWS-online respondents more frequently listed Interactive Engagement and Results in Student Learning, whereas NWS-online respondents more frequently listed Good Rapport With Students and Innovative/Energetic (Table 1). When asked what the primary responsibilities of a course instructor should be (Question 5), YWS-online respondents chose Encourage Peer-Peer Interaction more often than NWS-online respondents (Figure 1).
YWS-online respondents chose Deliver a Good Lecture and Make Sure Course Material Is Current less frequently than did NWS-online respondents, but these differences were less pronounced. In Figure 1, the dashed lines separate the online survey data (left) from the data collected at the January 2009 workshop (right). The large difference in percentages between the two datasets (for example, in the Deliver a Good Lecture category) is present because online respondents were asked to "choose all that apply," whereas pre- and post-workshop respondents were asked to choose only one answer. The online data therefore show the percent of responses, whereas the pre- and post-workshop data show the percent of respondents.

Table 1. Qualities of Good Teaching Identified by Survey Respondents

Quality of good teaching                                  NWS* responses (%)   YWS* responses (%)
Good lecturing                                            42                   38
Uses interactive engagement                               11                   28
Good knowledge of subject matter                          10                    9
Good rapport with students/asks and answers questions     32                   23
Patience                                                  15                    6
Tracks student progress during course                      8                    6
Encourages learning independence                           7                    6
Innovative/energetic/enthusiastic                         26                   21
Awareness of diverse learner backgrounds                   5                   10
Uses good evaluation methods                               6                    6
Well organized/prepared                                   11                    9
"Clarity"                                                  5                    6
Results in student learning                                2                   12
Other                                                     13                   15

Note. *NWS responses were from respondents who had not taken the T-PULSE teaching workshops; YWS responses were from respondents who had taken the T-PULSE teaching workshops.

Figure 1. Primary Responsibilities of a Course Instructor as Listed by Survey Respondents

In the pre-post dataset, MANOVA analyses indicated that the way pre-workshop respondents and post-workshop respondents answered Questions 6, 7, 8, and 9 (ensemble) was significantly different (F(4, 70) = 4.9, MSE = 4.7, p = .0016). The biggest differences between these two respondent groups were in their answers to Questions 6 and 9. Post-workshop respondents agreed less with the effectiveness of lecturing (average response 2.4/5 post-workshop versus 3.0/5 pre-workshop) (Table 2) and allotted more in-class time for peer-to-peer interactions (average response 3.1/5 post-workshop versus 2.2/5 pre-workshop) (Figure 2). Similarly, in the online dataset, a MANOVA analysis found that the answers given by YWS-online respondents and by NWS-online respondents also differed significantly (F(4, 195) = 7.0, MSE = 4.4, p < .0001). YWS-online respondents agreed less with the effectiveness of lecturing (average response 2.6/5 for YWS versus 3.1/5 for NWS) (Table 2) and allotted more in-class time for peer-to-peer interactions (average response 2.9/5 for YWS versus 2.2/5 for NWS) (Figure 2). In Figure 2, the dashed lines separate the online survey data (left) from the data collected at the January 2009 workshop (right).

MANOVA analyses did not, however, identify significant differences in the way that pre-workshop respondents and NWS-online respondents answered Questions 6, 7, 8, and 9 (F(4, 153) = 0.49, MSE = 4.5, p = .74). Neither was there a significant difference in the way that post-workshop respondents and YWS-online respondents answered this set of questions (F(4, 112) = 1.9, MSE = 4.2, p = .12). Among YWS-online responses, the time since the workshop was taken was not significantly related to differences in answer profiles (F(4, 74) = 1.8, MSE = 4.3, p = .13).
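As a quick arithmetic check on Table 2 (below), each "average score" is consistent with the mean of the 1-5 Likert codes implied by that row's response percentages. The averages were presumably computed from the individual responses; the short sketch below simply recomputes one row (pre-workshop responses to Question 6) from the rounded percentages, so the result is approximate.

```python
# Weighted mean of Likert codes (5 = strongly agree ... 1 = strongly disagree)
# recomputed from one row of Table 2 (pre-workshop responses to Question 6).
codes = [5, 4, 3, 2, 1]
percents = [10, 30, 20, 28, 13]  # rounded percentages, so they sum to 101
average = sum(c * p for c, p in zip(codes, percents)) / sum(percents)
print(round(average, 1))  # 3.0, matching the reported average score
```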
Table 2. Teaching and Learning Beliefs Among Survey Respondents

6. In general, listening to lectures is one of the best ways for students to understand course material.

                 5. Strongly   4. Agree   3. Neutral   2. Disagree   1. Strongly       Average
                 agree (%)     (%)        (%)          (%)           disagree (%)      score
Pre-workshop     10            30         20           28            13                3.0
Post-workshop     5            10         15           55            15                2.4
NWS-online       11            27         28           28             6                3.1
YWS-online        8            18         22           38            15                2.6

7. It is better to cover a lot of course content than to promote a deep understanding of a few key concepts.

                 5. Strongly   4. Agree   3. Neutral   2. Disagree   1. Strongly       Average
                 agree (%)     (%)        (%)          (%)           disagree (%)      score
Pre-workshop      0            13         23           49            15                2.3
Post-workshop     3            15         10           51            21                2.3
NWS-online        1             8         18           50            23                2.1
YWS-online        0             9         15           53            23                2.1

8. In introductory courses, instructors should encourage "memorization" as one effective strategy for learning core concepts.

                 5. Strongly   4. Agree   3. Neutral   2. Disagree   1. Strongly       Average
                 agree (%)     (%)        (%)          (%)           disagree (%)      score
Pre-workshop      5            10          8           46            31                2.0
Post-workshop     0            10          8           51            31                2.2
NWS-online        4            16         21           32            27                2.4
YWS-online        4            13         25           35            23                2.4

Figure 2. The Amount of Time Allotted by Survey Respondents for Peer-Peer Interactions ("How much time would you set aside for peer-peer interactions in a 60-minute class?")

DISCUSSION

The different groups of graduate students surveyed had convergent ideas on what constitutes good learning—particularly that memorization is an insufficient learning technique for acquiring deep understanding of key concepts. There was, however, an important difference in perspective between those who did and those who did not attend the multi-day T-PULSE workshops on the utility of lectures. Those who did not attend the workshop held the position that lectures are an effective way to learn. This view expresses a teacher-centred approach without significant focus on student learning or peer-peer interactions. In contrast, those who took the workshop expressed a more learning-centred approach in that they found lecturing to be of limited value for student learning. These respondents were also likely to plan more time for in-class peer discussion (Figure 2). These two components (more interactive engagement and less focus on lecturing) have been well established as part of a high-value teaching philosophy that leads to higher learning gains (Brooks, 1984; Hake, 1998; Mazur, 2009; Smith et al., 2009). Thus, it seems that graduate students who have taken the workshops not only have a good idea of what constitutes good teaching but also of how to accomplish it in practice. This finding suggests that a successful workshop model should educate attendees about effective learning-centred pedagogies rather than teaching attendees the merits of deep conceptual learning as compared to those of memorization.

The general consistency of the answers of post-workshop respondents (January 2009) and YWS-online respondents demonstrates that the pedagogy change among attendees is not limited to a temporary change occurring immediately post-workshop. There was also no statistical evidence that the change in teaching philosophy erodes with time. The exception to this was that post-workshop respondents disagreed slightly more with the utility of memorization than did online respondents who had taken the workshop (Table 2).
Both sets of responses were still negatively disposed toward memorization. There also seemed to be a small difference in the amount of time that post-workshop respondents would allow for in-class discussion compared to online respondents who had taken the workshop (Figure 2). This difference did not produce a significant MANOVA result and represents only a marginal difference, as the most common tendency in both groups was to allow six to 10 minutes of interaction in each 60-minute class.

These very small differences notwithstanding, the overall suite of results suggests that the T-PULSE workshops at McGill University help graduate students develop key components of their teaching philosophy, better preparing them for their current teaching roles and future professorial careers. Research suggests that professors who learn about effective teaching strategies early in their careers are more inclined to permanently adopt a learning-centred approach (Gandell, Weston, Finklestein, & Winer, 2000; Weston & Cranton, 1986). Although changes in teaching philosophy among graduate students can be facilitated through teaching workshops, it can be comparatively difficult to stimulate pedagogical change across a department or institution. Our results suggest that a viable solution to this problem may require a long-term focus: training the next generation of professors to lead the shift to effective in-class pedagogy. The impact of an investment in the pedagogical development of graduate students may be enhanced by the broad use of teaching portfolios in recruitment and tenure decisions, making well-prepared graduate students more competitive in the professorial job market. The importance of training new professors for their teaching roles has been clearly established (Sorcinelli, 1994), particularly because new teachers quickly form lasting approaches and attitudes toward teaching (Boice, 1996). Because most future professors start teaching in graduate school, targeting graduate students with pedagogy development workshops can have a large impact on the future quality of instruction at Canadian universities.

REFERENCES

Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education, 8(3), 203–213.

Boice, B. (1996). Classroom incivilities. Research in Higher Education, 37(4), 453–486.

Brew, A. (2003). Teaching and research: New relationships and their implications for inquiry-based teaching and learning in higher education. Higher Education Research & Development, 22(1), 3–18.

Brooks, D. W. (1984). Alternatives to traditional lecturing. Journal of Chemical Education, 61, 858–859.

Canadian Association for Graduate Studies (CAGS). (2008). Professional skills development for graduate students. Retrieved from http://www.cags.ca/documents/publications/Prof%20Skills%20Dev%20for%20Grad%20Stud%20%20Final%2008%2011%2005.pdf

Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of Educational Research, 64, 1–35.

Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970–977.

Davies, C. H. J. (2002). Student engagement with simulations: A case study. Computers & Education, 39(3), 271–282. doi:10.1016/S0360-1315(02)00046-5
Ernst, H., & Colthorpe, K. (2007). The efficacy of interactive lecturing for students with diverse science backgrounds. Advances in Physiology Education, 31, 41–44.

Gandell, T., Weston, C., Finklestein, A., & Winer, L. (2000). Appropriate use of the web in teaching in higher education. In B. Mann (Ed.), Perspectives in web course management (pp. 61–68). Toronto, ON: Canadian Scholars' Press.

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66, 64–74.

Harris, D., & McEwen, L. A. (2009). A graduate teaching assistant workshop in a faculty of science. Canadian Journal of Higher Education, 39(2), 101–120.

Jenkins, T. M. (2010). Professional development programs for graduate students: Overview of findings from selected Canadian and international institutions (internal document). Teaching and Learning Services, McGill University, Montreal, QC.

Mazur, E. (2009). Farewell, lecture? Science, 323, 50–51.

McAlpine, L., & Winer, L. (2002). Sustainable faculty development: An Indonesian case study. Innovations in Education & Teaching International, 39(3), 205–216.

McKeachie, W. J. (2007). Good teaching makes a difference—And we know what it is. In R. P. Perry & J. C. Smart (Eds.), The scholarship of teaching and learning in higher education: An evidence-based perspective (pp. 457–474). Dordrecht, The Netherlands: Springer.

McKeachie, W. J., & Svinicki, M. (2010). McKeachie's teaching tips: Strategies, research, and theory for college and university teachers (13th ed.). Belmont, CA: Cengage Learning.

O'Donnell, A. M., & King, A. (1999). Cognitive perspectives on peer learning. Mahwah, NJ: Lawrence Erlbaum.

Prince, M. J., Felder, R. M., & Brent, R. (2007). Does faculty research improve undergraduate teaching? An analysis of existing and potential synergies. Journal of Engineering Education, 96(4), 283–294.

Queen's Centre for Teaching and Learning (Queen's CTL). (2010). Students—Community—Centre for Teaching and Learning. Retrieved from http://www.queensu.ca/ctl/index.html

Saroyan, A., & Amundsen, C. (Eds.). (2004). Rethinking teaching in higher education: From a course design workshop to a faculty development framework. San Francisco, CA: Stylus.

Smith, M., Wood, W., Adams, W., Wieman, C., Knight, J., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323, 122–124.

Sorcinelli, M. D. (1994). Effective approaches to new faculty development. Journal of Counseling & Development, 72(5), 474–479.

T-PULSE. (2010). Graduate science teaching workshops. Retrieved from http://www.mcgill.ca/tpulse/workshop

University of British Columbia (UBC). (2010). Retrieved from UBC Centre for Teaching, Learning and Technology: http://ctlt.ubc.ca/programs/all-our-programs/tatraining-program/

University of Western Ontario (UWO). (2010). School of Graduate and Postdoctoral Studies. Retrieved from http://grad.uwo.ca/

Webb, N. M., & Palincsar, A. S. (1996). Group processes in the classroom. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 841–873). New York, NY: Simon & Schuster Macmillan.

Weston, C., & Cranton, P. A. (1986). Selecting instructional strategies. Journal of Higher Education, 57(3), 259–288.
CONTACT INFORMATION

Peter J. T. White
Lyman Briggs College, Michigan State University
East Lansing, MI 48823, USA
[email protected]

Peter J. T. White is a postdoctoral research associate at Michigan State University, where he is leading a National Science Foundation–funded project to develop integrative case studies in evolution education. He has a PhD in Biology from McGill University, where he studied entomology and forest ecology. Concurrently with his PhD studies, he worked with McGill's T-PULSE Science Graduate Teaching team, developed constructivist pedagogical instructional materials, designed a team-building workshop, and led several pedagogy research and development initiatives.

David Syncox is a graduate education officer at McGill University. He manages McGill's graduate education initiatives on behalf of Graduate and Postdoctoral Studies and Teaching and Learning Services. Specifically, he oversees SKILLSETS, a suite of educational workshops for graduate professional skills development. The offerings and events aim to enhance the graduate experience beyond discipline-specific training, including the development of teaching competence, communication skills, management skills, and a basic understanding of ethical principles. In addition, he provides development opportunities for faculty members through graduate supervision offerings.

Audrey Heppleston is a policy analyst in science and policy integration at Natural Resources Canada. She was initiated into science education research during her PhD in Biology at McGill University. Her research in education focused on the acquisition of scientific skills and the progression of scientific concepts in novice students.

Siara Isaac is a faculty developer and team leader for the Pôle Pédagogie at the Université Lyon 1. She is responsible for teaching development workshops, accompanying teaching improvement initiatives, and course evaluations. She has teaching experience at universities in Canada, China, and France, ranging from practical lab settings to courses for 1,000 students. She has facilitated pedagogical development workshops in Canada, France, Morocco, and Switzerland. She is also a past coordinator of the T-PULSE Science Graduate Teaching Workshops at McGill University and a founding member of the PENSERA network of faculty developers in the Rhône-Alpes region of France.

Brian Alters is a joint professor in the College of Educational Studies and Schmid College of Science at Chapman University, where he is founder and director of the Chapman University Evolution Education Research Center. He is also cross-appointed at McGill University, where he is an associate professor of Education and the Sir William Dawson Scholar. He is both founder and director of McGill University's Evolution Education Research Centre. He has taught science education at Harvard and McGill Universities and is regarded as a specialist in evolution education.

ACKNOWLEDGEMENT

We are grateful to philanthropist Dr. Richard H. Tomlinson, O.C., for his generous funding and support made available through the Tomlinson Project in University-Level Science Education at McGill University.

APPENDIX I: SURVEY INSTRUMENT
1. Have you taken a teaching-development workshop or seminar before?
   a. Yes – The T-PULSE Graduate Teaching (Pedagogy Development) Workshop
      i. If yes, when did you take the T-PULSE Workshop?
   b. Yes – A McGill Teaching and Learning Services seminar
   c. Yes – A different McGill teaching seminar or workshop
   d. Yes – A non-McGill teaching seminar or workshop
   e. No

2. How long have you been teaching (including time as a teaching assistant)?
   a. I have not taught before
   b. Less than 1 year
   c. 1 or 2 years
   d. 3 or 4 years
   e. More than 4 years

3. How would you rate yourself as a teacher compared to others at McGill in your field?
   a. I am one of the best
   b. I am considerably above average
   c. I am slightly above average
   d. I am average
   e. I am slightly below average
   f. I am considerably below average
   g. I am one of the worst
   h. N/A – I have never taught before

4. Give two characteristics of good teaching (open ended; asked in the February 2009 survey only).

5. What should the primary responsibilities of a course instructor be? (Choose two options.)
   a. To deliver clear and concise lectures
   b. To answer student questions (both in and out of class)
   c. To encourage peer-peer interactions among students
   d. To stay up-to-date on subject matter in their field of instruction
   e. To design appropriate means for evaluating students (quizzes, tests, exams)
   f. To design appropriate means for students to give feedback on the instructor's teaching

6. In general, listening to lectures is one of the best ways for students to understand course material (choose one).
   a. Strongly Agree
   b. Agree
   c. Neutral
   d. Disagree
   e. Strongly Disagree

7. It is better to cover a lot of course content than to promote a deep understanding of a few key concepts.
   a. Strongly Agree
   b. Agree
   c. Neutral
   d. Disagree
   e. Strongly Disagree

8. In introductory courses, instructors should encourage "memorization" as one effective strategy for learning core concepts.*
   a. Strongly Agree
   b. Agree
   c. Neutral
   d. Disagree
   e. Strongly Disagree

9. When planning a 60-minute lecture, you should typically set aside the following amount of time for students to interact with one another about the subject matter while they are in class (choose one).**
   a. 0 minutes
   b. 1 to 5 minutes
   c. 6 to 10 minutes
   d. 11 to 20 minutes
   e. More than 20 minutes

* This question was asked differently in the January 2009 workshops: "The fundamentals of undergraduate science courses should be taught as a series of facts that the students must learn."

** We created these response categories in an attempt to make them meaningful and distinct; we felt this was preferable to an equal-interval approach.