Canadian Journal of Higher Education / Revue canadienne d'enseignement supérieur (CSSHE / SCÉES)
Volume 46, No. 4, 2016, pages 94–114

Quantifying Interprofessional Learning in Health Professional Programs: The University of Manitoba Experience

Ruby E. Grymonpre, Heather J. Dean, Pamela F. Wener, A. Elizabeth Ready, Laura L. MacDonald, Maxine E. Holmqvist, and Moni W. Fricke
University of Manitoba

Maria James
Manitoba Agriculture, Food and Rural Development

Abstract

Internationally, a growing number of interprofessional education (IPE) offices are being established within academic institutions. However, few are applying educational improvement methodologies to evaluate and improve the interprofessional (IP) learning opportunities offered. The University of Manitoba IPE Initiative was established in 2008 to facilitate the development of IP learning opportunities for pre-licensure learners. The research question for this secondary analysis was: what, if any, changes in the number and attributes of IP learning opportunities occurred in the academic year 2008–2009 compared to 2011–2012? The Points for Interprofessional Scoring (PIPES) tool was used to quantify the attributes of each IP learning opportunity. Most notably, in 2012, eight (73%) of 11 IP learning opportunities achieved the highest PIPES score (> 55), compared to only four (36%) in 2009. The concept of the PIPES score is introduced as an educational improvement strategy and a potential predictor of achieving the desired educational outcome: collaborative competence.

Résumé

Academic institutions around the world are establishing a growing number of interprofessional education offices. However, very few apply educational improvement methodologies to evaluate and improve these interprofessional learning opportunities (IPLOs). In 2008, to facilitate the creation of such opportunities for pre-licensure learners, the University of Manitoba launched its interprofessional education initiative. It sought to determine whether there were differences in the number and attributes of IPLOs between the 2008–2009 and 2011–2012 academic years. The attributes of each IPLO were quantified using an adapted version of the Points for Interprofessional Scoring (PIPES) tool. Notably, in 2012, 73% of interprofessional learning opportunities (8 of 11) had achieved the highest PIPES score (> 55), compared to 36% in 2009. The concept of the PIPES score is presented as a strategy for educational improvement and as a potential predictor of the desired educational outcome: collaborative competence.

Interprofessional education (IPE) is an emerging global priority for educators in health professional programs. Evidence suggests that when health professionals are educated together and have knowledge not only of their own skills but also of the skills and attributes of other members of the healthcare team, and when these teams work together with the patient to decide on a course of treatment, care is safer, of higher quality, and more accessible, and patient outcomes improve.
Further, highly collaborative teams experience reduced tensions and conflict, leading to improved job satisfaction, recruitment, and retention among healthcare providers (Health Force Ontario, 2010; Reeves, Perrier, Goldman, Freeth, & Zwarenstein, 2013; Zwarenstein, Goldman, & Reeves, 2009). However, as illustrated by D'Amour and Oandasan's Interprofessional Education for Collaborative Person-Centred Practice (IECPCP) evolving framework, structuring an IPE curriculum to achieve collaborative competence is complex and influenced by an array of political, regulatory, geographical, institutional, instructional, student, and faculty characteristics and dynamics (D'Amour & Oandasan, 2005). Recognizing these diverse and often immutable influences, Freeth and Reeves (2004) suggested that IP educators apply the 3P model of learning and teaching when planning an IPE curriculum. This model identifies multiple "presage" (contextual) and "process" (delivery) factors that need to be considered when planning interprofessional (IP) learning opportunities to achieve the desired "product" of collaborative competence. Similarly, Olson and Bialocerkowski (2014, pp. 242–243) have suggested moving away from "single factor cause-effect thinking" toward examining "how different types of IPE produce different types of outcomes within particular learning environments." Further, Cooper, Braye, and Geyer (2004) provided a compelling argument for using complexity theory to guide IPE developments.

With respect to the challenges of implementing and evaluating IPE, one fundamental consideration is the development of a pedagogically informed IPE curriculum (Oandasan & Reeves, 2005). IP learning opportunities informed by educational theories and learning strategies such as adult learning theory, case-based learning, small group learning, cooperative learning, and reflection on practice promise to improve attitudes, modify stereotypical thinking, and transform learners toward a more collaborative mode of practice (D'Eon, 2004; Oandasan & Reeves, 2005). In a longitudinal study of learner attitudes towards IPE and teamwork, Curran, Sharpe, Flynn, and Button (2010) identified the IP learning process, as informed by educational theory and IPE principles, as a key success factor. In a systematic review to identify the best approach to pre-licensure, university-based, allied health IPE, Olson and Bialocerkowski (2014) noted that case-based IPE activities involving small, stable groups of learners were perceived by participants as more relevant and successful.

Centralized IPE offices are emerging to develop, implement, and evaluate pedagogically sound IP learning opportunities; part of this process includes the creation of an inventory of such activities. However, it is not apparent whether any of these offices are systematically evaluating the attributes of their IP learning opportunities in terms of how they align with educational theory and IPE principles. In their systematic review, Olson and Bialocerkowski (2014) noted that most evaluations of IPE interventions lack a theoretical foundation, are short term, and assess such low-level outcomes as student readiness, attitudes, and interactions.
The Medical University of South Carolina, Dalhousie University, Memorial University, and Western University of Health Sciences have each published a description of the curricular and co-curricular IP learning opportunities offered and evaluated through their IPE offices (Aston, Mackintosh, & Orzoff, 2010; Blue, 2010; Curran et al., 2010; MacKenzie & Merritt, 2013). Consistent with Olson and Bialocerkowski's (2014) findings, student assessment was the focus of these evaluations rather than an examination of the attributes of each IP learning opportunity for the purposes of improvement. Moreover, there is a dearth of published data assessing whether establishing a centralized IPE office has any measurable impact on the quantity and attributes of IP learning opportunities.

The University of Manitoba (UofM) IPE Initiative (the Initiative) was formed in 2008 by the deans of 13 health science academic units within the university. Establishing a centralized, university-wide office of IPE was viewed by UofM senior administrators as a critical step for exploring the development and implementation of IP learning opportunities. The primary goal of the Initiative was to help educate future health professionals as leaders and change agents who, in turn, will bring this way of thinking into the public domain for the benefit of the Manitoba health system and its recipients.

Prior to the establishment of the Initiative, the concept of IPE within the UofM was relatively novel. In 2003, under a Health Canada-funded contract, Cook (2004) completed an environmental scan of IPE models utilized by Canadian universities. In his results, he noted that "there appears to be no active program of interprofessional learning at [the University of Manitoba]" (Cook, 2004, p. 25). In 2005, through the Health Canada Interprofessional Education for Collaborative Person-Centred Practice Initiative, the UofM received funding for two demonstration projects. These projects created a growing awareness and acceptance of the benefits of IPE and interprofessional collaboration (IPC) and supported the educational development of 15 UofM IPE ambassadors. These educators developed and offered learning opportunities in which pre-licensure learners from different health education programs learned together; they did so at a time when the "what and how" of IPE was in its infancy but the spirit was "start where you can and learn by doing."

Setting the bar for IPE at the time was the Accreditation of Interprofessional Health Education (AIPHE) project, which included a definition of what IPE is (and what it is not), as well as guiding principles for the integration of IPE standards into professional education (AIPHE, 2009). Further, the Points for Interprofessional Education System (PIPES) is an instrument developed in 2009 by the University of Toronto that outlines the attributes of a pedagogically grounded IP learning opportunity (University of Toronto, 2009).
As part of the Initiative's strategic planning process, and in an effort to harmonize language, reduce confusion, and avoid misunderstanding, the definition of IPE proposed by the Centre for the Advancement of Interprofessional Education (CAIPE) was adopted and operationalized: "occasions when two or more professions learn with, from and about each other to improve collaboration and the quality of care" (CAIPE, 2002). The Initiative also collectively endorsed the Canadian Interprofessional Health Collaborative (CIHC) National Interprofessional Competency Framework, which outlines six competency domains, or desired behaviours for a collaborative practitioner, including exploring the roles and responsibilities of healthcare practitioners; patient/family centredness; conflict resolution; shared leadership and decision making; and team dynamics and communication (CIHC, 2010). Further, the Initiative set out to capture the IP learning opportunities in terms of how they aligned with the emerging understanding of "what is and what is not IPE," and it chose to use the PIPES instrument to do so.

The research question for this study was: What, if any, changes in the number and attributes of IP learning opportunities occurred in the academic year 2008–2009 compared to 2011–2012?

Methods

Ethical Considerations

This study was reviewed and approved by the University of Manitoba Health Research Ethics Board.

Research Design

In its commitment to continuous educational improvement in IPE, the Initiative conducted regular surveys to evaluate the attributes of the IP learning opportunities offered. The research design for this project was a case-study comparative analysis of IPE offerings at the UofM between two academic years. The methodology involved a longitudinal secondary analysis of the data gathered from surveys conducted during the 2008–2009 and 2011–2012 academic years (University of Manitoba Interprofessional Education Initiative, 2014). Finlayson, Egan, and Black (1999, p. 84) defined secondary analysis as the "reexamination of previously collected data."

Questionnaire Development and Data Collection

As an improvement tool, the items in the 2009 questionnaire were guided by the operational definitions and principles outlined in the AIPHE principles document and the PIPES (AIPHE, 2009; Wagner, Langlois, Lowe, & Simmons, 2009). Questionnaire items were developed for one of two purposes: to gather specific information on each IP learning opportunity (IPLO) so as to permit PIPES scoring, and to document additional information of interest on each IPLO (e.g., mandatory, elective, or voluntary offering; numbers of students; student assessment; educational approach). For the purposes of this study, IPLOs were labelled "embedded" if they were a mandatory component of a mandatory or elective course, as opposed to voluntary student involvement in extracurricular or not-for-credit activities or events. This definition is consistent with that articulated by Lawlis, Anson, and Greenfield (2014). In 2009, the 15-item questionnaire was piloted by two teams of course instructors prior to its distribution.
The 2012 questionnaire was further revised because some of the 2009 items were found to be ambiguous; these items were reworded or split into more than one question, and new items were added for clarification, resulting in seven additional items. The questionnaire was emailed by the Initiative to members of the IPE Liaison Advisory Committee in August 2009 (as a PDF) and in March 2012 (using SurveyMonkey), with a request that it be distributed widely among their faculty and that all completed questionnaires be returned within 30 days. Faculty were asked to complete the questionnaire if they offered a learning opportunity in which (i) students from two or more different professions were brought together to learn together and (ii) there was an opportunity within the session for the students to interact. Each questionnaire requested information on only those IPLOs offered in the respective academic year (2008–2009 or 2011–2012). For any one IPLO, the relevant course instructors were asked to work collaboratively so that only one questionnaire was completed, regardless of how many academic units were involved. For both years, the questionnaire was also emailed to the president of the Manitoba Health Sciences Student Association (MaHSSA) to gather data on any student-inspired activities.

For this secondary analysis, the research team found it necessary to adapt the PIPES instrument (with permission) to improve its content and face validity. The original PIPES listed eight attributes of a pedagogically informed IPLO, falling within two constructs (process of learning and content). These attributes were developed by the University of Toronto IPE Office using a modified Delphi process involving an expert panel that participated in two rounds of decision making to select and rank key criteria relevant to IPE (Wagner et al., 2009). The UofM adaptation of the PIPES included nine attributes, with each attribute being further validated by one or more supporting educational theories and/or evidence-based learning strategies (Oandasan & Reeves, 2005). Table 1 compares the attributes and scoring of the original UofT PIPES instrument with those of the UofM adapted PIPES instrument. Interprofessional learning opportunities that are case-based, facilitated, offer opportunities for debrief, involve small, interactive groups of learners, and are relevant to real practice receive high scores on the adapted PIPES. Points were allocated for each of the nine attributes and summed. Each IPLO was categorized as "Green" for summed scores greater than 55 (highest quality), "Orange" for scores between 45 and 55, or "Red" for scores between 30 and 40. If an IPLO did not address at least two process and two content areas and accrue a minimum of 10 process points, 10 content points, and 30 points overall, it was labelled "Not IPE." The adapted PIPES instrument is available on the Initiative website (University of Manitoba Interprofessional Education Initiative, 2014).
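To make the scoring scheme concrete, the short sketch below implements the categorization just described: nine attributes, grouped into process and content areas as in the text, each assumed to be scored 0, 5, or 10 points (as in Tables 1, 4, and 5), then summed and mapped to the Green, Orange, Red, or "Not IPE" categories. The attribute labels follow the adapted PIPES; the 0/5/10 scale, the function, and the variable names are assumptions made for illustration and are not part of the published instrument.

```python
# A minimal sketch of the adapted PIPES categorization described above.
# Attribute labels follow the text; the 0/5/10 point scale per attribute and
# all function/variable names are illustrative assumptions only.

PROCESS_ATTRIBUTES = [
    "level_of_interactivity",
    "group_to_facilitator_ratio",
    "formally_trained_facilitators",
    "disciplines_represented_in_students",
    "frequency_of_interactions",
]
CONTENT_ATTRIBUTES = [
    "performance_based",               # realistic and authentic learning activity
    "explicit_ip_learning_objectives",
    "debrief_with_students",
    "case_based_learning",
]


def categorize_iplo(points):
    """Sum the nine attribute scores and apply the categories stated in the
    text: "Not IPE" when the minimum process/content criteria are not met,
    otherwise Green (> 55), Orange (45-55), or Red (30-40)."""
    process = [points.get(a, 0) for a in PROCESS_ATTRIBUTES]
    content = [points.get(a, 0) for a in CONTENT_ATTRIBUTES]
    total = sum(process) + sum(content)

    meets_minimum = (
        sum(p > 0 for p in process) >= 2      # addresses at least two process areas
        and sum(c > 0 for c in content) >= 2  # addresses at least two content areas
        and sum(process) >= 10                # at least 10 process points
        and sum(content) >= 10                # at least 10 content points
        and total >= 30                       # at least 30 points overall
    )
    if not meets_minimum:
        return total, "Not IPE"
    if total > 55:
        return total, "Green"
    if total >= 45:
        return total, "Orange"
    return total, "Red"


# An IPLO scoring 10 on every attribute reaches the maximum total of 90.
all_attributes = PROCESS_ATTRIBUTES + CONTENT_ATTRIBUTES
print(categorize_iplo({a: 10 for a in all_attributes}))  # (90, 'Green')
```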
Table 1. Adaptations to the Original University of Toronto PIPES
[Table 1 compares the process (how learning) and content (what learning) attributes, and their point allocations, in the original University of Toronto PIPES and the University of Manitoba adaptation. Attributes include level of interactivity; group-to-facilitator ratio; number of facilitators from different professions; facilitator training; number of professions represented among student participants; frequency of interactions across the learning activity; realistic and authentic (performance-based) learning activity; explicit IP learning objectives; case-based learning; and debrief with students and facilitators after the IPE learning activity. In the UofM adaptation, each of the nine attributes is scored 0, 5, or 10 points.]

Data Analysis

Data was analyzed and tabulated using descriptive statistics (frequencies, summed scores) by survey year. A single research technician re-scored and re-tabulated all questionnaire responses for both survey years using the adapted PIPES. Tabulated data were sent back to survey respondents for their verification. Questionnaire responses were further validated and modified based on a review of any available course material (e.g., facilitator guides, student manuals, or handouts). Any discrepancies or misinterpretations were discussed between the research technician and the senior investigator to reach consensus.

Results

In 2009, 11 of the 12 participating academic units (91.7%) responded to the questionnaire, with 11 IPLOs reported. In 2012, 12 of 13 academic units (92.3%) responded to the questionnaire, also reporting 11 IPLOs. Notably, nine of the 11 IPLOs reported in 2009 were not reported in 2012.

General Information on Each IP Learning Opportunity

Tables 2 and 3 outline an overall description of the IP learning opportunities reported in both survey years. The number of academic units involved in two or more IP learning opportunities increased from six in 2009 to 11 in 2012 (an 83.3% increase).
In 2012, Dentistry participated in only one IPLO, and the Physician Assistant Program, which joined the Initiative in 2011, had not yet participated in an IPLO. The number of students involved in any given IPLO in 2009 ranged from eight to 168, with a total of 560 students participating in all IPLOs offered that academic year. This compared to a range of three to 378 students participating in any given IPLO offered in 2012, with a total of more than 1,081 students participating in all IP learning opportunities offered that academic year (a 93% increase). Notably, approximately 2,200 students are enrolled in the 13 participating academic programs, which range from one to four years in length.

The number of IPLOs embedded into existing courses increased from four of the 11 IPLOs reported in 2009 to six of the 11 IPLOs in 2012. Most other IPLOs either involved students on a voluntary basis or involved a combination of students enrolled in mandatory or elective courses and students participating voluntarily. In both survey years, for most of the IPLOs that were embedded within a course, the course had relevance to the "learning common" that was the subject matter of the IPLO. The Initiative uses the term "learning common" to describe the content, or the "vector," through which the "process" of developing collaborative knowledge, skills, and attitudes occurs. Learning commons included health and aging, bioethics, gerontology, social determinants of health, ergonomic risks and solutions in dental hygiene, health promotion, and practice education. In 2012, all 11 IPLOs included students from different years of education; for example, the social determinants of health IPLO involved students from eight different academic units enrolled in the first to fourth years of their professional programs.

Small group sessions and case-based learning were the most commonly reported formats in both survey years. However, it is noteworthy that the diversity of IPE approaches increased from 2009 to 2012, with observation and simulation being introduced as teaching strategies in 2012.
Table 2. Description of Interprofessional Learning Opportunities in 2009

Learning common | Academic unit (number of students participating) | Student level (mandatory and embedded unless specified) | Learning format
Genetics Counselling | Clinical Health Psychology (7); Medical Genetics (1) | Year 1 | Small group sessions; case based
Health and Aging* | Kinesiology (15); Nursing (5); Human Ecology (10); Social Work (10) | Years 2–4, Years 1 & 2, Years 3 & 4, Years 1–3 (all elective) | Small groups; case based
Gerontology | Nursing (16); Occupational Therapy (2) | Year 2; Year 2 (voluntary) | Practice education; small group; case based
Standards of Conduct | Kinesiology (76); Recreation Management (40); Physical Education (45); Science (3); Undeclared (4) | Year 1 | Small group sessions
Risk Reduction Strategies | Kinesiology & Recreation Management (15) | Year 4 (elective) | Practice education; case based
Various Topics (Pharmacy) | Pharmacy (18) | Year 4 (elective) | Practice education; case based
Social Aspects of Aging* | Human Ecology (10); Nursing (10); Social Work (10); Arts (5); Others (5) | Various years (elective) | Small group sessions; case based
Surgical Rehabilitation | Kinesiology & Recreation Management (10); Medical Rehabilitation (2); Medicine (2) | Years 2–4 (elective) | Practice education; case based
Rehabilitation Techniques | Kinesiology & Recreation Management (60) | Years 2–4 | Practice education; case based
Professionalism | Kinesiology (14); Recreation Management (9) | Years 3 & 4 (elective) | Practice education; case based
Social Determinants of Health | Dental Hygiene (26); Physical Therapy (50); Occupational Therapy (50); Dentistry (30) | Years 2–4 | Internship
Table 3. Description of Interprofessional Learning Opportunities in 2012

Learning common | Academic unit (number of students participating) | Student level (mandatory and embedded unless specified) | Learning format
Social Determinants of Health | Dental Hygiene (26); Dentistry (29); Occupational Therapy (50); Pharmacy (55); Physical Therapy (50); Respiratory Therapy (15) | Year 1; Year 2; Year 1; Year 1; Year 2; Year 2 | Small group; group projects
Genetics Counselling | Clinical Health Psychology (8); Genetics (1) | Years 1 & 2; Years 1 & 2 | Small group; case based
Bioethics | Occupational Therapy (50); Physical Therapy (50); Medicine (110) | Year 1; Year 2; Year 1 | Small group; case based
Ergonomic Risks and Solutions in Dental Hygiene | Dental Hygiene (52); Occupational Therapy (50) | Years 1 & 2; Year 1 (elective) | Small group; observation; case based
Health Promotion | Dental Hygiene (22); Human Ecology (23); Kinesiology (26); Medicine (107); Nursing (87); Occupational Therapy (49); Pharmacy (52); Social Work (12) | Year 2; Year 4; Year 4; Year 1; Year 4; Year 2; Year 2; Year 4 | Small group; case based
Practice Education: Primary Care (Shared Mental Health) | Clinical Health Psychology (2); Psychiatry residents (1) | PGY1 & PGY2; PGY2 | Small group; case based; practice education
Practice Education: Specialty Care (Youth with Diabetes) | Dental Hygiene (26); Medicine (7); Paediatric residents (11); Pharmacy (3); Nutrition-dietetic interns (3) | Year 2; Year 4; PGY1; Year 4 (elective); Year 1 | Small group; case based; group projects; practice education
Practice Education: Tertiary Care | Medicine (4); Nursing (11); Occupational Therapy (5); Pharmacy (4); Respiratory Therapy (3); Social Work (3) | Year 3; Years 3 & 4; Year 2; Year 4; Year 3; Years 3 & 4 | Small group; case based; practice education
Healthcare Team Challenge | Medicine (8); Nursing (5); Human Ecology (3); Physical Therapy (2); Respiratory Therapy (3); Occupational Therapy (1); Pharmacy (1); Recreation Management (1) | All (voluntary) | Small group; case based
Simulated Overnight Hospital Ward | Medicine (17); Nursing (25); Pharmacy (6) | Years 1 & 2; Years 2 & 3; Year 3 (voluntary for all) | Simulation; case based
Inner-city Community Care | Human Ecology (1–6); Medicine (1–5); Nursing (1–5); Occupational Therapy (~4); Pharmacy (6) | Year 4; Year 1 or 2; varied; Year 2; Years 1 & 2 (voluntary for all except OT) | Service delivery; case based

PIPES Attributes and Scoring

The 2012 survey revealed substantial improvements in the process and content attributes of the IPLOs as scored on the PIPES instrument (Tables 4 and 5). In 2009, two of 11 IPLOs scored as "Not IPE," and only four met the criteria for the highest quality of IP learning. In contrast, by 2012 the attributes of all offerings had improved: none of the 11 reported IPLOs scored as "Not IPE," and eight met the criteria for the highest quality of IP learning.

Process attributes of the PIPES relate to the level of student interactivity, group-to-facilitator ratio, facilitator training in IPE, diversity of disciplines represented in groups, and frequency of interaction. The most notable improvements in the process attributes were a greater number of IPLOs in 2012 meeting the minimum criteria for small group facilitation (group-to-facilitator ratio ≤ 3:1; 10 in 2012 compared to five in 2009) and for facilitator training (at least 50% of facilitators receiving formal, as opposed to informal or no, training; seven in 2012 compared to two in 2009).
Content attributes of the PIPES relate to the authenticity of the learning activity, the explicit statement of IP learning objectives, the use of case-based learning, and opportunities for student debriefs. The most notable improvements in content attributes included a greater number of IPLOs that specified IP learning objectives (seven of the 11 IPLOs offered in 2012 compared to only two of the 11 IPLOs reported in 2009). Further, 10 of the 11 IPLOs reported in 2012 involved interprofessional debriefs among students, compared to only four of 11 IPLOs in 2009.

Table 4. Total PIPES, Year 2009
[Adapted PIPES scores for the 11 IPLOs reported in 2009 (Genetics Counselling; Health and Aging; Gerontology; Professionalism; Social Aspects of Aging; Risk Reduction Strategies; Standards of Conduct; Rehabilitation Techniques; Surgical Rehabilitation; Various Topics (Pharmacy); Social Determinants of Health). Criteria scored: level of interactivity; group-to-facilitator ratio; formally trained facilitators; disciplines represented in students; frequency of interactions; performance based; explicit IPE learning objectives; debrief with students (any situation); case-based learning. Total scores: 65, 65, 60, 60, 55, 45, 45, 35, 35, and two rated "Not IPE."]
Note: Grey shading denotes IP learning opportunity offered in both survey years.

Table 5. Total PIPES, Year 2012
[Adapted PIPES scores for the 11 IPLOs reported in 2012 (Practice Education: Primary Care (Shared Mental Health); Practice Education: Specialty Care (Youth with Diabetes); Practice Education: Tertiary Care; Health Promotion; Inner-city Community Care; Healthcare Team Challenge; Simulated Overnight Hospital Ward; Genetics Counselling; Bioethics; Ergonomic Risks and Solutions in Dental Hygiene; Social Determinants of Health). Criteria scored as in Table 4. Total scores: 90, 90, 75, 70, 65, 65, 65, 60, 55, 55, and 50.]
Note: Grey shading denotes IP learning opportunity offered in both survey years.
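To complement the tables, the brief sketch below tallies category frequencies by survey year from the total scores transcribed above, reproducing the distribution summarized in this section (four Green, three Orange, two Red, and two "Not IPE" IPLOs in 2009, versus eight Green and three Orange in 2012). The score lists come from Tables 4 and 5; the code and its names are illustrative only.

```python
# Tally of the total PIPES scores transcribed from Tables 4 and 5.
from collections import Counter

totals_2009 = ["Not IPE", "Not IPE", 35, 35, 45, 45, 55, 60, 60, 65, 65]
totals_2012 = [50, 55, 55, 60, 65, 65, 65, 70, 75, 90, 90]


def category(total):
    """Map a total score to the categories stated in the Methods section."""
    if total == "Not IPE":
        return "Not IPE"
    if total > 55:
        return "Green (> 55)"
    if total >= 45:
        return "Orange (45-55)"
    return "Red (30-40)"


for year, totals in (("2008-2009", totals_2009), ("2011-2012", totals_2012)):
    print(year, dict(Counter(category(t) for t in totals)))
# 2008-2009 {'Not IPE': 2, 'Red (30-40)': 2, 'Orange (45-55)': 3, 'Green (> 55)': 4}
# 2011-2012 {'Orange (45-55)': 3, 'Green (> 55)': 8}
```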
Exemplar 1: Interprofessional Clinical Placements

With a PIPES score of 90 (the maximum achievable), the highest score of all IPLOs reported in 2012, IP clinical placements are highlighted as exemplary. It is important to note that the format of IP clinical placements varied by the mentoring practice environment; thus, PIPES scores would also vary between IP clinical placement sites. While allowing for some flexibility in format, senior learners from two or more different professions participating in their "uniprofessional" clinical placements were placed simultaneously in a "collaborative practice and learning environment" that provided formal opportunities for students to learn about, with, and from each other. The "gold standard" educational format was grounded in Kolb's experiential learning theory, with IP teams cycling through Kolb's four stages: concrete experience, reflective observation, abstract conceptualization, and active experimentation (Kolb, 1984). The experience began with the IP student team attending a facilitated "setting directions" session, at which time a "patient of the week" and one or more collaborative competencies were selected. Over the "week" (or other timeframe, as appropriate), students participated in an IP shared care planning session and an IP case presentation to the mentoring team (concrete experience); they observed and reflected on their own and their mentoring team's behaviours around the selected collaborative competency or competencies (reflective observation); and they ended the learning experience with a facilitated debrief (abstract conceptualization). The mentoring team made efforts to engage the IP student team in more than one such opportunity, with new patients and in new contexts (active experimentation). These activities were specifically designed to be embedded into existing clinical placements, as students are normally expected to develop uniprofessional care plans for their patients during their traditional clinical placements. The unique feature of this IPLO was that each student shared their uniprofessional care plan for the same patient, and together the students created a "shared care plan." Facilitators at the clinical site were front-line providers who may or may not have had training in IP facilitation. An "IP Clinical Placement Module" was distributed in advance to students and preceptors; it outlined learning objectives relative to five collaborative competencies (the module was developed prior to the publication of the CIHC's six collaborative competencies) and included a variety of tools to facilitate learning. The module is available on the Initiative website (University of Manitoba Interprofessional Education Initiative, 2014).

Exemplar 2: Learning Health Promotion Interprofessionally

Applying the PIPES instrument, this IPLO scored 75 and is therefore another high-quality IPLO offered within the UofM. The overarching goal of this IPLO is to foster interprofessional groups of students learning about, with, and from each other. Educators developed specific learning objectives on interprofessional team communication and health promotion to guide the IPLO.
During the session, students from different health professions applied the population health promotion model to address issues presented in a case study, while developing the skills and attitudes that are foundational to interprofessional team communication. In keeping with the Initiative's guiding principle of "embed rather than add on," this half-day session was offered once in the fall semester and once in the winter semester to accommodate the timing of a course on health promotion that each academic unit taught. The fall offering involved approximately 180 students from the dentistry, physician assistant, nursing, social work, and respiratory therapy programs, and the winter offering involved approximately 450 students from dental hygiene, human ecology, kinesiology and recreation management, medicine, nursing, occupational therapy, physical therapy, pharmacy, and social work.

The initial meeting between participating students occurred online and included a self-introduction. Completing the introduction gave students access to online pre-readings, videos, and baseline quizzes on health promotion and interprofessional communication skills. In small, educator-facilitated groups of 10, students then met face to face for a three-hour case-study session. The format for the session included discussion of the case as well as specific times for explicitly reflecting and debriefing on the group's team communication skills. The IPLO was embedded within the relevant courses for all participating academic units. Student participation was mandatory, with a grade assigned for the activity (2.5% for a health promotion quiz and 2.5% for the facilitator's assessment of team communication). Deans and department heads provided facilitators from their academic units at a student-to-facilitator ratio of 10:1. All facilitators had participated in at least three hours of training.

Discussion

Use of the PIPES instrument helped to characterize and compare the changes that occurred in IPE within the UofM between the 2008–2009 and 2011–2012 academic years. The results of this longitudinal analysis indicated an increase in both the number and the positive attributes of IPLOs offered between the two survey years, the period in which the Initiative was formed and held accountable to its mission to foster the development and implementation of IP learning. At baseline, before the Initiative was established, many course instructors reported educational strategies that brought together students from multiple professions but that were typically not based on the definitions, attributes, and theoretical underpinnings of IPE later adopted by the UofM. The increase in the number of mandatory and embedded IPLOs between survey years was also noteworthy, as embedding IP learning into health professional curricula has been identified as one of five key "fundamental elements" that enable sustainable IPE (Lawlis et al., 2014). The date of the baseline survey (2008–2009) coincided with the establishment of the UofM IPE Initiative.
In a separate manuscript, we report on how, since 2008, the Initiative has used an adoption model framework entitled Interprofessional Education for Collaborative Patient-Centred Practice (IECPCP): An Evolving Framework (D'Amour & Oandasan, 2005) to guide the strategic implementation of IPE within the UofM and to facilitate the diffusion of IECPCP between the UofM and other organizations and sectors (Grymonpre et al., 2016). It is likely that the systems approach used to advance IECPCP, together with the simultaneous engagement of stakeholders and change-management strategies at the macro, meso, and micro levels, had some influence on the findings of this study.

There were a number of limitations with, and challenges in, conducting this survey. It is notable that nine of the IPLOs reported in 2009 were not re-reported in 2012. Although our study design did not permit us to examine this finding further, the reasons were likely multifactorial and included (i) an increased awareness of what is not IPE, (ii) non-response to the survey due to staffing changes among IP faculty ambassadors, and (iii) IP learning opportunities that may not have been offered consistently every academic year. It is also possible that there was confusion about how best to complete the survey, and reporting bias may also have played a role. Further, the number of practice education IP learning opportunities identified is likely an underestimate. Although IP clinical placements coordinated through the Initiative were noted as exemplar IPLOs for senior students, there were likely many more clinical environments offering IP practice education outside of the Initiative that were not notified about the survey. An important limitation of the PIPES tool is that quantity does not necessarily equal quality; for example, small group learning, although an important IP learning strategy, is beneficial only if effectively facilitated. A final limitation is that we do not have evidence to date that learning opportunities scoring higher on the PIPES tool lead to better educational outcomes. A study to evaluate the psychometric properties, including the predictive validity, of the PIPES tool has been planned and will be important in determining whether closer adherence to theoretically derived attributes of IPE does, in fact, lead to predicted changes in attitudes, beliefs, and behaviours.

Despite the positive findings of this study, which suggest the successful implementation of many IPLOs at the UofM, the study also revealed areas needing improvement. The IPLOs typically involved learners within IP groups at different levels in their academic programs relative to both the learning common and the collaborative competency. Whether students learn best with students who are in the same year of their respective programs is not known. As aptly stated by Oandasan and Reeves (2005, p. 34) regarding IPE: "We know many of the ingredients that are needed, but may not be sure how best to mix them together to create effective IPE." For many academic units, the placement of IPLOs within their longitudinal curriculum has not been strategic.
The development of collaborative competence requires the advancement of knowledge (cognitive), attitudes (affective), skills and behaviours (psychomotor), and group relationship abilities (social) along a learning continuum, within a purposely "scaffolded" curriculum of increasing complexity and varying contexts, environments, and knowledge (D'Eon, 2005). An important next step for these academic units is to map out the most appropriate timing of each IPLO along a learning continuum within its respective longitudinal curriculum. Memorial University's evaluation of its embedded IPE learning continuum is of particular relevance (Curran et al., 2010): in a longitudinal analysis of students' attitudes towards IPE and teamwork and their satisfaction with the IPE curriculum, no significant improvements were detected over the three-year study period. Reflecting on the process component of Freeth and Reeves's (2004) 3P model of learning and on Olson and Bialocerkowski's (2014, p. 242) reconceptualization of IPE as a "process within a system," the varied teaching strategies, the diversity of professions represented in each IPLO, and the unspecified level of interactivity within the summed instructional contact hours detailed in the Memorial evaluation make it difficult to determine which factors could be improved. We suggest that the use of a standardized tool evaluating the attributes of IPLOs may ease the identification of relevant pedagogical targets.

The PIPES score is proposed as one standardized, novel dimension of an educational intervention within the nonlinear and complex dynamic of IPE. Analogous to a "pack years" calculation, which quantifies smoking load and is recognized as a strong predictor of negative health outcomes (Masters & Tutt, 2007–2013), the concept of "PIPES hours," quantifying the hours and attributes of IP learning, may, with further validation, be one potentially significant predictor of collaborative competence, the desired educational outcome of an IPE curriculum. Further, the kind of feedback provided by the PIPES can help guide the design, implementation, and evaluation of IPE curricula. In our setting, instructors initially received feedback (i.e., their scores) after providing information about an IPLO that had already been completed. We observed, however, that several instructors went on to use the PIPES as a guideline when developing new opportunities or modifying existing ones, and we anticipate that, in the future, the PIPES framework will be helpful in informing process and outcome evaluations.

Concluding Comments

This longitudinal survey provides valuable information about the changes seen in the IP learning opportunities offered at the UofM between 2008 and 2012 and suggests that a strategic approach to the implementation and diffusion of IPE using an adoption model framework had some influence on these changes. Although some academic units were offering considerably more opportunities for IPE than others, it was encouraging to learn that all of the responding academic units reported involvement in at least one IPLO and that the PIPES scores of IPLOs offered at the UofM had increased between the two survey years. The need for more strategic development and implementation of IPLOs along a learning continuum, especially within clinical practice environments, was identified.
The PIPES served as a useful tool for monitoring IPLOs over time and for informing course instructors on how to better align a given IPLO with educational theory and IPE principles. Further, in terms of the longitudinal evaluation of an IPE curriculum's effectiveness in achieving its desired educational outcome of collaborative competence, we propose (pending further validation) the concept of "PIPES hours" as one potentially important predictor variable.

Acknowledgements

The authors would like to acknowledge Dr. David Collins, Dean, Faculty of Pharmacy, University of Manitoba, for financial support to conduct and analyze the baseline survey data, and Ms. Britt Kural for her contributions to the 2009 Inventory QI report. The contributions of the over 200 IP Faculty Ambassadors, Facilitators, Working Group Chairs, and members are greatly appreciated. The authors would also like to acknowledge Achini Weeraratne for her technical support in the preparation of this manuscript.

References

Accreditation of Interprofessional Health Education [AIPHE]. (2009). Accreditation of interprofessional health education (AIPHE) principles and practices for integrating interprofessional education into the accreditation standards for six health professions in Canada. Retrieved from http://www.cihc.ca/files/resources/public/English/AIPHE%20Interprofessional%20Health%20Education%20Accreditation%20Standards%20Guide_EN.pdf

Aston, S., Mackintosh, S., & Orzoff, J. (2010). Interprofessional education program, Western University of Health Sciences. Journal of Allied Health, 39, 137–138.

Blue, A. V. (2010). Creating Collaborative Care (C3), Medical University of South Carolina. Journal of Allied Health, 39, 127–128.

Canadian Interprofessional Health Collaborative [CIHC]. (2010). The National Interprofessional Competency Framework. Retrieved from http://www.cihc.ca/files/CIHC_IPCompetencies_Feb1210.pdf

Centre for the Advancement of Interprofessional Education [CAIPE]. (2002). The definition and principles of interprofessional education. Retrieved from http://caipe.org.uk/about-us/the-definition-and-principles-of-interprofessional-education/

Cook, D. (2004). Models of interdisciplinary learning. Ottawa, ON: Health Canada.

Cooper, H., Braye, S., & Geyer, R. (2004). Complexity and interprofessional education. Learning in Health and Social Care, 3, 179–189.

Curran, V. R., Sharpe, D., Flynn, K., & Button, P. (2010). A longitudinal study of the effect of an interprofessional education curriculum on student satisfaction and attitudes towards interprofessional teamwork and education. Journal of Interprofessional Care, 24, 41–52.

D'Amour, D., & Oandasan, I. (2005). Interprofessionality as the field of interprofessional practice and interprofessional education: An emerging concept. Journal of Interprofessional Care, 19(S1), 8–20.

D'Eon, M. (2004). Interdisciplinary learning: Principles and methods. Ottawa, ON: Health Canada.

D'Eon, M. (2005). Blueprint for interprofessional learning. Journal of Interprofessional Care, 19(S1), 49–59.

Finlayson, M., Egan, M., & Black, C. (1999). Secondary analysis of survey data: A research method with potential for occupational therapists. Canadian Journal of Occupational Therapy, 66, 83–91.
Freeth, D., & Reeves, S. (2004). Learning to work together: Using the presage, process, product (3P) model to highlight decisions and possibilities. Journal of Interprofessional Care, 18, 43–56.

Grymonpre, R. E., Ateah, C., Dean, H., Heinonen, T., Holmqvist, M., MacDonald, L., … Wener, P. (2016). Sustainable implementation of interprofessional education using an adoption model framework. Canadian Journal of Higher Education.

Health Force Ontario. (2010). Implementing interprofessional care in Ontario: Final report of the interprofessional care strategic implementation committee. Retrieved from http://www.healthforceontario.ca/UserFiles/file/PolicymakersResearchers/ipcfinal-report-may-2010-en.pdf

Knowles, M. (1980). The modern practice of adult education: From pedagogy to andragogy. Chicago, IL: Follett Publishing.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development (Vol. 1). Englewood Cliffs, NJ: Prentice-Hall.

Lawlis, T. R., Anson, J., & Greenfield, D. (2014). Barriers and enablers that influence sustainable interprofessional education: A literature review. Journal of Interprofessional Care, 28, 305–310.

MacKenzie, D., & Merritt, B. K. (2013). Making space: Integrating meaningful interprofessional experiences into an existing curriculum. Journal of Interprofessional Care, 27, 274–276.

Masters, N., & Tutt, C. (2007–2013). Smoking pack years calculator. Retrieved from http://smokingpackyears.com/

Oandasan, I., & Reeves, S. (2005). Key elements for interprofessional education. Part 1: The learner, the educator, and the learning context. Journal of Interprofessional Care, 19(S1), 21–38.

Olson, R., & Bialocerkowski, A. (2014). Interprofessional education in allied health: A systematic review. Medical Education, 48, 236–246. doi:10.1111/medu.12290

Reeves, S., Perrier, L., Goldman, J., Freeth, D., & Zwarenstein, M. (2013). Interprofessional education: Effects on professional practice and healthcare outcomes (update). Cochrane Database of Systematic Reviews, 2013(3), 1–47. doi:10.1002/14651858.CD002213.pub3

University of Manitoba Interprofessional Education Initiative. (2014). Tools and resources. Retrieved from http://umanitoba.ca/programs/interprofessional/tools/index.html

University of Toronto. (2009). Points for Interprofessional Education Score (PIPES). Retrieved from http://umanitoba.ca/programs/interprofessional/media/Report___Inventory_2009_Ver_6.pdf

Wagner, S. J., Langlois, S., Lowe, M., & Simmons, B. (2009, October). Building an interprofessional education curriculum: Core competency and learning activity integration. Paper presented at the Collaborating Across Borders II IPE Conference, Halifax, NS.

Zwarenstein, M., Goldman, J., & Reeves, S. (2009). Interprofessional collaboration: Effects of practice-based interventions on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews, 2009(3), 1–3. doi:10.1002/14651858.CD000072.pub2

Contact Information

Ruby Grymonpre
College of Pharmacy
Rady Faculty of Health Sciences
University of Manitoba
[email protected]

Ruby Grymonpre, BSc (Pharm), PharmD, FCSHP, is a professor, College of Pharmacy, Rady Faculty of Health Sciences, University of Manitoba.
She builds on over 25 years of experience in geriatric pharmacy; her current scholarly work relates to interprofessional education (IPE) and interprofessional collaborative practice (IPC), with particular interests in program evaluation, IP practice education, adoption model frameworks, and health human resource outcomes. She served a seven-year term as the IPE coordinator for the UofM and as a co-chair for the Accreditation of Interprofessional Health Education project. Most recently, she was appointed to the board of the Canadian Interprofessional Health Collaborative and to the World Coordinating Committee on IPE.

Heather Dean, MD, FRCPC, graduated from medicine at Queen's University in Kingston, Ontario, and pursued paediatrics and paediatric endocrinology training in Montréal, Ottawa, and Winnipeg. She was a full-time academic clinician in Winnipeg for 33 years and retired in June 2015 as Professor of Pediatrics. She held many leadership positions in the Department of Pediatrics and the Faculty of Medicine, University of Manitoba. Her major research areas are type 2 diabetes in children, the transition of adolescents with diabetes to adult care, and interprofessional education.

Maria James is a program evaluator with Manitoba Agriculture, Food and Rural Development. She was previously employed as a research technician with the University of Manitoba Interprofessional Education Initiative. She holds a master's degree in Family Social Sciences (now Community Health Sciences) from the University of Manitoba.

Pamela Wener, PhD(c), OT Reg. (MB), is an associate professor in the Department of Occupational Therapy, College of Rehabilitation Sciences, Rady Faculty of Health Sciences, University of Manitoba. She has been involved in the development of an interprofessional education curriculum at the University of Manitoba and is one of five academics appointed to the Faculty of Health Sciences, Office of Interprofessional Collaboration. She is a doctoral candidate studying interprofessional relationships within collaborative mental health care in primary care settings.

A. Elizabeth Ready is a professor in the Faculty of Kinesiology and Recreation Management and the director of the multidisciplinary Applied Health Sciences doctoral program at the University of Manitoba. She obtained her PhD in exercise physiology at the University of Alberta. Her research focuses on the health benefits of physical activity in older adults, the integration of physical activity into the primary healthcare system, and age-friendly communities. Her involvement in curriculum development led to her scholarly interest in interprofessional education and collaborative practice.

Laura MacDonald, RDH, MEd, PhD(c), is an associate professor with the School of Dental Hygiene, College of Dentistry, Rady Faculty of Health Sciences, University of Manitoba. She has been involved in the development of an interprofessional education curriculum at the UofM and is one of five academics appointed to the Office of Interprofessional Collaboration.
Maxine Holmqvist received her PhD in clinical psychology from the University of Saskatchewan, where she was a founding member of the Student Wellness Initiative Toward Community Health, a student-run interprofessional health clinic located in a core neighbourhood of Saskatoon. She is currently an assistant professor and coordinator for interprofessional education in the Department of Clinical Health Psychology, College of Medicine, at the University of Manitoba, where she is the theme lead for health psychology/behavioural medicine in the undergraduate medical curriculum. She teaches and conducts much of her research collaboratively and frequently acts as a mentor for interprofessional student groups.

Moni Fricke, PhD, MSc, BMR(PT), is an instructor with the Department of Physical Therapy, College of Rehabilitation Sciences, Rady Faculty of Health Sciences, University of Manitoba. She has been involved in interprofessional curriculum development for pre-licensure health professional students as well as for faculty and clinicians since 2006. She is a licensed physical therapist and holds a doctorate from the University of Manitoba.