Veterinary Educator Teaching and Scholarship (VETS): A Case Study of a Multi-Institutional Faculty Development Program to Advance Teaching and Learning

Gordon-Ross, Paul; Smith, Martin H.; et al.
In: Journal of Veterinary Medical Education, Vol. 47 (2020-11-01), pp. 632-646

Content expertise in basic science and clinical disciplines does not assure proficiency in teaching. Faculty development to improve teaching and learning is essential for the advancement of veterinary education. The Consortium of West Region Colleges of Veterinary Medicine established the Regional Teaching Academy (RTA) with the focus of "Making Teaching Matter." The objective of the RTA's first effort, the Faculty Development Initiative (FDI), was to develop a multi-institutional faculty development program for veterinary educators to learn about and integrate effective teaching methods. In 2016, the Veterinary Educator Teaching and Scholarship (VETS) program was piloted at Oregon State University's College of Veterinary Medicine. This article presents a case study evaluation of the VETS program. We describe the VETS program, participants' perceptions, participants' teaching method integration, and lessons learned. A modified Kirkpatrick Model (MKM) was used to categorize program outcomes and impacts. Quantitative data are presented as descriptive statistics, and qualitative data are presented as the themes that emerged from participant survey comments and post-program focus groups. Results indicated outcomes and impacts that included participants' perceptions of the program, changes in participant attitudes toward teaching and learning, an increase in the knowledge level of participants, self-reported changes in participant behaviors, and changes in practices and structure at the college level. Lessons learned indicate that the following are essential for program success: (1) providing institutional and financial support; (2) creating a community of practice (COP) of faculty development facilitators; and (3) developing a program that addresses the needs of faculty and member institutions.

Keywords: faculty development; multi-institutional collaboration; case study

Introduction

Professional development in teaching and student learning is vital to the advancement of veterinary education.[1] In the first half of the 1900s, it was assumed that proficiency in a specific domain assured expertise in teaching it.[2],[3] For the most part, faculty development prior to the 1960s provided faculty time and resources to improve their content knowledge.[2],[3] In 1998, Wilkerson and Irby indicated that while there may be an association between teaching ability and content expertise, teaching is a separate skill.[4] Multiple reviews of the literature have demonstrated the effectiveness of faculty development in medical education in improving educator teaching and student learning.[4]–[12] Until recently, few articles had been published on formalized faculty development in United States veterinary programs that focuses on teaching and student learning specific to veterinary education.[5] Veterinary education in Europe has made great strides in this area with the development of programs such as the Royal Veterinary College's MSc in Veterinary Education (https://www.rvc.ac.uk/study/postgraduate/veterinary-education).

Authors in the area of faculty development across the health care professions, such as Behar-Horenstein et al.,[13] Bell,[5] Chan et al.,[14] Lane and Strand,[15] Steinert et al.,[10] and Steinert et al.,[11] have indicated that there is little preparation of educators in the health care fields for their roles as teachers. The lack of preparation is a common occurrence across higher education but can be particularly problematic when training students for professional careers in health care that require clinical skills.[16] In medical education, the model "see one, do one, teach one" has been a common training technique for students and is often applied to educators whose responsibilities are to train health care professionals.[16],[17] Medical educators often take these positions of teaching with little to no training in the area of education. They are equipped with their experience and expertise in a specific service that "governs how they will instruct and what students need to know, but are often ill-prepared to know how to communicate the information or skills to their students."[18](p.57)

There is evidence that some faculty perceive that they are unprepared for their teaching role and desire more faculty development in the area of teaching and curriculum.[19],[20] In human medical education, multiple authors, such as Leslie et al.,[8] Steinert et al.,[10] and Steinert et al.,[11] have argued that the ever-increasing complexity of care and delivery, new teaching approaches, and the competing demands make faculty development particularly necessary in the health care field. Therefore, professional development opportunities are essential to support inexperienced instructors and provide current resources for even the most experienced educators. Steinert[21] has defined faculty development as "all activities health professionals pursue to improve their knowledge, skills, and behaviors as teachers and educators, leaders and managers, and researchers and scholars, in both individual and group settings."[21](p.4) Steinert and Mann[1] have asserted that faculty development is crucial to promoting "academic excellence, educational innovation, and professional growth of the individual and the institution."[1](p.322) Even with the identified need for robust faculty development in health care education, it has been noted that there is a scarcity of publications describing faculty development in veterinary education and a lack of publications on the effectiveness of the programs that do exist.[5]

In 2011, the deans from five veterinary colleges of the western region of the United States [Colorado State University (CSU), Oregon State University (OSU), University of California, Davis (UCD), Washington State University (WSU), and Western University of Health Sciences (WUHS)] met to identify areas of common interest in which collaboration would allow improved efficiency and effectiveness in addressing institutional needs across the consortium.[22],[23] With the aid of a corporate partner, Zoetis (https://www.zoetis.com), the Consortium of West Region Colleges of Veterinary Medicine's Regional Teaching Academy (RTA) (https://teachingacademy.westregioncvm.org) was formed. The motto of the RTA is "Making Teaching Matter," and its mission is to provide a forum for members of the consortium to "collaborate to develop, implement, and sustain best practices in veterinary and medical/biomedical education in their colleges, and to establish veterinary medical educator/biomedical educator as a valued career track."[22]–[24] In 2013, the first RTA Biennial Conference was held at OSU in Corvallis, Oregon. The primary goals of the conference were to (1) provide faculty development in the area of teaching and learning, (2) allow for networking of faculty across the institutions, (3) allow attendees to present their scholarly efforts in education and educational research, and (4) identify initiatives for which to form working groups.[25] During the conference, two working initiatives were formed: the Faculty Development Initiative (FDI) and the External Peer Review of Teaching Initiative (EPRTI).[25]

The initial goals of the FDI were "to provide opportunities and resources for training and mentoring of ... faculty with a focus on instructional issues and methods" and to develop a multi-institutional faculty development program to improve teaching.[25] The initial program created from this effort was to be a Teaching and Learning Boot Camp that would "provide a solid grounding in the basic principles and current understanding of teaching and learning." To inform the development of the program and ensure it met the needs of the potential participants, a needs assessment survey was distributed to all member institutions in June 2015. The survey deployed was modified from an instrument initially developed for determining the needs of the Western University of Health Sciences College of Veterinary Medicine (WesternU CVM) preceptors using an explanatory mixed-methods process. The process began with focus group sessions of six to eight individuals from each of the stakeholder groups: students, preceptors, faculty, and administration. The emergent themes were used to develop a needs assessment survey for delivery to the WesternU CVM preceptor population. This base survey was then modified to ensure it also captured the needs of the basic science disciplines as well as the clinical sciences. All faculty involved with the teaching of veterinary students, including college administration, were invited to participate: basic science educators, clinical educators, and those with home departments either within or outside the college of veterinary medicine. There were 160 respondents to the survey request. Not all participants answered every question. Of the respondents, 38.0% (57/150) were members of the RTA.
The top 10 topics indicated as "useful" to "very useful" were (1) incorporating innovative teaching techniques; (2) giving effective feedback; (3) using feedback to inform your teaching; (4) fostering/instilling intrinsic motivation; (5) filtering content (not trying to teach too much); (6) setting expectations; (7) aligning course objectives, activities, and assessment; (8) course design; (9) exam question writing; and (10) dealing with difficult students (Figure 1).

Graph: Figure 1: All faculty involved with the teaching of veterinary students, regardless of discipline, department, or setting, were invited to participate. There were 160 respondents to the survey request. Not all participants answered every question.

Based on the needs assessment information, the program was developed to include sessions on understanding yourself as a teacher, understanding your students, course design and development, assessment and evaluation, delivering and receiving feedback, and accessing resources and support. The program goal was to "improve the consistency and quality of student education through the use of evidence-based best educational practices."[24] The final program was designated the Veterinary Educator Teaching and Scholarship (VETS) program.[23],[26],[27] The program was aligned with effective strategies for faculty development as described in the literature, including "evidence-informed educational design," "relevant content," "experiential learning and opportunities for practice and application," "opportunities for feedback and reflection," "educational projects," "intentional community building," "longitudinal program design," and "institutional support."[10]

Connections and progression in the development phase of the VETS program were advanced by face-to-face meetings among the FDI members. These meetings occurred approximately every 12 months; that is, at each RTA Biennial Conference and at initiative meetings held in years during which the Biennial Conference was not held. Interim meetings were conducted via Zoom Video Communications, Inc.'s videoconferencing platform (https://zoom.us/). These Zoom videoconference meetings were held approximately every 2 to 3 weeks when an upcoming VETS program was to be delivered and every 4 to 8 weeks when no immediate programs were planned. As part of the development, portions of the program were delivered during the 2015 RTA Biennial Conference held at WSU in Pullman, Washington.[23],[26] This pilot delivery of segments of the full program provided essential data for program refinement. After this initial delivery of a segment of the program, the pilot VETS program was presented at OSU on September 9 and 10, 2016.[23],[26] Since the delivery of the pilot VETS program at OSU, the program has been designated VETS 1.0 and has been delivered prior to the RTA Biennial Conference in 2017 (at CSU in Fort Collins, Colorado) and in 2019 (at UCD in Davis, California).

The overarching purpose of this article is to present a formative evaluation of the pilot delivery of the VETS program at OSU in September 2016. The aims of this program evaluation were (1) to explore the outcomes and impacts of the program at the participant institutions, and (2) to explore the lived experiences of the participants of the pilot VETS program. The research questions were:

  • What were the self-reported outcomes and impacts of the pilot VETS program at the individual faculty and institutional levels?
  • What were the experiences of the pilot VETS program participants?
Methods

Study Design

The study used a quantitative and qualitative case study design to conduct a formative program evaluation similar to the process described by Balbach.[28] Program evaluations using case study methodology explore what occurred during the delivery of the program, the outcomes and impacts of the program on the participant (intended and unintended), and how the outcomes and impacts are linked to the program.[28]

A single case study was used to investigate the research questions through embedded units of analysis. The single case of investigation is the RTA pilot delivery of the VETS program. A case study methodology was chosen to provide a rich understanding of the program from the perspective of the participants and because it is well suited to understanding what occurred during the delivery of the program.[28] A case study approach allows for the exploration of a "phenomenon within its real-life context."[29](p.23) Both quantitative and qualitative data can be collected, and these multiple sources of evidence increase the validity of the findings and reveal diverse perspectives.[30] This provides for triangulation to answer the research questions through the use of evidence from multiple sources to corroborate the findings.[29]

The VETS program outcomes and impacts were categorized using a modified Kirkpatrick Model (MKM) of program evaluation. Kirkpatrick's model of program evaluation,[31] which has been used and refined by multiple authors,[6],[9]–[12],[32](p.212) was used as a taxonomic instrument to categorize the outcomes and impacts of the program. Kirkpatrick's model categorizes outcomes and impacts from participant perceptions of the program to broader institutional-level outcomes. The MKM framework used in this study is based on the modifications described by Freeth et al.,[6] used by Steinert et al.,[11] modified by Steinert et al.,[10] and modified by Stes et al.[12] (Table 1).

Table 1: Modified Kirkpatrick's Model (MKM) levels of program outcomes and impacts

Level | Change | Description
MKM 1 | REACTION | Perceptions of the learning experience, its organization, presentation, content, teaching methods, and quality of instruction
MKM 2A | LEARNING (change in attitudes) | Change in attitudes of participants toward teaching and learning
MKM 2B | LEARNING (changes in knowledge) | Acquisition of knowledge of concepts, procedures, and principles
MKM 3A | BEHAVIOR (self-reported change in behaviors) | Self-reported change in behavior (i.e., application of new knowledge and skills)
MKM 3B | BEHAVIOR (observed change in behaviors) | Observed changes in behavior (i.e., application of new knowledge and skills)
MKM 4A | RESULTS (change in the system/organizational practice) | Observed changes in the organization attributable to the educational program
MKM 4B | RESULTS (change among the participants' students, residents, or colleagues) | Observed improvement in student or resident learning/performance as a result of the educational intervention

* Kirkpatrick's model of training program outcomes and impacts[31] as adapted and further refined by Freeth et al.,[6] Steinert et al.,[11] and Steinert et al.[10]

Participants

The theoretical population and the accessible population were the same and consisted of all attendees of the pilot VETS program (N = 25). Participants were solicited by RTA members at their respective institutions. Early career educators were targeted; however, the final decision on participants was made by each institution's administration. The participants included 4 from WUHS, 5 from WSU, 4 from CSU, 8 from OSU, and 4 from UCD. The sampling strategy for the quantitative portion of the study constituted a convenience sample. All participants were invited to complete surveys prior to, during, and after the program, thus comprising a retrospective pre-test/post-test. The sampling strategy for the qualitative portion of the study was purposive.

The data presented in this article were collected as part of a formative evaluation of the pilot VETS program, and thus constitute previously collected data. The study was reviewed and approved by the Institutional Review Board of Western University of Health Sciences (# X18/IRB/011).

Data Collection

For this case study, a multipronged, mixed-methods approach was used, and data were collected from multiple embedded instruments, including periodic entrance and exit surveys in which participants recorded information at the request of the presenter, face-to-face focus groups and interviews, a post-test survey, and a retrospective pre-test/post-test program survey. The entrance and exit surveys were designed to gather the participants' self-reported prior and post-session knowledge and understanding of the various topics presented during the program. The post-program survey was delivered using Qualtrics survey software[1] and consisted of Likert scale and open-ended questions, providing both quantitative and qualitative data (Appendix 1). Additionally, questions regarding knowledge of specific topics were presented in a retrospective pre-/post- format in a process similar to that described by Drennan and Hyde.[33]

Follow-up semi-structured interviews and focus groups were conducted via a Zoom videoconference session 4 months after the conclusion of the program in order to gather additional qualitative data via focus group questions (Appendix 2). These data provided a better understanding of the long-term impacts of the program and allowed for further exploration of the participants' experiences, perceptions, and understandings of the sessions and material.

Data Analysis

Quantitative data were analyzed and descriptive statistics were calculated. Inferential statistics were not performed because there was no desire to make inferences about the theoretical population. The qualitative data from the end-of-program survey, the entrance and exit surveys, and the interviews and focus groups were reviewed, and major themes were determined by the consensus of two authors (ABW and MHS).

Pilot VETS Program

The primary goal of the FDI was to design a faculty development program for the benefit of educators, with relevance to those early in their career as faculty. The program was designed based on the findings of Steinert et al.,[10] who identified key features of effective faculty development programs. These key features include "evidence-informed educational design, relevant content, experiential learning, feedback and reflection, educational projects, intentional community building, longitudinal program design, and institutional support."[10](p.1) The pilot VETS program was constructed to foster a culture of evidence-based best practices in teaching by providing professional development in (1) the principles of teaching and learning, (2) outcomes and student assessment, and (3) approaches to student–teacher interactions.

The overarching goal of the pilot VETS program was to advance teaching practices to support student learning. To address this, modules were presented to introduce, demonstrate, and model effective methods in teaching for participants in both clinical and pre-clinical curricula. The pilot VETS program learning objectives were as follows:

  • Participants will be able to list and identify educational concepts and effective teaching skills for both the clinical and didactic setting.
  • Participants will be able to summarize their perspectives on teaching and learning and identify how their perspectives influence their teaching.
  • Participants will be able to recognize, classify, and apply diverse methods of instruction.
  • Participants will be able to apply the concepts of giving and receiving feedback to improve teaching and learning.
  • Participants will be able to understand and apply the concepts of communities of practice to advance their ongoing professional development as an educator.

The 2-day program was delivered at OSU on September 9 and 10, 2016, and was facilitated by 10 members of the FDI using a variety of strategies to model best practices in teaching. All 10 facilitators were experienced educators, with most of them having had advanced training in education ranging from certifications to doctorate degrees in education. Before the program, participants were asked to reflect on their teaching and identify an area of their teaching that they wanted to focus on during the program. Additionally, participants were instructed to outline an educational project that would allow them to apply the concepts they would be learning. Participant projects were to be situated within the context of their current teaching responsibilities to provide a tangible product that could be shared with other faculty members.

During the program, participants were introduced to theories of teaching and learning, perspectives on teaching and learning approaches to instructional design, strategies for assessment, and the importance of using feedback to strengthen instruction; see the Program Agenda: Planned (Appendix 3) and Program Agenda: Delivered (Appendix 4). Sessions were designed to immerse participants in multiple areas of education. Following are brief descriptions of the topics and content introduced during each session.

Learning Theories

Participants were immersed in the concepts of constructivism and experiential learning using an active learning process in which attendees engaged in an activity that involved constructing a flying apparatus. Through this exercise, participants were able to actively solve a problem and later reflect on their experience. This session was designed to immerse participants in constructivist principles, a shift that has occurred in medical education as "the focus is not an objective external reality, but more an active responsibility of the learner to construct their own knowledge based on previous experience, perceptions, and knowledge."[34](p.61)

Teaching Perspectives Inventory

Participants explored their perspectives on teaching and learning using the Teaching Perspectives Inventory (TPI, http://www.teachingperspectives.com/tpi/) developed by Pratt et al.[35],[36] The TPI is an instrument used to measure an educator's teaching profile based on five perspectives.[37] These five perspectives are categorized as follows: transmission (effective delivery of content), apprenticeship (modeling ways of being), developmental (cultivating ways of thinking), nurturing (facilitating self-efficacy), and social reform (seeking a better society). Each perspective is broken into three sub-scores related to (1) beliefs (about teaching), (2) intentions (to accomplish), and (3) actions (instructional settings). In the pilot VETS program, this instrument helped participants explore and reflect on their intentions, beliefs, and actions related to teaching and learning.[36],[38]

Curriculum Design

Participants were introduced to backward design as a tool to inform course and curricular design. Participants engaged in the alignment of learning goals, objectives, and teaching activities by first focusing on three concrete tasks: (1) identifying the desired outcomes of the course or session, (2) identifying the acceptable evidence to determine if the desired outcomes have been achieved, and (3) designing learning activities to aid learners in attaining the desired outcomes.[39]

Giving and Receiving Feedback

Participants explored the process and premise behind feedback. This session allowed them to expand their understanding of giving and receiving feedback in veterinary education in the classroom, clinic, and the workplace. Key aspects of effective feedback were presented, including the need for feedback that is timely, specific, and that focuses on actions that can be improved upon.[40],[41]

Communities of Practice (COPs)

Participants were immersed in a community of practice (COP) while participating in the pilot VETS program. For the pilot VETS program, a COP was considered "a persistent, sustained social network of individuals who share and develop an overlapping knowledge base, set of beliefs, values, history and experiences focused on a common practice and/or mutual enterprise."[42](p.5) The use of COPs as a forum for reinforcing and building on concepts presented was explored. Lieberman and Mace[43] indicated that COPs have "become a worldwide focus for teacher learning."[43](p.79) COPs represent a mode of professional development whereby educators are working with other educators, rather than experts working on educators.[44]

Documenting Your Teaching

Participants were exposed to the process of documenting their growth and achievements in teaching. A representative from the EPRTI presented a set of instruments developed to document aspects of Boyer's scholarship of teaching.[45](p.147) Topics, content, and activities presented in the pilot VETS program were referenced throughout this presentation to demonstrate how participation in the program and the individual outcomes from the program could be documented to illustrate scholarship in teaching.

Reflective Practice

Participants experienced aspects of reflective practice and its documentation as a form of scholarly teaching (see the preceding subsection, Documenting Your Teaching). Multiple opportunities were provided for discussion about the content of sessions and reflection on how the lessons learned could inform future teaching; this included the processes of reflecting in practice and reflecting on practice.[46]–[48]

To demonstrate reflection in and on practice during the pilot VETS program, the facilitators informed the participants that they would gather information and feedback throughout the program and adjust the itinerary of the remaining sessions based on the needs of the participants. The first reflection occurred during lunch on day one, September 9, 2016, when the facilitators briefly discussed the progression of the program and determined there was no need to alter the current course of action. The same conclusion was reached during a second reflection exercise conducted after the first day.

During opening comments on day two, a small group of participants expressed an elevated level of anxiety and frustration. The program facilitators quickly adjusted the opening session for the second day to address the concerns (see Appendix 4). Although not explicitly stated, the consensus of the presenters was that the frustration came from the program's heavy focus on theory rather than on the application of concepts with examples that participants could immediately implement at their home institutions. In response to these concerns, the original session for day two was modified by tying in the previous day's session on writing learning objectives to learning activities and assessment, and using a concrete example to clarify the application of principles of course alignment. The remainder of the program was delivered as planned.

Outcomes and Impacts

The formative assessment of the pilot delivery of the VETS program focused on multiple aspects of the program, ranging from participants' enjoyment and engagement in the program to impacts on participants' skills, behaviors, and attitudes that were sustained over time. Thus, the data were examined for evidence of changes at multiple levels of the modified Kirkpatrick framework of outcomes and impacts (Table 1).[6],[10]–[12] Outcomes and impacts of the pilot VETS program included MKM levels 1, 2A, 2B, 3A, and 4A.

Usefulness and Intent to Implement VETS Program Topics

After the VETS program, participants were asked to provide feedback on their experiences via the post-program survey. The survey response rate was 88% (22/25). Participants were asked to rate how valuable 12 of the VETS topics (Appendix 1 and Appendix 4) were on a 5-point Likert scale (ranging from 1 = not valuable to 5 = very valuable). The top five topics participants identified as valuable to very valuable were (1) alignment between goals, instruction, and assessment (86.4%, 19/22); (2) writing learning objectives (86.4%, 19/22); (3) using student feedback to reflect on teaching (86.4%, 19/22); (4) assessment for learning (77.3%, 17/22); and (5) getting and giving feedback (72.7%, 16/22). Participants were asked to rate how likely they were to integrate something from the 12 topics presented during the pilot VETS program into their teaching during the Fall 2016 semester. They responded on a 5-point Likert scale (ranging from 1 = not likely to 5 = very likely). The top five topics participants indicated they were likely or very likely to integrate into their teaching were (1) alignment between goals, instruction, and assessment (86.4%, 19/22); (2) writing learning objectives (86.4%, 19/22); (3) using student feedback to reflect on teaching (86.4%, 19/22); (4) assessment for learning (77.3%, 17/22); and (5) communities of practice (72.7%, 16/22). These findings are consistent with outcomes and impacts at MKM level 1 (Table 1).

Participant Knowledge of Educational Concepts and Pedagogy

Participants reported that the pilot VETS program supported gains in their knowledge of educational concepts and pedagogy consistent with MKM outcome and impact levels 2A and 2B (Table 1). This included advances in their understanding of assessment as a tool for learning, strategies for promoting student-centered learning, writing learning objectives, and a broader awareness of the body of literature supporting best practices in teaching and learning. For example, one participant explained:

During my time at the [pilot VETS program], I began to see that education is an art form and a science. It helped me understand that I needed to better familiarize myself with pedagogy and how to teach. Yes, I have written a teaching philosophy statement, but did not notice at the time that what I really meant was quite generic.

Expanding on the idea of student-centered instruction, another participant stated:

[I gained] exposure to the education research and learned that there are reasons to change practice.

Reflecting on the role of the teacher in the classroom, one participant explained:

[The VETS program] has put [an emphasis on the] concept that I play a big role in student learning. I am realizing that I need to be more flexible as I teach.

These types of shifts in perspective regarding the role the educator plays in student learning were evident across participants, with 72.7% (16/22) of the participants reporting their understanding of their teaching perspective as "low" or "none" prior to participating in the program (Figure 2). After the program, participants reported gains in their understanding, with 86.4% (19/22) of participants describing their understanding of their teaching perspectives as "moderate," "high," or "advanced." Similarly, survey data demonstrated improvement in participants' understanding of assessment as a tool for learning (Figure 3). Twelve of the 22 participants (54.5%) reported their knowledge of assessment before the program as "low," while only 4.5% (1/22) reported their understanding of assessment as "low" after the program. Similar gains in understanding were demonstrated in the areas of student-centered learning, asking questions and using wait time, guided inquiry, and writing learning objectives. These findings are consistent with outcomes and impacts at MKM levels 2A and 2B (Table 1).

Figure 2: Participants' perceived retrospective pre-/post-program level of knowledge in the area of their teaching perspectives

Figure 3: Participants' perceived retrospective pre-/post-program level of knowledge in the area of assessment for learning

Participant Confidence to Explore Diverse Methods of Instruction

Participants reported that the pilot VETS program promoted confidence and empowered them to explore alternatives to didactic teaching. These findings are consistent with outcomes and impacts at MKM level 2A (Table 1). For example, one participant explained that one of the benefits of the pilot VETS program was:

The permission that it gave everyone to feel more [open] to experiment, and frankly, to fail. We try to encourage students to include failure, and we need to do the same thing as instructors. It might fall flat, but that is okay.

Another participant reported:

I took a risk—regardless of the format of the evaluation, if you don't do the follow-up and discuss it at great length, you lose a lot of learning. So, I flipped the exam, so they were taking it as a take-home and then at the scheduled exam time they went over the exam ... [S]tudent feedback was largely positive.

Participants explained that they were motivated to reduce content and add alternative teaching methods, to incorporate "high risk" teaching strategies because they are the "things that students have loved," and that they have become more flexible in giving students "the option to determine what they want to do." These types of approaches resulted in "all kinds of interesting talks and topics."

Evidence for participant confidence and empowerment was apparent from participant self-reports of their intent to integrate a wide variety of the topics, concepts, and strategies explored during the VETS program into their teaching (Table 2).

Table 2: Likelihood of integrating teaching strategies by participants

Topic: Percent*
Alignment between goals, instruction, and assessments: 86%
Wait time/asking questions: 86%
Writing learning objectives: 82%
Reflecting on your teaching/student feedback: 82%
Assessment for learning: 82%
Communities of practice: 82%
Getting and giving feedback: 77%
Student-centered learning: 77%
External review of teaching/writing a teaching vitae: 73%
Teaching perspectives: 55%
Learning cycle: 55%
Guided inquiry/funneling: 50%

* Percentage of responses rating the topic as 4 = likely or 5 = very likely on a 5-point Likert scale (1 = not likely, 5 = very likely).
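The percentages in Table 2 are a "top-two-box" summary: the share of Likert responses at 4 or 5. As a minimal sketch, the calculation can be reproduced as below; the ratings list is hypothetical (the raw survey responses are not published here), and the function name is ours.

```python
def top_two_box_percent(ratings, threshold=4):
    # Share of Likert ratings (1-5) at or above the threshold
    # (4 = likely, 5 = very likely), rounded to a whole percent.
    n_top = sum(1 for r in ratings if r >= threshold)
    return round(100 * n_top / len(ratings))

# Hypothetical responses from 22 participants for one topic
ratings = [5, 4, 4, 5, 3, 4, 5, 4, 2, 5, 4, 4,
           3, 5, 4, 4, 5, 3, 4, 5, 4, 2]
print(top_two_box_percent(ratings))  # 17 of 22 ratings are 4 or 5 -> 77
```

With 22 respondents per topic, each response shifts the percentage by roughly 4.5 points, which is why the table values cluster at 50%, 55%, 73%, 77%, 82%, and 86%.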

Changes Catalyzed in Actual Classroom, Clinic, and Laboratory Teaching

Participant confidence and empowerment to explore alternatives to didactic teaching resulted in concrete changes to actual classroom, clinic, and laboratory teaching. These findings are indicated by self-reported changes in behavior (MKM level 3A, Table 1).

For example, one participant integrated student-driven/flipped classroom teaching methods into a large existing basic science lecture course. After reviewing the pertinent details of a case, students were asked to work in groups to choose one or two antimicrobials that "would be sensible for use" and be prepared to defend their choices. Students then shared their ideas via clicker-type software and their ideas were used to explore and expand on the subject during the lecture portion of the class.

A second participant rewrote her course objectives and aligned all her assessment items to the new objectives. During this process, she reconsidered each objective in terms of (1) content, and (2) verb, and ensured that assessment items matched the objectives in terms of these two items. This process resulted in a spreadsheet capturing this alignment for an entire course.

A third participant revised the entire structure of a rounds-type course to address differences in student background and expertise, as a result of having both clinical neurology and anatomic pathology residents in the same course. The course was modified so that instead of interpreting slides in front of the entire class, residents could review slides at a multi-headed microscope in a low-stress, casual environment. In this manner, the basic aspects of lesions could be discussed without (perceived) fear of judgment from anatomic pathology residents. The participant stated:

This has been immensely successful, as the feedback I have received from residents has been uniformly positive, and has even stimulated regular attendance by neurology faculty, which was rare previously.

Other concrete actions taken by participants included revising syllabi, writing teaching philosophies, and redesigning course materials.

Support for the Creation of Authentic Communities of Practice

Participants reported that the pilot VETS program supported their understanding of the concept of COPs, with nearly all participants reporting limited understanding at the start of the program and moderate-to-high understanding after the program (Figure 4). This indicates outcomes and impacts at MKM level 2A (Table 1).

Figure 4: Participants' perceived retrospective pre-/post-program level of knowledge in the area of communities of practice

During the program, participants described working with educators from multiple institutions as a highlight of their experience. Participants felt that these interactions supported them in expanding on and exploring ideas and concepts developed during the program. They also reported relief in learning that they were not alone in the challenges they face at their own institutions. One participant explained that she benefited from "talking to other groups about what they are teaching, sharing of ideas, and realizing that we have the same problems."

Participants also described the benefits of the COPs that formed at their home institutions following the VETS program. Although participants found it difficult to meet frequently with their entire COP, they felt that sharing ideas and updating their colleagues on changes they were making to their classes was beneficial. At one institution, participants developed a COP focused on issues related to teaching first-year veterinary students. The invitation to participate was extended to all faculty teaching within the first year, with the goal being to "support each other when difficulties arise, to re-engage the passion of faculty, and let us get to know each other." These findings are consistent with changes at the college level, namely the formation of a COP at a minimum of one institution (MKM level 4A, Table 1).

Long-Term Program Topics Implementation

Follow-up focus group interviews with participants, conducted via Zoom videoconference in November and December 2016, revealed that many participants incorporated VETS program topics into their teaching. Concepts incorporated included (1) creating lecture and laboratory objectives that align with assessments, (2) integrating a wider variety of question types into assessments, (3) incorporating varied student response options (e.g., electronic voting, whiteboards), (4) implementing alternative assessment strategies (e.g., partial group assessments), (5) soliciting feedback from students immediately following lectures, and (6) shifting from teacher-driven to student-driven instruction (e.g., flipping the classroom). These findings are consistent with outcomes and impacts at MKM levels 2A, 2B, and 3A (Table 1).

Many participants described how the pilot VETS program motivated them to try innovative approaches in their classrooms:

[The program] sparked the desire that I want to keep going, I want to get better.

I feel empowered and motivated to help students.

There is a science to educating people.

I was struck with the concept that interaction with students is better than just talking at them.

Just because I have a specific lecturing style that I prefer does not mean that I need to stay loyal to that—especially with active learning.

The benefits of interacting with peers from across institutions were acknowledged by most participants, and many indicated that the program resulted in:

... the opportunity to break into small groups and to hear their opinions, challenges, and solutions.

... a feeling of solidarity. I'm not in this all by myself; I'm not the only one with these concerns.

Lessons Learned

Designing and implementing a faculty development program is a high-effort task, and one for which there is a high need in veterinary education.[5] The lessons learned from the FDI, the development of the VETS program, and the delivery of the program fall into three categories: (1) the challenges of securing financial and institutional support, (2) the challenges of creating a cohesive group of faculty spread across five distant and distinct institutions, and (3) the challenges of developing a program that addresses the needs of all member institutions.

Financial and administrative support is critical for the success of such an endeavor. The partnership with Zoetis, in addition to the financial support of the member institutions, was crucial in ensuring the FDI had the time and resources to develop the VETS program. One of the greatest challenges for FDI members, as for all members of the RTA, is balancing the time demands of their primary positions at their respective institutions with the time required to be a productive member of the RTA. At this time, no standardized faculty full-time equivalent (FTE) is allotted for RTA activities. This produces a tension between time commitments for individual faculty and highlights the need for administrators at RTA member institutions to create structural changes in their reward systems. As Bolman and Deal indicate,[49](p.526) any significant organizational change, such as the increased emphasis on teaching and learning currently taking place as part of the RTA, should be accompanied by structural changes at the institutional level.

Some of these needed structural changes are being addressed by the two other RTA initiatives, the EPRTI and the more recently developed Local Peer Observation of Teaching Initiative (LPOTI). The EPRTI has created "an evidence-based template for a teaching dossier"[22] that allows educators to document and highlight their teaching activities, emphasizing a component of the promotion and tenure requirements that has traditionally received little consideration in veterinary education. The LPOTI has developed instruments for peer observation of teaching based on best practices. These instruments can be used to obtain both formative and summative feedback that helps educators improve and enhance teaching in a multitude of settings.
All three RTA initiatives work together to highlight evidence-based teaching and learning and the value it brings to institutions of veterinary education.

A multi-institutional working partnership, such as the FDI, is a difficult endeavor to create and sustain. One of the significant barriers to the success of such an endeavor is the distance between the FDI member institutions. The five original member institutions—CSU, OSU, WSU, UCD, and WUHS—are spread across four states, with the closest approximately 424 miles apart (UCD and WUHS) and the farthest 1,109 miles apart (WSU and WUHS). The group found it challenging to find time to meet and prioritize the work. During the program development phase, face-to-face meetings proved essential to creating and sustaining a cohesive group of faculty members. With the videoconferencing solutions currently available, virtual meetings are easy to organize and conduct, but the FDI members found that without the firm sociocultural foundation established in face-to-face meetings, virtual meetings were less effective in fostering the needed connections and, therefore, less productive. Face-to-face meetings also provided dedicated time for members to focus on program development. In the end, the FDI has succeeded in creating a COP across the five institutions, "a persistent, sustained social network of individuals who share and develop an overlapping knowledge base, set of beliefs, values, history, and experiences focused on a common practice and/or mutual enterprise."[50](p.495) In delivering the VETS program, FDI members engage in their own professional development while supporting that of others: educators working with other educators, rather than experts working on educators.[44]

Developing a program that meets the needs of faculty from all five institutions is also key to its success.[10],[11] Here, the needs assessment survey results were essential. Additionally, flexibility is vital when developing such a program. Feedback from the entrance and exit surveys allowed the facilitators to make adjustments in response to participants' learning experiences. The shift made on the second day of the program was critical to regaining momentum, interest, and trust from the participants. On-the-fly program modifications can be challenging and require facilitators who are knowledgeable in these varied aspects of teaching and learning. Thus, considerable attention must be given to selecting a diverse group of knowledgeable facilitators.

Results of this formative program evaluation are affected by the number of participants and how they were selected. The relatively small number of participants may limit the strength of the results; however, the goal was not to generalize the results to a theoretical population, but rather to use them to inform program improvement. The non-random, purposive selection of participants from each institution may have biased the results of the evaluation. While it may not be feasible to deliver the program using a random sample of participants, as the program grows it will be beneficial to continue to re-evaluate it for program improvement. While the retrospective pre-test/post-test design has its own limitations (e.g., recall bias, socially desirable responses), this self-report survey method is an efficient and inexpensive way to measure effectiveness.[33],[51]

Conclusions

In keeping with the purpose of this study, this formative program evaluation provided information about the lived experiences of the participants and the program's outcomes and impacts at the individual, department, and institutional levels. Additionally, it gathered formative feedback to inform continued programmatic improvement. Despite the challenges discussed, this article provides support for the position that multi-institutional faculty development is not only feasible but also effective in promoting changes at the college and institutional levels. Essential to the success of such an initiative are financial support and structural changes in the reward and incentive systems at individual institutions to ensure faculty are supported and rewarded. Additionally, success is fostered by the organic development of a COP of faculty development facilitators. Finally, the faculty development program must meet both the real and the perceived needs of participants.

Acknowledgments

The VETS program was the result of a strategic initiative of the Consortium of West Region Colleges of Veterinary Medicine: Colorado State University (CSU), Oregon State University (OSU), University of California, Davis (UCD), Washington State University (WSU), and Western University of Health Sciences (WUHS). Members of the Faculty Development Initiative contributed to the development and delivery of the VETS program. Those who contributed to the article are listed as authors. Members during the time when the pilot VETS program was delivered and data were collected include Karyn Bird (OSU), Beth Boynton (WUHS), Bonnie Campbell (WSU), Julie Cary (WSU), Munashe Chigerwe (UCD), Kristy Dowers (CSU), Laurie Fonken (CSU), Samantha Gizerian (WSU), Paul Gordon-Ross (WUHS), Suzie Kovacs (WUHS), Steve Lampa (WSU), Ohad Levi (WUHS), Peggy Schmidt (WUHS), Martin Smith (UCD), Jane Shaw (CSU), and Andrew West (CSU). Rachel Halsey (WSU) provided administrative support. Corporate partner Zoetis contributed to funding the VETS program.

Appendix 1: Post-Program Survey

Teaching Academy of the Consortium of West Region Colleges of Veterinary Medicine 2016 Pilot VETS Program Post-Program Survey

Question 1 – A number of topics have been discussed during the VETS Program. Please indicate how valuable you felt each of the topics was to you as a teacher on a scale of 1–5 (1 = not valuable, 5 = very valuable).

Table A1: 2016 pilot VETS Program post-program survey question 1, "Please indicate how valuable you felt each of the topics was to you as a teacher on a scale of 1–5 (1 = not valuable, 5 = very valuable)."

Response scale: 1 (not valuable), 2, 3, 4, 5 (very valuable)
Assessment for learning ○ ○ ○ ○ ○
Teaching perspectives ○ ○ ○ ○ ○
Student-centered learning ○ ○ ○ ○ ○
Wait time/asking questions ○ ○ ○ ○ ○
Guided inquiry/funneling ○ ○ ○ ○ ○
Reflecting on your teaching/student feedback ○ ○ ○ ○ ○
Learning cycle ○ ○ ○ ○ ○
Alignment between goals, instruction, and assessments ○ ○ ○ ○ ○
Writing learning objectives ○ ○ ○ ○ ○
Community of practice ○ ○ ○ ○ ○
Getting and giving feedback ○ ○ ○ ○ ○
External review of teaching/writing a teaching vitae ○ ○ ○ ○ ○
VETS = Veterinary Educator Teaching and Scholarship

Question 2 – Please indicate how likely you are to integrate ideas from the following topics into your teaching this academic year (1 = not likely, 5 = very likely).

Table A2: 2016 pilot VETS Program post-program survey question 2, "Please indicate how likely you are to integrate ideas from the following topics into your teaching this academic year (1 = not likely, 5 = very likely)."

Response scale: 1 (not likely), 2, 3, 4, 5 (very likely)
Assessment for learning ○ ○ ○ ○ ○
Teaching perspectives ○ ○ ○ ○ ○
Student-centered learning ○ ○ ○ ○ ○
Wait time/asking questions ○ ○ ○ ○ ○
Guided inquiry/funneling ○ ○ ○ ○ ○
Reflecting on your teaching/student feedback ○ ○ ○ ○ ○
Learning cycle ○ ○ ○ ○ ○
Alignment between goals, instruction, and assessments ○ ○ ○ ○ ○
Writing learning objectives ○ ○ ○ ○ ○
Community of practice ○ ○ ○ ○ ○
Getting and giving feedback ○ ○ ○ ○ ○
External review of teaching/writing a teaching vitae ○ ○ ○ ○ ○
VETS = Veterinary Educator Teaching and Scholarship

Question 3 – We recognize that for many of the topics that were discussed during the program, you likely had some level of understanding about the topic prior to the session. We hope that after the session, your level of understanding was strengthened in some way. Please help us understand how your experience in the VETS Program may or may not have impacted your understanding by circling responses below for each topic.

Table A3: 2016 pilot VETS Program post-program survey question 3, "Please help us understand how your experience in the VETS Program may or may not have impacted your understanding by circling responses below for each topic."

Response scales: Understanding BEFORE discussion and Understanding AFTER discussion, each rated None, Low, Moderate, High, or Advanced
Assessment for learning ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Teaching perspectives ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Student-centered learning ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Wait time/asking questions ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Guided inquiry/funneling ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Reflecting on your teaching/student feedback ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Learning cycle ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Alignment between goals, instruction, and assessments ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Writing learning objectives ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Community of practice ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
Getting and giving feedback ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
External review of teaching/writing a teaching vitae ○ ○ ○ ○ ○ | ○ ○ ○ ○ ○
VETS = Veterinary Educator Teaching and Scholarship

Question 4 – Please provide any ideas you have for how the VETS Program might be able to support you in your teaching during this academic year?

Question 5 – During the VETS Program, faculty presenters conducted sessions using a number of different strategies, activities, formats, etc. Please let us know which approaches, if any, you found to be the most helpful.

Question 6 – Please list any topics that were not discussed during the VETS Program that you would have liked us to address?

Question 7 – Identify your breakout group in which you participated.

□ Curriculum Development    □ Clinical/Preceptorship

Question 8 – Describe what benefits, if any, you derived from your participation in this group.

Question 9 – Additional Comments:

Appendix 2: Focus Group Questions

VETS Program Focus Group Questions

Fall 2016

  • What are some of the key takeaways from the VETS program?
  • Are there some concrete outcomes or philosophical shifts that resulted from your participation in the VETS program?
  • Looking back, what were the most helpful components of the VETS program?
  • Can you talk about some of the values of the Community of Practice at your home institution?
  • Are there any other impacts that the VETS program had on your teaching this semester?
Appendix 3: Program Agenda—Planned

Friday, September 9, 2016

• 8:00 AM Welcome and Introductions

• 8:30 AM Module 1

  • Learning Theories – Constructivism and Experiential Learning
  • Teaching Perspectives Inventory – Strength Training Discussion (pre-workshop homework)
  • Understanding Yourself as a Teacher – Reflective Practice, Inquiry and Experiential Learning, and Pedagogical Approach
  • Understanding Your Students – Generations, Student Development, and Learning Styles
  • Module 1 Wrap-up – Outcomes for the session
  • 11:30 AM Application Activity – Project Work
• 12:00 PM LUNCH

• 1:00 PM Reflection and Discussion

• 1:30 PM Module 2

  • Learning Objectives to Instructional Methods – Writing Learning Objectives
  • Learning Opportunities/Instructional Design – Design/Alignment, Objectives, Technology, Other Resources, Working Efforts on Projects (small group work), and Strength Training Discussion (pre-workshop homework)
  • Module 2 Wrap-up – Outcomes for this session

• 5:00 PM Social gathering

Saturday, September 10, 2016

• 8:00 AM Module 2

  • Assessment: Beyond Multiple Choice – Theory and Methods, Biases, Strengths and Weaknesses of Assessment Forms, Assessment Technology
  • 11:00 AM Delivering and Receiving Feedback – Feedback from students
• 12:00 PM LUNCH

• 1:00 PM Reflection

• 1:30 PM RTA Initiatives: External Review of Teaching and Local Peer Review

• 2:30 PM Work on Project

• 5:00 PM END OF SESSIONS

Appendix 4: Program Agenda—Delivered

Friday, September 9, 2016

• 8:00 AM WELCOME AND INTRODUCTIONS

• 8:30 AM Module 1

  • Learning Theories – Constructivism and Experiential Learning
  • Teaching Perspectives Inventory – Strength Training Discussion (pre-workshop homework)
  • Understanding Yourself as a Teacher – Reflective Practice, Inquiry and Experiential Learning, and Pedagogical Approach
  • Understanding Your Students – Generations, Student Development, and Learning Styles
  • Module 1 Wrap-up – Outcomes for the session
  • 11:30 AM Application Activity – Project Work
• 12:00 PM LUNCH

• 1:00 PM Reflection and Discussion

• 1:30 PM Module 2

  • Learning Objectives to Instructional Methods – Writing Learning Objectives
  • Learning Opportunities/Instructional Design – Design/Alignment, Objectives, Technology, Other Resources, Working Efforts on Projects (small group work), and Strength Training Discussion (pre-workshop homework)
  • Module 2 Wrap-up – Outcomes for this session
• 5:00 PM END OF SESSIONS

• 5:00 PM Social gathering

Saturday, September 10, 2016

• 8:00 AM OPENING COMMENTS

• 8:15 AM Module 2 (continued)

  • Backwards Design – Design/Alignment, Outcomes, Objectives, and Activities
  • Assessment: Beyond Multiple Choice – Theory and Methods, Biases, Strengths and Weaknesses of Assessment Forms, Assessment Technology
  • Delivering and Receiving Feedback – Feedback from students and to students
• 12:00 PM LUNCH

• 1:30 PM RTA Initiatives: External Review of Teaching and Local Peer Review

• 2:30 PM Group discussion of projects in the CoP groups

• 4:30 PM Lessons Learned

• 5:00 PM END OF SESSIONS

Conflict of Interest

The authors have no conflicts of interest to declare.

Note 1: Qualtrics International Inc., Provo, UT, USA, http://www.qualtrics.com

References
1. Steinert Y, Mann K. Faculty development: principles and practices. J Vet Med Educ. 2006;33(3):317–24. https://doi.org/10.3138/jvme.33.3.317. Medline:17035200
2. Gaff JG, Simpson RD. Faculty development in the United States. Innov High Educ. 1994;18:167–76. https://doi.org/10.1007/BF01191111
3. Lewis KG. Faculty development in the United States: a brief history. Int J Acad Dev. 1996;1(2):26–33. https://doi.org/10.1080/1360144960010204
4. Wilkerson L, Irby DM. Strategies for improving teaching practices: a comprehensive approach to faculty development. Acad Med. 1998;73(4):387–96. https://doi.org/10.1097/00001888-199804000-00011. Medline:9580715
5. Bell CE. Faculty development in veterinary education: are we doing enough (or publishing enough about it), and do we value it? J Vet Med Educ. 2013;40(2):96–101. https://doi.org/10.3138/jvme.0113-022R. Medline:23709106
6. Freeth D, Hammick M, Koppel I, et al. A critical review of evaluations of interprofessional education. Occasional paper no. 2. London, UK: LTSN Centre for Health Sciences and Practice; 2002.
7. Hendricson WD, Anderson E, Andrieu SC, et al. Does faculty development enhance teaching effectiveness? J Dent Educ. 2007;71(12):1513–33. Medline:18096877
8. Leslie K, Baker L, Egan-Lee E, et al. Advancing faculty development in medical education: a systematic review. Acad Med. 2013;88(7):1038–45. https://doi.org/10.1097/ACM.0b013e318294fd29. Medline:23702523
9. Sorinola OO, Thistlethwaite JE. A systematic review of faculty development activities in family medicine. Med Teach. 2013;35(7):e1309–18. https://doi.org/10.3109/0142159X.2013.770132. Medline:23464818
10. Steinert Y, Mann K, Anderson B, et al. A systematic review of faculty development initiatives designed to enhance teaching effectiveness: a 10-year update: BEME guide no. 40. Med Teach. 2016;38(8):769–86. https://doi.org/10.1080/0142159X.2016.1181851
11. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME guide no. 8. Med Teach. 2006;28(6):497–526. https://doi.org/10.1080/01421590600902976. Medline:17074699
12. Stes A, Min-Leliveld M, Gijbels D, et al. The impact of instructional development in higher education: the state-of-the-art of the research. Educ Res Rev. 2010;5(1):25–49. https://doi.org/10.1016/j.edurev.2009.07.001
13. Behar-Horenstein LS, Zafar MA, Roberts KW. Impact of faculty development on physical therapy professors' beliefs. J Fac Dev. 2012;26(2):37–46.
14. Chan LK, Yang J, Irby DM. Application of the one-minute preceptor technique by novice teachers in the gross anatomy laboratory. Anat Sci Educ. 2015;8(6):539–46. https://doi.org/10.1002/ase.1515. Medline:25573139
15. Lane IF, Strand E. Clinical veterinary education: insights from faculty and strategies for professional development in clinical teaching. J Vet Med Educ. 2008;35(3):397–406. https://doi.org/10.3138/jvme.35.3.397. Medline:19066357
16. Merritt C. Jack of all trades, masters of one? West J Emerg Med. 2018;19(1):7–10. Epub 2017 Dec 5. https://doi.org/10.5811/westjem.2017.10.36890. Medline:29383049
17. Hashimoto DA, Bynum WE, Lillemoe KD, et al. See more, do more, teach more: surgical resident autonomy and the transition to independent practice. Acad Med. 2016;91(6):757–60. https://doi.org/10.1097/ACM.0000000000001142. Medline:26934694
18. Glicken AD, Merenstein GB. Addressing the hidden curriculum: understanding educator professionalism. Med Teach. 2007;29(1):54–7. https://doi.org/10.1080/01421590601182602. Medline:17538835
19. Forbes MO, Hickey MT, White J. Adjunct faculty development: reported needs and innovative solutions. J Prof Nurs. 2010;26(2):116–24. https://doi.org/10.1016/j.profnurs.2009.08.001. Medline:20304379
20. Haden NK, Chaddock M, Hoffsis GF, et al. Preparing faculty for the future: AAVMC members' perceptions of professional development needs. J Vet Med Educ. 2010;37(3):220–32. https://doi.org/10.3138/jvme.37.3.220. Medline:20847330
21. Steinert Y, editor. Faculty development in the health professions: a focus on research and practice. New York, NY: Springer; 2014.
22. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. The Teaching Academy of the Consortium of West Region Colleges of Veterinary Medicine [homepage on the Internet]. Pullman, WA: Teaching Academy Consortium of West Region CVM; 2019 [cited 2019 Jul 8]. https://teachingacademy.westregioncvm.org
23. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. 2015 executive report [Internet]. In: Proceedings of the 2nd Biennial Summer Conference: Building a culture of excellence in teaching & learning; 2015 Jul 22–24; Pullman, WA. Pullman, WA: Teaching Academy Consortium of the West Region CVM; 2015 [cited 2019 Jul 8]. Available from: https://s3.wp.wsu.edu/uploads/sites/1358/2018/06/2015%5fBiennial_Summer-Conference.pdf
24. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. Teaching Academy newsletter. Pullman, WA: Teaching Academy Consortium of the West Region CVM; 2014 [cited 2019 Jul 8]. Available from: https://s3.wp.wsu.edu/uploads/sites/1358/2018/06/2014_RTA-Fall-Newsletter.pdf
25. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. 2013 Summer Conference: Making the Teaching Academy matter; 2013 Jul 24–26; Corvallis, OR.
26. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. Teaching Academy newsletter. Pullman, WA: Teaching Academy Consortium of the West Region CVM; 2015 [cited 2019 Jul 8]. Available from: https://s3.wp.wsu.edu/uploads/sites/1358/2018/06/2015_RTA-Sping-Newsletter.pdf
27. Teaching Academy Consortium of West Region Colleges of Veterinary Medicine. 2018 executive summary. Pullman, WA: Teaching Academy Consortium of the West Region CVM; 2018 [cited 2019 Jul 8]. Available from: https://s3.wp.wsu.edu/uploads/sites/1358/2018/08/RTA_2018_Executive_Summary_final.pdf
28. Balbach ED. Using case studies to do program evaluation [Internet]. Sacramento, CA: California Department of Health Services; 1999 Mar [cited 2019 Jul 8]. Available from: https://www.betterevaluation.org/en/resources/guide/using%5fcase%5fstudies%5fprogram%5fevaluation
29. Yin RK. Case study research and applications: design and methods. 6th ed. Los Angeles, CA: Sage Publishing; 2018.
30. Creswell JW. Qualitative inquiry and research design: choosing among five approaches. 3rd ed. Thousand Oaks, CA: Sage Publishing; 2013.
31. Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs: the four levels. 3rd ed. San Francisco, CA: Berrett-Koehler; 2006.
32. Thackwray B. Effective evaluation of training and development in higher education. London, UK: Kogan Page; 1997.
33. Drennan J, Hyde A. Controlling response shift bias: the use of the retrospective pre-test design in the evaluation of a master's programme. Assess Eval High Educ. 2008;33(6):699–709. https://doi.org/10.1080/02602930701773026
34. Mann KV. Theoretical perspectives in medical education: past experience and future possibilities. Med Educ. 2011;45(1):60–8. https://doi.org/10.1111/j.1365-2923.2010.03757.x. Medline:21155869
35. Pratt DD. Good teaching: one size fits all? New Dir Adult Contin Educ. 2002;2002(93):5–16. https://doi.org/10.1002/ace.45
36. Pratt DD, Sadownik L, Selinger SJ. Pedagogical BIASes and clinical teaching in medicine. In: English LM, editor. Adult education and health. Toronto, ON: University of Toronto Press; 2012. p. 193–209. https://doi.org/10.3138/9781442685208
37. Collins JB, Pratt DD. The Teaching Perspectives Inventory at 10 years and 100,000 respondents: reliability and validity of a teacher self-report inventory. Adult Educ Q. 2011;61(4):358–75. https://doi.org/10.1177/0741713610392763
38. Pratt DD, Collins JB, Selinger SJ. Development and use of the Teaching Perspectives Inventory (TPI). In: Program from the Annual Meeting of the American Educational Research Association (AERA); 2001 Apr 10–14; Seattle, WA. Washington, DC: AERA; 2001 [cited 2019 Jul 8].
39. Wiggins GP, McTighe J. What is backwards design? In: McTighe J, editor. Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development; 1998.
40. Adams CL, Kurtz S. Coaching and feedback: enhancing communication teaching and learning in veterinary practice settings. J Vet Med Educ. 2012;39(3):217–28. https://doi.org/10.3138/jvme.0512-038R. Medline:22951457
41. Cantillon P, Sargeant J. Giving feedback in clinical settings. BMJ. 2008;337:a1961. https://doi.org/10.1136/bmj.a1961. Medline:19001006
42. Barab SA, MaKinster J, Scheckler R. Characterizing system dualities: building online community. In: Barab SA, Kling R, Gray JH, editors. Designing for virtual communities in the service of learning. Cambridge, UK: Cambridge University Press; 2004.
43. Lieberman A, Mace DP. Making practice public: teacher learning in the 21st century. J Teach Educ. 2010;61(1–2):77–88. https://doi.org/10.1177/0022487109347319
44. Wenger E, McDermott RA, Snyder W. Cultivating communities of practice: a guide to managing knowledge. Boston, MA: Harvard Business School Press; 2002.
45. Boyer EL. Scholarship reconsidered: priorities of the professoriate. 1st ed. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching; 1990.
46. Fleck R. Rating reflection on experience: a case study of teachers' and tutors' reflection around images. Interact Comput. 2012;24(6):439–49. https://doi.org/10.1016/j.intcom.2012.07.003
47. Moon JA. Reflection in learning and professional development: theory and practice. London, UK: Kogan Page; 1999.
48. Mann K, Gordon J, MacLeod A. Reflection and reflective practice in health professions education: a systematic review. Adv Health Sci Educ. 2009;14(4):595–621. https://doi.org/10.1007/s10459-007-9090-2
49. Bolman LG, Deal TE. Reframing organizations: artistry, choice, and leadership. 5th ed. San Francisco, CA: Jossey-Bass; 2013.
50. Barab SA, Barnett M, Squire K. Developing an empirical account of a community of practice: characterizing the essential tensions. J Learn Sci. 2002;11(4):489–542. https://doi.org/10.1207/S15327809JLS1104_3
51. Geldhof GJ, Warner DA, Finders JK, et al. Revisiting the utility of retrospective pre-post designs: the need for mixed-method pilot data. Eval Program Plann. 2018;70:83–9. https://doi.org/10.1016/j.evalprogplan.2018.05.002. Medline:30029016

By Paul N. Gordon-Ross; Suzie J. Kovacs; Rachel L. Halsey; Andrew B. West and Martin H. Smith


Paul N. Gordon-Ross, DVM, MS, is Director of Year 4 Curriculum and Associate Professor of Equine and Small Animal General Medicine, Western University of Health Sciences College of Veterinary Medicine, 309 E Second Street, Pomona, CA 91766 USA.

Suzie J. Kovacs, MSc, PhD, is Assistant Professor of Epidemiology, Western University of Health Sciences College of Veterinary Medicine, 309 E Second Street, Pomona, CA 91766 USA.

Rachel L. Halsey, DVM, is Academic Coordinator, Washington State University College of Veterinary Medicine, PO Box 647010, Pullman, WA 99164 USA.

Andrew B. West, MEd, PhD, is Director of the Academy for Teaching and Learning, Colorado State University College of Veterinary Medicine and Biomedical Sciences, 1601 Campus Delivery, Fort Collins, CO 80523 USA.

Martin H. Smith, MS, EdD, is Specialist in Cooperative Extension, University of California, Davis, Population Health and Reproduction and Human Ecology, 3213 Vet Med 3B, Davis, CA 95616 USA.

Title:
Veterinary Educator Teaching and Scholarship (VETS): A Case Study of a Multi-Institutional Faculty Development Program to Advance Teaching and Learning
Authors / Contributors: Gordon-Ross, Paul; Smith, Martin H.; Kovacs, Suzie J.; Halsey, Rachel L.; West, Andrew B.
Journal: Journal of Veterinary Medical Education, Vol. 47 (2020-11-01), pp. 632-646
Publication: University of Toronto Press Inc. (UTPress), 2020
ISSN: 1943-7218 (electronic); 0748-321X (print)
DOI: 10.3138/jvme-2019-0089
Keywords:
  • Program evaluation
  • Veterinary medicine
  • Faculty, Medical
  • Teaching method
  • Qualitative property
  • Education
  • Community of practice
  • Surveys and Questionnaires
  • Animals
  • Humans
  • Staff Development
  • Fellowships and Scholarships
  • Program Development
  • General Veterinary
  • Teaching
  • Knowledge level
  • General Medicine
  • Faculty
  • Focus group
  • Scholarship
  • Faculty development
  • Education, Veterinary
  • Psychology
  • Program Evaluation
Other:
  • Indexed in: OpenAIRE

