Content expertise in basic science and clinical disciplines does not assure proficiency in teaching. Faculty development to improve teaching and learning is essential for the advancement of veterinary education. The Consortium of West Region Colleges of Veterinary Medicine established the Regional Teaching Academy (RTA) with the focus of "Making Teaching Matter." The objective of the RTA's first effort, the Faculty Development Initiative (FDI), was to develop a multi-institutional faculty development program for veterinary educators to learn about and integrate effective teaching methods. In 2016, the Veterinary Educator Teaching and Scholarship (VETS) program was piloted at Oregon State University's College of Veterinary Medicine. This article presents a case study evaluation of the VETS program. We describe the VETS program, participants' perceptions, participants' teaching method integration, and lessons learned. A modified Kirkpatrick Model (MKM) was used to categorize program outcomes and impact. Quantitative data are presented as descriptive statistics, and qualitative data are presented as the themes that emerged from participant survey comments and post-program focus groups. Results indicated outcomes and impacts that included participants' perceptions of the program, changes in participant attitude toward teaching and learning, an increase in the knowledge level of participants, self-reported changes in participant behaviors, and changes in practices and structure at the college level. Lessons learned indicate that the following are essential for program success: financial and administrative support, structural changes to institutional reward and incentive systems, the development of a community of practice among facilitators, and a program that meets both the real and the perceived needs of participants.
Keywords: faculty development; multi-institutional collaboration; case study
Professional development in teaching and student learning is vital to the advancement of veterinary education.
Authors writing on faculty development across the health care professions, such as Behar-Horenstein et al., have emphasized the need for formal training in teaching.
There is evidence that some faculty perceive that they are unprepared for their teaching role and desire more faculty development in the area of teaching and curriculum.
In 2011, the deans from five veterinary colleges of the western region of the United States [Colorado State University (CSU), Oregon State University (OSU), University of California, Davis (UCD), Washington State University (WSU), and Western University of Health Sciences (WUHS)] met to identify areas of common interest in which collaboration would allow improved efficiency and effectiveness in addressing institutional needs across the consortium.
The initial goals of the FDI were "to provide opportunities and resources for training and mentoring of ... faculty with a focus on instructional issues and methods" and to develop a multi-institutional faculty development program to improve teaching.
Figure 1: All faculty involved with the teaching of veterinary students, regardless of discipline, department, or setting, were invited to participate. There were 160 respondents to the survey request. Not all participants answered every question.
Based on the needs assessment information, the program was developed to include sessions on understanding yourself as a teacher, understanding your students, course design and development, assessment and evaluation, delivering and receiving feedback, and accessing resources and support. The program goal was to "improve the consistency and quality of student education through the use of evidence-based best educational practices."
Connections and progression in the development phase of the VETS program were advanced by face-to-face meetings among the FDI members. These meetings occurred approximately every 12 months; that is, at each RTA Biennial Conference and at initiative meetings held in years during which the Biennial Conference was not held. Interim meetings were conducted via Zoom Video Communications, Inc.'s videoconferencing platform (https://zoom.us/). These Zoom videoconference meetings were held approximately every 2 to 3 weeks when an upcoming VETS program was to be delivered, and every 4 to 8 weeks when no immediate programs were planned. As part of the development, portions of the program were delivered during the 2015 RTA Biennial Conference held at WSU in Pullman, Washington.
The overarching purpose of this article is a formative evaluation of the pilot delivery of the VETS program at OSU in September 2016. The purpose of this program evaluation was to address the following research questions:
- What were the self-reported outcomes and impacts of the pilot VETS program at the individual faculty and institutional levels?
- What were the experiences of the pilot VETS program participants?
The study used a quantitative and qualitative case study design to conduct a formative program evaluation similar to the process described by Balbach.
The single case study was used to investigate the research questions through embedded units of analysis. The single case of investigation is the RTA pilot delivery of the VETS program. A case study methodology was chosen to provide a rich understanding of the program from the perspective of the participants and because it is well suited to understanding, in context, what happened during the program.
The VETS program outcomes and impacts were categorized using a modified Kirkpatrick Model (MKM) of program evaluation. Kirkpatrick's four-level model of training program evaluation was adapted by subdividing levels 2 through 4, yielding the levels shown in Table 1.
Table 1: Modified Kirkpatrick's Model (MKM) levels of program outcomes and impacts
MKM 1 (REACTION): Perceptions of the learning experience, its organization, presentation, content, teaching methods, and quality of instruction
MKM 2A (LEARNING, change in attitudes): Change in attitudes of participants toward teaching and learning
MKM 2B (LEARNING, change in knowledge): Acquisition of knowledge of concepts, procedures, and principles
MKM 3A (BEHAVIOR, self-reported change in behaviors): Self-reported change in behavior (i.e., application of new knowledge and skills)
MKM 3B (BEHAVIOR, observed change in behaviors): Observed changes in behavior (i.e., application of new knowledge and skills)
MKM 4A (RESULTS, change in the system/organizational practice): Observed changes in the organization attributable to the educational program
MKM 4B (RESULTS, change among the participants' students, residents, or colleagues): Observed improvement in student or resident learning/performance as a result of the educational intervention
* Adapted from Kirkpatrick's model of training program outcomes and impacts.
The theoretical population and the accessible population were the same and consisted of all attendees of the pilot VETS program (N = 25). Participants were solicited by RTA members at their respective institutions. Early career educators were targeted; however, the final decision on participants was made by each institution's administration. The participants included 4 from WUHS, 5 from WSU, 4 from CSU, 8 from OSU, and 4 from UCD. The sampling strategy for the quantitative portion of the study constituted a convenience sample. All participants were invited to complete surveys prior to, during, and after the program, thus comprising a retrospective pre-test/post-test. The sampling strategy for the qualitative portion of the study was purposive.
The data presented in this article were collected as part of a formative evaluation of the pilot VETS program, and thus constitute previously collected data. The study was reviewed and approved by the Institutional Review Board of Western University of Health Sciences (# X18/IRB/011).
For this case study, a multipronged, mixed-method approach was used and data were collected from multiple embedded instruments, including periodic entrance and exit surveys for which participants recorded information at the request of the presenter, face-to-face focus groups and interviews, a post-test survey, and a retrospective pre-test/post-test program survey. The entrance and exit surveys were designed to gather the participants' self-reported prior and post-session knowledge and understanding of the various topics presented during the program. The post-program survey was delivered using Qualtrics survey software.
Follow-up semi-structured interviews and focus groups were conducted via a Zoom videoconference session 4 months after the conclusion of the program in order to gather additional qualitative data via focus group questions (Appendix 2). These data provided a better understanding of the long-term impacts of the program and allowed for further exploration of the participants' experiences, perceptions, and understandings of the sessions and material.
Quantitative data were analyzed and descriptive statistics were calculated. Inferential statistics were not performed because there was no desire to make inferences about the theoretical population. The qualitative data from the end-of-program survey, entrance and exit surveys, and the interview and focus group sessions were reviewed, and major themes were determined by the consensus of two authors (ABW and MHS).
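The descriptive summaries used in this kind of evaluation amount to simple tabulations of categorical responses. The sketch below illustrates the approach with hypothetical retrospective pre-/post- ratings; the data and the `describe` helper are illustrative only and are not part of the study's analysis.

```python
from collections import Counter

# Categories follow the retrospective pre-test/post-test survey scale.
LEVELS = ["none", "low", "moderate", "high", "advanced"]

def describe(responses):
    """Return (count, percentage) per category for a list of ratings."""
    counts = Counter(responses)
    n = len(responses)
    return {level: (counts[level], 100.0 * counts[level] / n) for level in LEVELS}

# Hypothetical pre- and post-program self-ratings for one topic.
pre = ["none", "low", "low", "low", "moderate", "low"]
post = ["moderate", "high", "moderate", "high", "advanced", "moderate"]

for label, responses in (("pre", pre), ("post", post)):
    summary = describe(responses)
    print(label, {lvl: f"{c} ({p:.1f}%)" for lvl, (c, p) in summary.items()})
```

Because the goal is program improvement rather than inference, counts and percentages of this kind are sufficient; no significance testing is implied.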
The primary goal of the FDI was to design a faculty development program for the benefit of educators, with relevance to those early in their career as faculty. The program was designed based on the findings of Steinert et al., whose systematic review identified key features of faculty development initiatives that improve teaching effectiveness.
The overarching goal of the pilot VETS program was to advance teaching practices to support student learning. To address this, modules were presented to introduce, demonstrate, and model effective methods in teaching for participants in both clinical and pre-clinical curricula. The pilot VETS program learning objectives were as follows:
- Participants will be able to list and identify educational concepts and effective teaching skills for both the clinical and didactic setting.
- Participants will be able to summarize their perspectives on teaching and learning and identify how their perspectives influence their teaching.
- Participants will be able to recognize, classify, and apply diverse methods of instruction.
- Participants will be able to apply the concepts of giving and receiving feedback to improve teaching and learning.
- Participants will be able to understand and apply the concepts of communities of practice to advance their ongoing professional development as an educator.
The 2-day program was delivered at OSU on September 9 and 10, 2016, and was facilitated by 10 members of the FDI using a variety of strategies to model best practices in teaching. All 10 facilitators were experienced educators, most with advanced training in education ranging from certificates to doctoral degrees. Before the program, participants were asked to reflect on their teaching and identify an area of their teaching that they wanted to focus on during the program. Additionally, participants were instructed to outline an educational project that would allow them to apply the concepts they would be learning. Participant projects were to be situated within the context of their current teaching responsibilities to provide a tangible product that could be shared with other faculty members.
During the program, participants were introduced to theories of teaching and learning, perspectives on teaching and learning, approaches to instructional design, strategies for assessment, and the importance of using feedback to strengthen instruction; see the Program Agenda: Planned (Appendix 3) and Program Agenda: Delivered (Appendix 4). Sessions were designed to immerse participants in multiple areas of education. Following are brief descriptions of the topics and content introduced during each session.
Participants were immersed in the concepts of constructivism and experiential learning using an active learning process in which attendees engaged in an activity that involved constructing a flying apparatus. Through this exercise, participants were able to actively solve a problem and later reflect on their experience. This session was designed to immerse participants in constructivist principles, a shift that has occurred in medical education as "the focus is not an objective external reality, but more an active responsibility of the learner to construct their own knowledge based on previous experience, perceptions, and knowledge."
Participants explored their perspectives on teaching and learning using the Teaching Perspectives Inventory (TPI), an online self-assessment instrument that characterizes an educator's dominant perspectives on teaching.
Participants were introduced to backward design as a tool to inform course and curricular design. Participants engaged in the alignment of learning goals, objectives, and teaching activities by first focusing on three concrete tasks: identifying the desired results of instruction, determining what would count as acceptable evidence of learning, and planning learning experiences and instruction accordingly.
Participants explored the process and premise behind feedback. This session allowed them to expand their understanding of giving and receiving feedback in veterinary education in the classroom, clinic, and the workplace. Key aspects of effective feedback were presented, including the need for feedback that is timely, specific, and focused on actions that can be improved.
Participants were immersed in a community of practice (COP) while participating in the pilot VETS program. For the pilot VETS program, a COP was considered "a persistent, sustained social network of individuals who share and develop an overlapping knowledge base, set of beliefs, values, history and experiences focused on a common practice and/or mutual enterprise."
Participants were exposed to the process of documenting their growth and achievements in teaching. A representative from the EPRTI presented a set of instruments developed to document aspects of Boyer's scholarship of teaching.
Participants experienced aspects of reflective practice and its documentation as a form of scholarly teaching (see the preceding subsection, Documenting Your Teaching). Multiple opportunities were provided for discussion about the content of sessions and reflection on how the lessons learned could inform future teaching; this included the processes of reflecting in practice and reflecting on practice.
To demonstrate reflection in and on practice during the pilot VETS program, the facilitators informed the participants that they would be gathering information and feedback throughout the program and adjusting the itinerary of the remaining sessions based on the needs of the participants. The first reflection occurred during lunch on day one, September 9, 2016, when the facilitators briefly discussed the progression of the program and determined there was no need to alter the current course of action. The same conclusion was reached during a second reflection exercise conducted after the first day.
During opening comments on day two, a small group of participants expressed an elevated level of anxiety and frustration. The program facilitators quickly adjusted the opening session for the second day to address the concerns (see Appendix 4). Although not explicitly stated, the consensus of the presenters was that the frustration came from the program's heavy focus on theory rather than on the application of concepts with examples that participants could immediately implement at their home institutions. In response to these concerns, the original session for day two was modified by tying in the previous day's session on writing learning objectives to learning activities and assessment, and using a concrete example to clarify the application of principles of course alignment. The remainder of the program was delivered as planned.
The formative assessment of the pilot delivery of the VETS program focused on multiple aspects of the program, ranging from participants' enjoyment and engagement in the program to impacts on participants' skills, behaviors, and attitudes that were sustained over time. Thus, the data were examined for evidence of changes at multiple levels of the modified Kirkpatrick framework of outcomes and impacts (Table 1).
After the VETS program, participants were asked to provide feedback on their experiences via the post-program survey. The survey response rate was 88% (22/25). Participants were asked to rate how valuable 12 of the VETS topics (Appendix 1 and Appendix 4) were on a 5-point Likert scale (ranging from 1 = not valuable to 5 = very valuable). The top five topics participants identified as valuable to very valuable were (
Participants reported that the pilot VETS program supported gains in their knowledge of educational concepts and pedagogy consistent with MKM outcome and impact levels 2A and 2B (Table 1). This included advances in their understanding of assessment as a tool for learning, strategies for promoting student-centered learning, writing learning objectives, and a broader awareness of the body of literature supporting best practices in teaching and learning. For example, one participant explained:
During my time at the [pilot VETS program], I began to see that education is an art form and a science. It helped me understand that I needed to better familiarize myself with pedagogy and how to teach. Yes, I have written a teaching philosophy statement, but did not notice at the time that what I really meant was quite generic.
Expanding on the idea of student-centered instruction, another participant stated:
[I gained] exposure to the education research and learned that there are reasons to change practice.
Reflecting on the role of the teacher in the classroom, one participant explained:
[The VETS program] has put [an emphasis on the] concept that I play a big role in student learning. I am realizing that I need to be more flexible as I teach.
These types of shifts in perspective regarding the role the educator plays in student learning were evident across participants, with 72.7% (16/22) of the participants reporting their understanding of their teaching perspective as "low" or "none" prior to participating in the program (Figure 2). After the program, participants reported gains in their understanding, with 86.4% (19/22) of participants describing their understanding of their teaching perspectives as "moderate," "high," or "advanced." Similarly, survey data demonstrated improvement in participants' understandings of assessment as a tool for learning (Figure 3). Twelve of the 22 (54.5%) participants reported their knowledge of assessment before the program as "low," while only 4.5% (1/22) reported their understanding of assessment as "low" after the program. Similar gains in understanding were demonstrated in the areas of student-centered learning, asking questions and using wait time, guided inquiry, and writing learning objectives. These findings are consistent with outcomes and impacts at MKM levels 2A and 2B (Table 1).
Figure 2: Participants' perceived retrospective pre-/post- level of knowledge in the area of their teaching perspectives
Figure 3: Participants' perceived retrospective pre-/post- level of knowledge in the area of assessment for learning
Participants reported that the pilot VETS program promoted confidence and empowered them to explore alternatives to didactic teaching. These findings are consistent with outcomes and impacts at MKM level 2A (Table 1). For example, one participant explained that one of the benefits of the pilot VETS program was:
The permission that it gave everyone to feel more [open] to experiment, and frankly, to fail. We try to encourage students to include failure, and we need to do the same thing as instructors. It might fall flat, but that is okay.
Another participant reported:
I took a risk—regardless of the format of the evaluation, if you don't do the follow-up and discuss it at great length, you lose a lot of learning. So, I flipped the exam, so they were taking it as a take-home and then at the scheduled exam time they went over the exam ... [S]tudent feedback was largely positive.
Participants explained that they were motivated to reduce content and add alternative teaching methods, to incorporate "high risk" teaching strategies because they are the "things that students have loved," and that they have become more flexible in giving students "the option to determine what they want to do." These types of approaches resulted in "all kinds of interesting talks and topics."
Evidence for participant confidence and empowerment was apparent from participant self-reports of their intent to integrate a wide variety of the topics, concepts, and strategies explored during the VETS program into their teaching (Table 2).
Table 2: Likelihood of integrating teaching strategies by participants
Alignment between goals, instruction, and assessments: 86%
Wait time/asking questions: 86%
Writing learning objectives: 82%
Reflecting on your teaching/student feedback: 82%
Assessment for learning: 82%
Communities of practice: 82%
Getting and giving feedback: 77%
Student-centered learning: 77%
External review of teaching/writing a teaching vitae: 73%
Teaching perspectives: 55%
Learning cycle: 55%
Guided inquiry/funneling: 50%
* Percentage of responses rating the topic as 4 = likely or 5 = very likely on a 5-point Likert scale (ranging 1 to 5).
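The percentages in Table 2 are top-two-box proportions (the share of respondents answering 4 or 5). A minimal sketch of that calculation, using hypothetical ratings rather than the study's raw responses:

```python
def percent_likely(ratings, threshold=4):
    """Percent of 1-5 Likert ratings at or above threshold, rounded to a whole number."""
    if not ratings:
        return 0
    return round(100 * sum(1 for r in ratings if r >= threshold) / len(ratings))

# Hypothetical ratings for one topic from 22 respondents.
ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4, 3, 5, 4, 4, 5, 3, 4, 5, 4, 2, 5, 4]
print(f"{percent_likely(ratings)}% rated the topic 4 (likely) or 5 (very likely)")
```

Repeating this per topic yields a table of the same form as Table 2.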
Participant confidence and empowerment to explore alternatives to didactic teaching resulted in concrete changes to actual classroom, clinic, and laboratory teaching. These findings are indicated by self-reported changes in behavior (MKM level 3A, Table 1).
For example, one participant integrated student-driven/flipped classroom teaching methods into a large existing basic science lecture course. After reviewing the pertinent details of a case, students were asked to work in groups to choose one or two antimicrobials that "would be sensible for use" and be prepared to defend their choices. Students then shared their ideas via clicker-type software and their ideas were used to explore and expand on the subject during the lecture portion of the class.
A second participant rewrote her course objectives and aligned all her assessment items to the new objectives. During this process, she reconsidered each objective in terms of (
A third participant revised the entire structure of a rounds-type course to address differences in student background and expertise, as a result of having both clinical neurology and anatomic pathology residents in the same course. The course was modified so that instead of interpreting slides in front of the entire class, residents could review slides at a multi-headed microscope in a low-stress, casual environment. In this manner, the basic aspects of lesions could be discussed without (perceived) fear of judgment from anatomic pathology residents. The participant stated:
This has been immensely successful, as the feedback I have received from residents has been uniformly positive, and has even stimulated regular attendance by neurology faculty, which was rare previously.
Other concrete actions taken by participants included revising syllabi, writing teaching philosophies, and redesigning course materials.
Participants reported that the pilot VETS program supported their understanding of the concept of COPs, with nearly all participants reporting limited understanding at the start of the program and moderate-to-high understanding after the program (Figure 4). This indicates outcomes and impacts at MKM level 2B (Table 1).
Figure 4: Participants' perceived retrospective pre-/post- level of knowledge in the area of communities of practice
During the program, participants described working with educators from multiple institutions as a highlight of their experience. Participants felt that these interactions supported them in expanding on and exploring ideas and concepts developed from the program. They also reported relief in feeling that they were not alone in the challenges they face at their own institutions. One participant explained that she benefited from "talking to other groups about what they are teaching, sharing of ideas, and realizing that we have the same problems."
Participants also described the benefits of the COPs that formed at their home institutions following the VETS program. Although participants found it difficult to meet frequently with their entire COP, they felt that sharing ideas and updating their colleagues on changes that they were making to their classes was beneficial. At one institution, participants developed a COP focused on issues related to teaching first-year veterinary students. The invitation to participate was extended to all faculty teaching within the first year, with the goal being to "support each other when difficulties arise, to re-engage the passion of faculty, and let us get to know each other." These findings are consistent with the changes at the college level with the formation of a COP, at least at one institution (MKM level 4A, Table 1).
Follow-up focus group interviews with participants were conducted via Zoom videoconference in November and December 2016, and they revealed that many participants incorporated VETS program topics into their teaching. Concepts incorporated included (
Many participants described how the pilot VETS program motivated them to try innovative approaches in their classrooms:
[The program] sparked the desire that I want to keep going, I want to get better.
I feel empowered and motivated to help students.
There is a science to educating people.
I was struck with the concept that interaction with students is better than just talking at them.
Just because I have a specific lecturing style that I prefer does not mean that I need to stay loyal to that—especially with active learning.
The benefits of interacting with peers from across institutions were acknowledged by most participants and many indicated that the program resulted in:
... the opportunity to break into small groups and to hear their opinions, challenges, and solutions.
... a feeling of solidarity. I'm not in this all by myself; I'm not the only one with these concerns.
Designing and implementing a faculty development program is a high-effort task, one for which there is a high need in veterinary education.
Financial and administrative support is critical for the success of such an endeavor. The partnership with Zoetis, in addition to the financial support of the member institutions, was crucial in ensuring the FDI was provided the time and resources to develop the VETS program. One of the greatest challenges for the FDI members, as for all members of the RTA, is managing the time commitments of members' primary position duties at their respective institutions alongside the time commitment required to be a productive member of the RTA. At this time, there is no standardized faculty full-time equivalent (FTE) allotted for RTA activities. This produces a tension between time commitments for individual faculty and highlights the need for administrators at RTA member institutions to create structural changes in the reward systems at their institutions. As Bolman and Deal have argued, organizational structures must be aligned with, and must reward, the activities an institution values.
A multi-institutional working partnership, such as the FDI, is a difficult endeavor to create and sustain. One of the significant barriers to the success of such an endeavor is the distance between the FDI member institutions. The five original member institutions—CSU, OSU, WSU, UCD, and WUHS—are spread across four different states, with the closest being approximately 424 miles apart (UCD and WUHS) and the farthest, 1,109 miles apart (WSU and WUHS). The group found it challenging to find time to meet and prioritize the work. In the program development phase, it was found that face-to-face meetings were essential to creating and sustaining a cohesive group of faculty members. With videoconferencing solutions currently available, virtual meetings are easy to organize and conduct, but the FDI members found that without the firm sociocultural foundation established in face-to-face meetings, virtual meetings were less effective in fostering the needed connections and, therefore, less productive. Face-to-face meetings were essential to providing dedicated time for members to focus on program development. In the end, the FDI has been successful in creating a COP across the five institutions, "a persistent, sustained social network of individuals who share and develop an overlapping knowledge base, set of beliefs, values, history, and experiences focused on a common practice and/or mutual enterprise."
Developing a program to meet the needs of faculty from the five institutions is also key to the success of the program.
Results of this formative program evaluation are impacted by the number of participants and how they were selected. The relatively small number of participants may limit the strength of the results; however, the goal was not to generalize the results to the theoretical population, but rather use them to inform program improvement. The non-random, purposive selection of participants from each institution may have biased the results of the evaluation. While it may not be feasible to deliver the program using a random sample of participants, as the program grows it will be beneficial to continue to re-evaluate for program improvement. While use of a retrospective pre-test/post-test has its own limitations (e.g., recall bias, socially desirable responses), this self-report survey method is an efficient and inexpensive way to measure effectiveness.
In considering the purpose of this study, this formative program evaluation provided information about the lived experiences of the participants and the program's outcomes and impacts at the individual, department, and institutional levels. Additionally, information was gathered that provided formative feedback to inform continued programmatic improvement. Despite the challenges discussed, this article provides support for the conclusion that multi-institutional faculty development is not only feasible but also effective in promoting changes at the college and institutional levels. Essential items for the success of such an initiative include financial support and structural changes in the rewards and incentive systems at individual institutions to ensure faculty are supported and rewarded. Additionally, success is fostered by the organic development of a COP of faculty development facilitators. Finally, the faculty development program must meet the real needs as well as the perceived needs of participants.
The VETS program was the result of a strategic initiative of the Consortium of West Region Colleges of Veterinary Medicine: Colorado State University (CSU), Oregon State University (OSU), University of California, Davis (UCD), Washington State University (WSU), and Western University of Health Sciences (WUHS). Members of the Faculty Development Initiative contributed to the development and delivery of the VETS program. Those who contributed to the article are listed as authors. Members during the time when the pilot VETS program was delivered and data were collected include Karyn Bird (OSU), Beth Boynton (WUHS), Bonnie Campbell (WSU), Julie Cary (WSU), Munashe Chigerwe (UCD), Kristy Dowers (CSU), Laurie Fonken (CSU), Samantha Gizerian (WSU), Paul Gordon-Ross (WUHS), Suzie Kovacs (WUHS), Steve Lampa (WSU), Ohad Levi (WUHS), Peggy Schmidt (WUHS), Martin Smith (UCD), Jane Shaw (CSU), and Andrew West (CSU). Rachel Halsey (WSU) provided administrative support. Corporate partner Zoetis contributed to funding the VETS program.
Teaching Academy of the Consortium of West Region Colleges of Veterinary Medicine 2016 Pilot VETS Program Post-Program Survey
Question 1 – A number of topics have been discussed during the VETS Program. Please indicate how valuable you felt each of the topics was to you as a teacher on a scale of 1–5 (1 = not valuable, 5 = very valuable).
Table A1: 2016 pilot VETS Program post-program survey question 1, "Please indicate how valuable you felt each of the topics was to you as a teacher on a scale of 1–5 (1 = not valuable, 5 = very valuable)."
Each topic was rated on a 5-point scale (1 = not valuable to 5 = very valuable): assessment for learning; teaching perspectives; student-centered learning; wait time/asking questions; guided inquiry/funneling; reflecting on your teaching/student feedback; learning cycle; alignment between goals, instruction, and assessments; writing learning objectives; community of practice; getting and giving feedback; external review of teaching/writing a teaching vitae.
Question 2 – Please indicate how likely you are to integrate ideas from the following topics into your teaching this academic year (1 = not likely, 5 = very likely).
Table A2: 2016 pilot VETS Program post-program survey question 2, "Please indicate how likely you are to integrate ideas from the following topics into your teaching this academic year (1 = not likely, 5 = very likely)."
Topics (each rated on a scale of 1 = not likely to 5 = very likely):
- Assessment for learning
- Teaching perspectives
- Student-centered learning
- Wait time/asking questions
- Guided inquiry/funneling
- Reflecting on your teaching/student feedback
- Learning cycle
- Alignment between goals, instruction, and assessments
- Writing learning objectives
- Community of practice
- Getting and giving feedback
- External review of teaching/writing a teaching vitae
Question 3 – We recognize that for many of the topics that were discussed during the program, you likely had some level of understanding about the topic prior to the session. We hope that after the session, your level of understanding was strengthened in some way. Please help us understand how your experience in the VETS Program may or may not have impacted your understanding by circling responses below for each topic.
Table A3: 2016 pilot VETS Program post-program survey question 3, "Please help us understand how your experience in the VETS Program may or may not have impacted your understanding by circling responses below for each topic."
For each topic, respondents rated their understanding BEFORE the discussion and AFTER the discussion on a five-point scale (none, low, moderate, high, advanced):
- Assessment for learning
- Teaching perspectives
- Student-centered learning
- Wait time/asking questions
- Guided inquiry/funneling
- Reflecting on your teaching/student feedback
- Learning cycle
- Alignment between goals, instruction, and assessments
- Writing learning objectives
- Community of practice
- Getting and giving feedback
- External review of teaching/writing a teaching vitae
Question 4 – Please provide any ideas you have for how the VETS Program might support you in your teaching during this academic year.
Question 5 – During the VETS Program, faculty presenters conducted sessions using a number of different strategies, activities, formats, etc. Please let us know which approaches, if any, you found to be the most helpful.
Question 6 – Please list any topics that were not discussed during the VETS Program that you would have liked us to address.
Question 7 – Identify the breakout group in which you participated.
□ Curriculum Development □ Clinical/Preceptorship
Question 8 – Describe what benefits, if any, you derived from your participation in this group.
Question 9 – Additional Comments:
Post-Program Focus Group Questions
- What are some of the key takeaways from the VETS program?
- Are there some concrete outcomes or philosophical shifts that resulted from your participation in the VETS program?
- Looking back, what were the most helpful components of the VETS program?
- Can you talk about some of the values of the Community of Practice at your home institution?
- Are there any other impacts that the VETS program had on your teaching this semester?
2016 Pilot VETS Program Workshop Schedule
- Learning Theories – Constructivism and Experiential Learning
- Teaching Perspectives Inventory – Strength Training Discussion (pre-workshop homework)
- Understanding Yourself as a Teacher – Reflective Practice, Inquiry and Experiential Learning, and Pedagogical Approach
- Understanding Your Students – Generations, Student Development, and Learning Styles
- Module 1 Wrap-up – Outcomes for the session
- 11:30 AM Application Activity – Project Work
- Learning Objectives to Instructional Methods – Writing Learning Objectives
- Learning Opportunities/Instructional Design – Design/Alignment, Objectives, Technology, Other Resources, Working Efforts on Projects (small group work), and Strength Training Discussion (pre-workshop homework)
- Module 2 Wrap-up – Outcomes for this session
• 5:00 PM Social gathering
- Backwards Design – Design/Alignment, Outcomes, Objectives, and Activities
- Assessment: Beyond Multiple Choice – Theory and Methods, Biases, Strengths and Weaknesses of Assessment Forms, Assessment Technology
- 11:00 AM Delivering and Receiving Feedback – Feedback from students and to students
• 5:00 PM END OF SESSIONS
The authors have no conflicts of interest to declare.
By Paul N. Gordon-Ross, Suzie J. Kovacs, Rachel L. Halsey, Andrew B. West, and Martin H. Smith
Paul N. Gordon-Ross, DVM, MS, is Director of Year 4 Curriculum and Associate Professor of Equine and Small Animal General Medicine, Western University of Health Sciences College of Veterinary Medicine, 309 E Second Street, Pomona, CA 91766 USA.
Suzie J. Kovacs, MSc, PhD is Assistant Professor of Epidemiology, Western University of Health Sciences College of Veterinary Medicine, 309 E Second Street, Pomona, CA 91766 USA.
Rachel L. Halsey, DVM, is Academic Coordinator, Washington State University College of Veterinary Medicine, PO Box 647010, Pullman, WA 99164 USA.
Andrew B. West, MEd, PhD, is Director of the Academy for Teaching and Learning, Colorado State University College of Veterinary Medicine and Biomedical Sciences, 1601 Campus Delivery, Fort Collins, CO 80523 USA.
Martin H. Smith, MS, EdD, is Specialist in Cooperative Extension, University of California, Davis, Population Health and Reproduction and Human Ecology, 3213 Vet Med 3B, Davis, CA 95616 USA.