
Lecture Capture Technology and Student Performance in an Operations Management Course

Sloan, Thomas W. ; Lewis, David
In: Decision Sciences Journal of Innovative Education, Vol. 12 (2014-10-01), pp. 339-355. DOI: 10.1111/dsji.12041

Lecture capture technologies (LCT) such as Echo360, Mediasite, and Tegrity have become very popular in recent years. Many studies have shown that students favor the use of such technology, but relatively little research has studied the impact of LCT on learning. This article examines two research questions: (1) whether the use of LCT actually enhances learning outcomes, and (2) how instructors can increase students' use of lecture capture materials. We address these questions using data from an undergraduate Operations Management course at a mid-size, public university. Results indicate that access to lecture capture videos is associated with higher exam scores, even after controlling for previous exam performance. In addition, efforts to promote the use of lecture capture materials increase their use. We offer several suggestions to instructors who wish to increase their students' use of lecture capture materials, which are equally applicable across all disciplines.

Keywords: Lecture Capture; Instructional Technology; Student Performance

Advances in technology and changes in student demographics have led to significant changes in the college classroom experience over the last decade. Once dominated by lectures, chalkboards, and overhead transparencies, today's college classes have been transformed into rich multimedia experiences involving clickers, web sites, podcasts, and YouTube videos. One significant development in the last few years is the emergence of lecture capture technology (LCT). Platforms such as Accordent, Echo360, Mediasite, Panopto, and Tegrity capture everything that happens during a class session—video, audio, slides, virtual whiteboard, etc.—and make the materials available for remote use, usually via a web connection. While the use of such technology has increased dramatically in recent years, its impact on student learning is not yet fully understood. This article addresses two questions related to the use of lecture capture technology. First, is the use of LCT associated with improved learning outcomes? Second, assuming that it has a positive impact, how can instructors encourage their students to make more and better use of LCT?

Our university began using Echo360 lecture capture technology in 2010 and has installed the necessary hardware and software in many classrooms across the campus. A small but growing number of faculty and students have embraced the technology, while others remain skeptical. This skepticism mirrors the reaction to the Internet as an alternative delivery medium in the late 1990s. Concerns about the technology range from its impact on student class attendance to the ownership of course content. After about a year of using Echo360, we found that opinions of the technology were generally favorable, but there was also a sense that it was not being fully utilized by students. As a result, we were motivated to collect data related to student LCT usage and to explore the relationship between usage and learning outcomes.

While many papers (some of which are discussed below) discuss the pros and cons of LCT or describe particular case studies involving LCT, the goal of this study is to contribute to the understanding of LCT use and best practices using a data‐driven approach.

PREVIOUS RESEARCH

Many researchers have explored the use of new instructional technologies such as podcasts, clickers, and microblogs, to name only a few. For general discussions of technology in the classroom, refer to the recent reviews by Kirkwood and Price (2014) and Fu ([12]). The focus here is on research related specifically to lecture capture technology. Why has the use of lecture capture technology increased so dramatically in recent years? Newton, Tucker, Dawson, and Currie ([22]) suggest that the increasing use of LCT is driven by at least two factors. First, significant improvements in technology—both hardware and software—have made lecture capture much easier. Second, today's students are "Digital Natives" who expect on-demand access to "content," whether that be YouTube videos, sports scores, or a lecture on biology. Dey et al. ([6]) echo the latter point, observing that students are technology-savvy and are "ready to engage information in new ways." Similarly, a survey by Folley ([10]) indicates that students increasingly expect technological additions (such as lecture captures) to enhance the learning process. Preston et al. ([25]) also argue that students need and want more convenience and flexibility with respect to course content access. These factors have combined to increase the availability and popularity of LCT in higher education.

The research related to LCT can be divided into four categories. The first category addresses the "how-to" of lecture capture, for example, ways that LCT can and should be used. Such research includes discussions of best practices (Zhu & Bergom, [36]) and suggestions for use in particular disciplines (DeSantis, Pantalone, & Wiseman, [5]). It also includes descriptions of innovative uses of LCT. For example, Smith and Sodano ([28]) described how lecture capture can be used to help students critique and improve their presentation skills. Two groups of undergraduate students made presentations and then analyzed their performance. One group was able to review performance using lecture capture technology. The students in this group, though no more confident than students in the other group at the outset, were more likely to make improvements based on the self-assessment.

The second, and by far largest, category of research includes papers that report on early experiences with lecture capture in particular institutions or programs. These case studies often include assessments of student and faculty perceptions (e.g., do they use it and like it?) and the impact of LCT on class attendance. Karnad ([15] ) reviewed and summarized many papers in this category. We highlight a few examples to provide a sense of the variety of programs making use of lecture capture. Veeramani and Bradley ([34] ) reported on undergraduate and graduate student attitudes regarding lecture capture at a large, public university. Undergraduates reported overwhelming support of LCT as an adjunct to traditional face‐to‐face classes, and even indicated a willingness to pay extra for such a service. Rogers and Cordell ([26] ) surveyed graduate and undergraduate students in both online and on‐campus courses regarding their views of a lecture capture system. Students reported high levels of satisfaction and better understanding of the material; faculty reported fewer e‐mails and office visits. Pons, Walker, Hollis, and Thomas ([24] ) reported on a pilot study of lecture capture technology, and “students reported that the system enhanced their engagement.”

Davis, Connolly, and Linfield ([4] ) surveyed undergraduate engineering students’ attitudes and opinions regarding a newly installed lecture capture system. Results indicated that once students became aware of lecture capture availability, they were enthusiastic about it. Most students used LCT for reviewing lectures, and claimed that lecture capture availability did not affect their attendance. Similarly, Nashash and Gunn ([21] ) surveyed engineering students regarding their use of LCT. Students reported high levels of satisfaction with lecture capture and that it did not affect their class attendance. Dibacco, Hetherington, and Putman ([7] ) evaluated the use of lecture capture technology in a podiatric medicine education context. Results of both faculty and student surveys indicated high levels of use and satisfaction with the system. However, there were still concerns about the potential effect on class attendance. Cooke et al. ([3] ) examined the use of lecture capture in the first year of a nursing program. The study indicated that most students found live lectures preferable to recorded ones. The students also found the technology to be helpful in reviewing course content. Tan, Wong, and Kwong ([30] ) reported the results of a pilot study of LCT use at their university. In addition to discussing planning and deployment, the authors also reported user perceptions about the new technology.

Toppin ([31] ) described how lecture capture technology was implemented at one university to help with retention problems and high rates of failure. The results showed that students strongly favored LCT and found it to be helpful. According to self‐reported data, availability of lecture capture videos did not affect student attendance. Faculty surveyed as part of the project also had favorable responses regarding the system. While this type of research is valuable and practical—especially for those considering the use of LCT—there is a heavy reliance on student self‐reported data. In addition, little consideration is given to the impact on learning outcomes.

The third category of research relates to students’ learning styles. Few papers in this category explicitly examine connections between what learning theories predict and what students actually do. A notable exception is the work by Dey et al. ([6] ), who explored how Mayer's theory of multimedia learning (Mayer, [18] , [19] ) can be applied in the context of lecture capture. To accomplish this goal, undergraduate students in a physics course viewed one of three versions of a lecture: a live lecture, a recorded lecture with audio and slides, or a recorded lecture with audio, slides, and video of the instructor. Students were then asked a number of questions about the lecture, including questions related to the retention of material and a question asking them to apply the lesson to a different situation. While there were no significant differences with respect to the retention questions, students who viewed the lecture capture performed significantly better on the transfer question. The authors suggest that this result may be due to the fact that students in the live lecture focus more on the instructor than on the content of the slides and thus are not able to apply the material to a different context as readily. While this study produced important insights, it should be noted that it included only a single 20‐minute presentation and did not examine performance over a longer period of time.

Vajoczki, Watt, Marquis, Vine, and Liao ([33] ) examined how learning styles can affect the use of and effectiveness of LCT. The authors found that deep learners, as compared to surface learners, tend to review lecture captures on a more regular basis and also use them for exam review. Surface learners, in contrast, use LCT in place of class attendance. Not surprisingly, deep learners achieved higher academic performance.

Other papers in this third category touch on teaching and learning styles indirectly by focusing on student usage patterns. Based on a large survey of Australian students, Preston et al. ([25]) concluded that LCT is most effective when delivering traditional, one-way lectures in large classes, and that LCT is less effective in small classes and classes that rely on a high degree of interaction (e.g., problem solving). Drouin ([8]) examined two sections of an undergraduate psychology course; one section was given access to lecture capture materials, while the other was not. Results indicated that the section with access to lecture recordings had significantly lower attendance rates and course achievement (final grades). Further analysis, however, indicated that LCT availability reduced participation (in exams, class activities, and assignments) in only a subset of the students. When these "nonparticipators" were excluded from analyses, the differences between class sections were no longer significant. Gorissen, van Bruggen, and Jochems ([13]) examined usage patterns of a large population (517) of undergraduate students enrolled in several technical courses (e.g., calculus, chemical biology, etc.). The results—based on student self-reported data—indicated that students used LCT to review for exams and to make up for missed classes. Students also reported that when they viewed a previously recorded class, they watched 75% to 100% of it. Lectures delivered using traditional chalkboards were viewed less than lectures using PowerPoint. Collectively, these results provide insights about how LCT can better serve students with different learning styles. However, more research in this area is clearly needed.

The fourth category of research attempts to link LCT use to objective measures of learning. Some studies focus on the differences between traditional face‐to‐face lectures and other delivery methods such as viewing previously recorded lectures. For example, Euzent, Martin, Moskal, and Moskal ([9] ) compared student performance and satisfaction in two sections of an undergraduate course, one delivered using traditional face‐to‐face means, and the other delivered asynchronously using lecture capture. Student performance in the two sections was similar, and students in the section delivered using LCT reported high levels of satisfaction. However, the withdrawal rate was higher for the LCT section, indicating that course delivery via lecture capture may not be suitable or desirable for all students. Missildine, Fountain, Summers, and Gosselin ([20] ) compared three delivery modes to different groups of nursing students. One group used the traditional lecture approach, one group used face‐to‐face lectures and also had lecture capture available, and one group used a “flipped classroom” approach. In the flipped classroom approach, students were responsible for viewing lectures outside of class, and class time was spent engaging in hands‐on and other innovative classroom activities. While exam performance for the flipped classroom group was the highest, students in this group were less satisfied than those in the other groups.

Other studies in this category explore how the availability of LCT as a supplement to traditional classes can affect performance. Owston, Lupshenyuk, and Wideman ([23] ) examined the use of lecture capture in large undergraduate classes. They linked self‐reported student data about class attendance and lecture viewing to students’ final course grades. Results indicated that students with higher grades viewed the lecture capture videos less often than students with lower grades. In addition, students with higher grades tended not to view the entire lecture; rather, they fast forwarded and viewed particular sections. The authors concluded that lecture capture may therefore be of greater benefit to low‐achieving students.

Ford, Burns, Mitch, and Gomez ([11] ) examined the impact of lecture capture technology use by undergraduate students. Student self‐reported study habits, perceptions of the course, and course grades were examined for four sections of an undergraduate psychology class. All sections were taught in a traditional, face‐to‐face format. However, two sections occurred before the LCT system was deployed, and two sections occurred after. The authors controlled for academic ability by including prior GPA in the analyses. They found that students given the option of using lecture capture reported more study time and higher satisfaction with the course; however, overall course grades were not affected significantly.

Stroup, Pickard, and Kahler ([29] ) studied the link between lecture capture availability and course grades, using prior GPA as a control variable. Course grades were compared for two sections of two courses; one section of each course had lecture capture available, while the other section did not. In an introductory computer science course, the availability of lecture capture did not have a significant impact on course grades. In an introductory economics course, however, low‐GPA students in the section with lecture capture available earned lower grades than their low‐GPA counterparts in the section without lecture capture. This pattern was not observed for high‐GPA students. A possible conclusion is that low‐GPA students were given a false sense of comfort by the availability of lecture capture, which made them more likely to miss class.

While the three studies reviewed above offer valuable insights, their reliance on student self‐reported data may cast some doubt on the results. To address this issue, some researchers have devised ways to measure student access to LCT in more objective ways. For example, von Konsky, Ivins, and Gribble ([35] ) studied the impact of lecture capture on class attendance and final grades for undergraduate students in a software engineering course. Rather than relying on self‐reported data, researchers were able to collect objective data on attendance and lecture capture “hits.” The study found that attendance was not significantly affected by the availability of lecture capture. Students with high course grades used LCT as a supplement to lectures (rather than as a replacement); however, the usage varied substantially. The authors found no direct link between LCT usage and final course grades.

Bollmeier, Wenger, and Forinash ([1] ) studied the use of lecture capture and student performance in a therapeutics course. Their study found no correlation between final course grades and the number of times students accessed lectures. Students reported that availability of lecture capture videos did not affect their class attendance. However, student performance on the final exam was found to be higher for this group than for a historical control group.

Shaw and Molnar ([27] ) examined the impact of lecture capture in a medical school course. They compared the performance of a class given access to lecture capture to that of the previous year's class which did not have lecture capture access. The study found that test scores and overall course performance of students given access to lecture captures was significantly better than that of the control group. Interestingly, the results were more pronounced for nonnative English speakers, suggesting that they can benefit more from the review capabilities afforded by lecture capture. Johnston, Massa, and Burne ([14] ) reported on the use of LCT in a nursing education program. Two groups of students were examined; both groups were taught in traditional, face‐to‐face classrooms, but one group was given access to course content via LCT. Those in the LCT group reported high satisfaction with the technology. However, academic performance was statistically significantly lower for this group as compared to the group for which LCT was not made available.

In summary, there has been a significant amount of research on lecture capture technology in recent years. Some effects of LCT have undoubtedly been positive; however, negative consequences have also been observed. Tables 1 and 2 list some of the advantages and disadvantages, respectively, of the use of LCT. While student satisfaction with LCT is generally high, an important question remains unresolved: how does LCT use relate to learning outcomes? Although several studies have examined this question, the results have been varied. The mixed results may be related to factors including discipline (e.g., engineering vs. psychology), type of lecture capture (e.g., including video of instructor or not), level of analysis (e.g., exam grade vs. course grade), level of student (e.g., undergraduate vs. graduate), the nature of the material and delivery (e.g., one-way transfer of information vs. interactive problem solving), or whether the lecture capture was a supplement to or replacement for lectures. Determining the precise causal factors from such a diverse set of studies is beyond the scope of this article. Our study attempts to contribute to the understanding of this topic by exploring the connections between lecture capture use and student performance in a business course.

Table 1: Some advantages of lecture capture

Advantage | References
Increases student satisfaction | Toppin (31), Nashash and Gunn (21)
Students like and expect new technology | Folley (10)
Facilitates review of material, especially in preparation for exams | Vajoczki et al. (33), Cooke et al. (3), Dibacco et al. (7)
Students report higher engagement and improved learning | Preston et al. (25), Pons et al. (24), Drouin (8)
Increases accessibility, e.g., for those with disabilities or nonnative English speakers | Vajoczki et al. (33), Shaw and Molnar (27)
Reduces routine informational inquiries to instructor | Bollmeier et al. (1), Rogers and Cordell (26)
Provides flexibility for students who must miss class (e.g., student athletes) | DeSantis et al. (5)
Allows students to review their own in-class presentations | Smith and Sodano (28)

Table 2: Some disadvantages of lecture capture

Disadvantage | References
May reduce class attendance | Preston et al. (25), Vajoczki et al. (33), Drouin (8)
May enable nonparticipation | Stroup et al. (29), Drouin (8)
Potential hardware and software costs to institution | "Lecture Capture Yet to Take Hold" (17), Newton et al. (22)
Learning curve for instructors | Davis et al. (4), "Lecture Capture Yet to Take Hold" (17), Newton et al. (22)
Uncertainty about intellectual property rights (i.e., who "owns" the lecture content?) | "Lecture Capture Yet to Take Hold" (17), Newton et al. (22)

RESEARCH DESIGN OVERVIEW

The use of lecture capture materials and student performance were investigated in two sections of an undergraduate Operations Management course at a mid-size, public university. Both sections had the same instructor, and there were 35 students in each section (for a total of 70 students). The Operations Management course is required for all business majors, and students typically take it in the third or fourth year of the program. The course is divided into three modules, each covering three chapters from the textbook. An exam is given after each module, and the modules are not cumulative. Toward the beginning of the semester, the instructor informed students that all lectures were being recorded and demonstrated how the lectures could be accessed. To get a sense of students' baseline behavior, access to lecture capture materials was examined beginning after the first exam. This is typically the time of the semester when there is renewed interest in academic performance (i.e., students are beginning to express concern about their grades).

Once scheduled by the Instructional Technology department, each class session was captured automatically and uploaded to a server. Links to the sessions were placed in a specific area of the course web site, which uses the Blackboard learning management system. In addition, lecture notes were generated for each lecture. The notes contained all of the material written on a virtual white board during class sessions, including announcements and example problems solved in class. The notes were manually uploaded to the course web site. Blackboard is robust in that it can track many data points, including how many times a class resource is accessed and for how long.

As mentioned previously, two central questions motivated this project:

Does the use of lecture capture materials improve students’ learning?

Assuming that the answer to the above question is “yes,” how can instructors increase the use of lecture capture materials by students?

To address the first question, data were collected regarding students' use of lecture capture videos and their performance on exams. Since links to the lecture capture videos were located in a specific area of Blackboard, the number of times each student accessed the lecture capture folder and clicked on one of the links was recorded, and the association between students' access patterns and exam scores was examined.
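To make the bookkeeping concrete, the sketch below shows one way such per-student counts could be assembled and joined to exam scores. This is not the study's actual pipeline: the file names and column names (student_id, item_type, exam1, exam2) are illustrative assumptions, since Blackboard's activity reports have their own formats.

```python
# Illustrative sketch only (not the study's actual pipeline).
# Assumes a hypothetical activity export with columns student_id and item_type,
# and a hypothetical gradebook with columns student_id, exam1, and exam2.
import pandas as pd

activity = pd.read_csv("blackboard_activity_export.csv")   # hypothetical export
grades = pd.read_csv("gradebook.csv")                      # hypothetical gradebook

# Count clicks on links in the lecture capture area for each student.
capture_clicks = (
    activity.loc[activity["item_type"] == "lecture_capture_link"]
    .groupby("student_id")
    .size()
    .rename("lecture_views")
    .reset_index()
)

# Students who never clicked a lecture capture link get a count of zero.
data = grades.merge(capture_clicks, on="student_id", how="left")
data["lecture_views"] = data["lecture_views"].fillna(0)
print(data[["exam1", "exam2", "lecture_views"]].describe())
```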

To address the second question, different methods were used to raise students' awareness of lecture capture materials. The first method required students to watch a previously recorded lecture capture video in lieu of coming to class. The second method was the use of extra credit quizzes based on particular lectures. Three quizzes were given between the second and third exams. The first quiz was based on the required lecture capture viewing. Each quiz had three questions, each worth 1 point on the third exam. Thus, a student could earn up to 9 extra credit points on the third exam (which would raise their final course grade by nearly 2 points out of 100). Quizzes were administered online, and each was available for approximately 30 hours. Once begun, students were allowed 9 minutes to complete each quiz.

The third method to increase the use of lecture capture materials was promotion by the instructor. One of the two sections was designated as the “strong promotion” section. In this section, the instructor reminded the class two or three times about the extra credit quizzes, and emphasized the connection between specific lecture capture videos and quiz content. The other section was designated as the “weak promotion” section. In this section, the instructor reminded students once about quizzes, but did not tell them about the explicit connection with lecture captures. An important part of “promotion” is thus not only the number of reminders but the emphasis on the linkage between quizzes and specific parts of the captured lectures.

The following hypotheses regarding the use of lecture capture materials are proposed:

H1. More lecture capture access will be associated with better learning outcomes.

H2. Efforts to promote lecture capture use are positively associated with student usage.

Below we report how data were collected and analyzed to test these hypotheses.

DATA DETAILS AND ANALYSIS

The first question considered is how use of lecture capture materials affects learning outcomes. Using linear regression, the association between lecture capture video access between Exam 1 and Exam 2 (independent variable) and Exam 2 performance (dependent variable) was examined. One could argue that "good" students may be inclined to make better use of class resources than "poor" students, so a connection between lecture capture video views and exam performance need not be causal. One could also argue that students who do not do well on the first exam will work harder on subsequent exams. To control for these effects, performance on the previous exam was included as an independent variable. The section—strong or weak promotion—was also included as an independent variable. The model is significant, with p < .001 and a multiple R² of .5854. Standard diagnostic tests for autocorrelation and multicollinearity indicated that neither was present in the model. Table 3 reports the coefficients and significance for each of the variables.
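Written out in equation form (our notation, restating the specification just described), the model is

```latex
\text{Exam2}_i = \beta_0 + \beta_1\,\text{Exam1}_i + \beta_2\,\text{LectureViews}_i + \beta_3\,\text{SectionWeak}_i + \varepsilon_i
```

where LectureViews_i counts lecture capture accesses between Exams 1 and 2, and SectionWeak_i is an indicator equal to 1 for the weak-promotion section.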

Table 3: Regression results for Exam 2 model

Variable | Coefficient | Std. Error | t-Statistic | p Value
Intercept | 25.99 | 5.92 | 4.39 | 4.18e-05***
Exam 1 | .65 | .07 | 8.76 | 1.17e-12***
Lecture views | .30 | .13 | 2.24 | .028*
Section weak | 4.31 | 1.88 | 2.29 | .025*

Note: ***p < .001, **p < .01, *p < .05, +p < .1, ns p ≥ .1.

These results indicate that, as expected, Exam 1 is a good predictor of performance on Exam 2. However, the number of lecture views is also statistically significant (α = .05). Although not expected, the weak promotion section is also associated with higher Exam 2 scores. This result is surprising given that no promotion occurred during this block of classes; the section variable was included only to facilitate comparison with the model covering the next block of classes.
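For readers who want to run a comparable analysis, a minimal sketch using Python's statsmodels is shown below. The article does not name the software actually used, so this tooling is an assumption; the sketch continues from the data frame assembled earlier and adds diagnostics analogous to the autocorrelation and multicollinearity checks mentioned above.

```python
# Sketch only; statsmodels is assumed tooling, not the authors' stated software.
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.stats.stattools import durbin_watson

# `data` holds one row per student with columns exam1, exam2, lecture_views,
# and section ("weak" or "strong"), as in the earlier sketch.
model = smf.ols("exam2 ~ exam1 + lecture_views + C(section)", data=data).fit()
print(model.summary())   # coefficients, t-statistics, p-values, R-squared

# Autocorrelation check (Durbin-Watson) and collinearity check (VIFs).
print("Durbin-Watson:", durbin_watson(model.resid))
X = sm.add_constant(data[["exam1", "lecture_views"]])
print("VIFs:", [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])])
```

The Exam 3 model described next would be fit the same way, with Exam 2 scores and the block B view counts as predictors.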

A similar regression was performed for the block of classes between Exam 2 and Exam 3. In this model, Exam 3 is the dependent variable, and the independent variables are Exam 2, lecture views (between Exam 2 and Exam 3), and section. This model is also significant, with p < .001 and a multiple R² of .3381. Again, standard diagnostic tests indicated that neither autocorrelation nor multicollinearity was present in the model. Table 4 reports the coefficients and significance for each of the variables in the model.

Table 4: Regression results for Exam 3 model

Variable | Coefficient | Std. Error | t-Statistic | p Value
Intercept | 25.74 | 9.09 | 2.83 | .006**
Exam 2 | .58 | .11 | 5.25 | 1.75e-06***
Lecture views | .21 | .13 | 1.69 | .097+
Section weak | -3.21 | 2.61 | -1.23 | .222 ns

Note: ***p < .001, **p < .01, *p < .05, +p < .1, ns p ≥ .1.

These results again indicate a strong association between performance on prior and current exams. Even accounting for this effect, however, the number of lecture capture video views also had a significant effect on Exam 3 performance (α =.10). Interestingly, the section variable was not statistically significant in this model even though the model covered the block of classes in which promotion occurred. The negative coefficient of the section variable is as expected: in other words, the weak promotion section is associated with lower Exam 3 scores. Collectively, these results provide support for hypothesis H1: more lecture capture access is associated with better learning outcomes.

To examine how efforts to promote lecture capture use affect student usage, access patterns in the period between the first and second exams were examined. This block of classes is referred to as block A. There were eight class sessions in this block, but one session was held in the computer lab and was not captured, so seven sessions in total were recorded. Blackboard automatically tracks the usage of all content on the course web site, so it was possible to determine how many times a particular lecture capture session was accessed and for how long. It was also possible to determine how many times the corresponding lecture notes were accessed. For each lecture, access to the materials for a period of one week (beginning the day of the lecture) was examined. Table 5 reports the number of times the lecture capture links were accessed, how long the access lasted, and the number of times the lecture notes were accessed during class block A.

Table 5: Summary of lecture capture access during class block A

Class Session | Lecture Capture Link Access | Average View Time (min:sec) | Lecture Note Views
A1 | 0 | N/A | 14
A2 | 8 | 17:50 | 16
A3 | 4 | 00:17 | 10
A4 | 3 | 00:35 | 2
A5 | 10 | 09:50 | 23
A6 | 14 | 10:35 | 64
A7 | 7 | 24:15 | 30
Average | 6.6 | 09:03 | 22.7

These numbers should be interpreted with care. For example, simply accessing the lecture capture link does not necessarily mean that a student watched the video. Also, the item‐level tracking mechanism does not indicate who is accessing the link; some students may access the links multiple times. One might assume that access would increase after a lecture covering particularly challenging material or after a lecture with low attendance. Access did become more frequent and average viewing time became longer as the end of a module approached, potentially indicating students’ concerns about the need to review the material prior to an exam.

Access patterns during the period between the second and third exams were also examined. This block of classes is referred to as block B, and it corresponded to the time period during which extra credit quizzes were given. In lieu of attending the second class session in block B, students were asked to view a previously recorded lecture, and the first extra credit quiz was based on this material. Table 6 reports the number of times the lecture capture links were accessed, how long the access lasted, and the number of times the lecture notes were accessed. Statistics cover a one-week time frame, beginning on the day the lecture was given.

Table 6: Summary of lecture capture access during class block B

Class Session | Lecture Capture Link Access | Average View Time (min:sec) | Lecture Note Views
B1 | 10 | 13:25 | 8
Bx | 63 | 18:20 | 53
B2 | 10 | 20:08 | 20
B3 | 6 | 04:14 | 22
B4 | 11 | 21:38 | 24
B5 | 0 | N/A | 44
B6 | 15 | 03:43 | 51
B7 | 6 | 27:12 | 57
Average | 8.3 | 15:03 | 32.3

Note: Students were required to view class Bx, so the values for this class are not included in the averages. Extra credit quizzes were based on selected lectures in this block, beginning with Bx.

Note that the averages reported in the table do not include the values for the required session. Again, the numbers must be interpreted with care, but several interesting patterns are evident. First, a large majority of students—63 of 70—accessed the required class (labeled “Bx” in the table). Second, the lecture videos associated with extra credit quizzes were accessed more than the nonquiz‐related lectures. Third, there appears to be a marked increase in views of the lecture notes after class Bx. These results suggest support for hypothesis H2.

In addition to the class-session-level analysis, student-level changes in access to the lecture capture materials can shed light on hypothesis H2. The tracking mechanism on Blackboard makes it possible to see when a student enters the lecture capture links area and accesses one of the links. However, since all of the lecture links are in the same area, it is not possible to tell which of the links were accessed. For example, over a one-week time frame, a student may access multiple lectures. In addition, the lecture notes are located in folders with other materials for a given week (e.g., handouts, example problems, etc.). Nevertheless, student-level data regarding lecture capture video and class notes access should enable an examination of changes over time. To be consistent, video or file access during the one-week period in which students were required to view a lecture was not recorded. Using a standard one-tailed t test, differences in video access and file access during class block A and class block B were examined, the hypothesis being that both video and file access would be greater in block B. Results are reported in Table 7.

Table 7: Comparison of video and file access between class blocks A and B (both sections)

                             | Video Access         | File Access
                             | Block A  | Block B   | Block A  | Block B
Mean                         | 6.63     | 8.46      | 28.57    | 44.26
Variance                     | 50.44    | 108.19    | 1320.42  | 2109.36
Observations                 | 70       | 70        | 70       | 70
Pearson correlation          | .58                  | .70
Hypothesized mean difference | 0                    | 0
Degrees of freedom           | 69                   | 69
t Statistic                  | -1.79                | -3.99
P(T ≤ t) one-tail            | .0386*               | 7.92e-05***
t Critical one-tail          | 1.67                 | 1.67

Note: ***p < .001, **p < .01, *p < .05, +p < .1, ns p ≥ .1.

These results indicate that there is a significant increase in lecture capture video access and file access during class block B. Again, note that this increase does not include the required access to class Bx. (The small difference in mean video access compared to the numbers reported in Tables 5 and 6 is due to the fact that in this analysis data are tracked by student rather than by item.) The student-level analysis thus also provides support for hypothesis H2.
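The per-student comparison in Table 7 is a paired, one-tailed t test. A minimal sketch is shown below; the tooling (SciPy) is assumed, and the input lists are placeholders rather than the study data.

```python
# Sketch only: paired, one-tailed t test of per-student access counts,
# class block A vs. class block B.  Placeholder inputs, not the study data.
from scipy import stats

block_a = [2, 0, 5, 7, 1, 3, 9]   # per-student access counts in block A (placeholder values)
block_b = [4, 1, 8, 7, 3, 6, 12]  # the same students' access counts in block B (placeholder values)

# alternative="less" tests whether block A access is lower than block B access,
# matching the one-tailed hypothesis above (requires SciPy >= 1.6).
result = stats.ttest_rel(block_a, block_b, alternative="less")
print(result.statistic, result.pvalue)
```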

What explains the increase in lecture capture video and file access—the required lecture capture session, the extra-credit quizzes, or the promotion? To explore the impact of being in the weak or strong promotion section, additional tests were performed to determine whether the mean number of video and file accesses differed significantly between the sections. Since different students were in each section, a two-sample t test was used. Examination of class block A showed that the mean number of video and file accesses was not significantly different when comparing the weak and strong promotion sections (p values greater than .1 for both video and file access). This result is as expected, since no promotional efforts were made during class block A. There was, however, a statistically significant increase in video link access for the strong promotion section between class blocks A and B. File access was also higher for the strong promotion section; however, the difference is not statistically significant.
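The between-section comparison can be sketched in the same way with an independent-samples test. Again the tooling and inputs are assumptions; equal_var=False gives Welch's test, which is consistent with the unequal degrees of freedom reported in Table 8, although the article states only that a two-sample t test was used.

```python
# Sketch only: two-sample, one-tailed t test of block B access counts,
# weak- vs. strong-promotion section.  Placeholder inputs, not the study data.
from scipy import stats

weak = [3, 0, 7, 12, 5, 9, 2]      # per-student counts, weak-promotion section (placeholder values)
strong = [6, 4, 11, 15, 8, 10, 7]  # per-student counts, strong-promotion section (placeholder values)

# alternative="less" tests whether weak-section access is lower than strong-section access.
# equal_var=False applies Welch's correction for unequal variances (our assumption).
result = stats.ttest_ind(weak, strong, equal_var=False, alternative="less")
print(result.statistic, result.pvalue)
```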

DISCUSSION

The results of the regression models indicated a positive association between access to lecture capture videos and performance on exams, thus hypothesis H1 is supported. The fact that performance on previous exams was controlled for increases confidence in the result. Interestingly, the association between lecture views and exam performance is stronger in class block A (before the required lecture view and optional extra credit quizzes) than in class block B.

The analysis reported in Tables 5 to 7 indicates that access to lecture capture materials does indeed increase during class block B. Requiring a lecture capture view and offering optional extra credit quizzes increased student usage of the lecture capture material, thus hypothesis H2 is also supported by the analysis. It is interesting to note that the increase in file access is larger than the increase in video access between class blocks A and B. This suggests that once students realized that lecture notes were available, they made more use of the notes. Simply telling and showing students about the availability of lecture material was not as effective as requiring them to view a lecture video. The assignment seems to have made students more aware of the lecture capture materials. A caveat, however, is that the file access numbers reported in Table 7 included access to all files—not just lecture capture notes. Due to the way content was organized in Blackboard, it was not possible to determine access to the lecture capture notes on a student-level basis. However, the item-level numbers reported in Tables 5 and 6 refer specifically to the lecture capture notes.

The results reported in Table 8 indicate that the type of promotion used matters. The optional extra-credit quizzes, as well as the additional promotion of lecture capture use, began in class block B, after the required lecture capture view assignment. Both the weak and strong promotion sections were informed about the extra-credit quizzes, but the strong promotion section was reminded two to three times and also told about the explicit connection between lecture capture and the quiz content. Although video link access increased more for the strong promotion section (as compared to the weak promotion section), the difference between sections with respect to the number of file views (including lecture capture notes) was not statistically significant. This suggests that the verbal reminders in the strong promotion section did not have a significant impact on file access. This, in turn, supports the idea that simply telling students about the lecture capture does not translate into increased usage.

Table 8: Comparison of video and file access between weak and strong promotion sections (class block B)

                             | Video Access        | File Access
                             | Weak    | Strong    | Weak     | Strong
Mean                         | 6.43    | 10.49     | 41.91    | 46.60
Variance                     | 73.72   | 137.37    | 1659.49  | 2609.95
Observations                 | 35      | 35        | 35       | 35
Hypothesized mean difference | 0                   | 0
Degrees of freedom           | 62                  | 65
t Statistic                  | -1.65               | -.42
P(T ≤ t) one-tail            | .0518+              | .3364 ns
t Critical one-tail          | 1.67                | 1.67

Note: ***p < .001, **p < .01, *p < .05, +p < .1, ns p ≥ .1.

CONCLUSIONS

This article describes the results of a study examining connections between the use of lecture capture materials and student performance. The first question was: Is there a link between the use of lecture capture materials and learning outcomes? Linear regression showed that there is an association between lecture capture video access and performance on exams, even after controlling for performance on previous exams. The second question was: How can instructors increase the use of lecture capture materials by students? Three approaches were explored: (1) requiring students to view a previously recorded lecture, (2) offering extra-credit quizzes based on lecture captures, and (3) reminding students about the close connection between specific lecture capture content and material covered on extra-credit quizzes. Student access to lecture capture materials increased significantly after the required lecture viewing, and access to files (including lecture capture notes) also increased after students were motivated to take advantage of lecture capture materials.

The most insightful result from this study is how much access to lecture capture materials increased after requiring students to view a previously recorded lecture. Doing so encouraged them to "tune in" to the availability of videos and notes; thus, the most effective way to increase students' awareness of lecture capture materials may be simply to require access as part of an assignment. Once they actually see and engage with what is available, students seem to be more likely to make use of lecture capture resources.

Based on the results of this study, we plan to make at least two changes with respect to LCT use in our classes. First, we plan to require LCT access early in the semester to make students aware of the availability of materials. Merely telling them about it is not enough. Second, we plan to put lecture notes and video links in the same folder on Blackboard. To facilitate tracking for the study, we put the video links in a separate folder, but this may have affected access.

Instructors from many disciplines can benefit from the insights from this study. Nevertheless, the study does have limitations. Item tracking needs to be fine‐tuned to get a more detailed picture of what materials students are looking at and when. While the current study relied solely on objective data, surveying student attitudes, particularly about the timing and weight of extra‐credit quizzes, may be of value. The analysis could also be enriched by examining a broader sample by expanding the number of students, collecting data from other courses at our university, or perhaps collecting data from other universities. As past studies indicate, factors such as discipline, teaching style, and student level may also have an impact on results. Ideally, the analysis would also be carried out with a control group that would not be given any prompting to view captured lectures. These issues highlight some of the challenges of doing empirical research in real time—it is difficult to design a rigorous, scientific study around the constraints of teaching real classes to real students. Additional studies will help students and instructors alike get the most value possible from lecture capture technology.

Footnotes

1 The authors gratefully acknowledge the support of the Instructional Technology Services group at the University of Massachusetts Lowell.

REFERENCES

1 Bollmeier, S. G., Wenger, P. J., & Forinash, A. B. (2010). Impact of online lecture-capture on student outcomes in a therapeutics course. American Journal of Pharmaceutical Education, 74(7), 1-6.
2 Brooks, C., Epp, C. D., Logan, G., & Greer, J. (2011). The who, what, when, and why of lecture capture. In ACM International Conference Proceeding Series, 86-92. doi:10.1145/2090116.209012
3 Cooke, M., Watson, B., Blacklock, E., Mansah, M., Howard, M., Johnston, A., Tower, M., & Murfield, J. (2012). Lecture capture: First year student nurses' experiences of a web-based lecture technology. Australian Journal of Advanced Nursing, 29(3), 14-21.
4 Davis, S., Connolly, A., & Linfield, E. (2009). Lecture capture: Making the most of face-to-face learning. Engineering Education, 4(2), 4-13.
5 DeSantis, L., Pantalone, C., & Wiseman, F. (2010). Lecture capture—an emerging and innovative technology with multiple applications for business schools. Business Education Innovation Journal, 2(2), 6-13.
6 Dey, E., Burn, H., & Gerdes, D. (2009). Bringing the classroom to the Web: Effects of using new technologies to capture and deliver lectures. Research in Higher Education, 50(4), 377-393. doi:10.1007/s11162-009-9124-0
7 Dibacco, P., Hetherington, V., & Putman, D. (2012). Lecture capture: Enhancing learning through technology at the Kent State University college of podiatric medicine. Journal of the American Podiatric Medical Association, 102(6), 491-498.
8 Drouin, M. A. (2014). If you record it, some won't come: Using lecture capture in introductory psychology. Teaching of Psychology, 41(1), 11-19.
9 Euzent, P., Martin, T., Moskal, P., & Moskal, P. (2011). Assessing student performance and perceptions in lecture capture vs. face-to-face course delivery. Journal of Information Technology Education, 10, 295-307.
10 Folley, D. (2010). The lecture is dead long live the e-lecture. Electronic Journal of E-Learning, 8(2), 93-100.
11 Ford, M. B., Burns, C. E., Mitch, N., & Gomez, M. M. (2012). The effectiveness of classroom capture technology. Active Learning in Higher Education, 13(3), 191-201. doi:10.1177/1469787412452982
12 Fu, J. S. (2013). ICT in education: A critical literature review and its implications. International Journal of Education and Development using Information and Communication Technology, 9(1), 112-125.
13 Gorissen, P., van Bruggen, J., & Jochems, W. (2012). Students and recorded lectures: Survey on current use and demands for higher education. Research in Learning Technology, 20(3), 297-311.
14 Johnston, A. N. B., Massa, H., & Burne, T. H. J. (2013). Digital lecture recording: A cautionary tale. Nurse Education in Practice, 13(1), 40-47. doi:10.1016/j.nepr.2012.07.004
15 Karnad, A. (2013). Student use of recorded lectures: A report reviewing recent research into the use of lecture capture technology in higher education, and its impact on teaching methods and attendance. Technical Report, London School of Economics and Political Science. (http://eprints.lse.ac.uk/50929/)
16 Kirkwood, A., & Price, L. (2013). Technology-enhanced learning and teaching in higher education: What is "enhanced" and how do we know? A critical literature review. Learning, Media and Technology. doi:10.1080/17439884.2013.770404
17 Lecture capture yet to take hold. (2011). BizEd, 10(5), 68-69.
18 Mayer, R. E. (2005). Cognitive theory of multimedia learning. In The Cambridge handbook of multimedia learning. Cambridge: Cambridge University Press, 31-48.
19 Mayer, R. E. (2009). Multimedia learning (2nd ed.). Cambridge: Cambridge University Press.
20 Missildine, K., Fountain, R., Summers, L., & Gosselin, K. (2013). Flipping the classroom to improve student performance and satisfaction. Journal of Nursing Education, 52(10), 597-599. doi:10.3928/01484834-20130919-03
21 Nashash, H. A., & Gunn, C. (2013). Lecture capture in engineering classes: Bridging gaps and enhancing learning. Journal of Educational Technology and Society, 16(1), 69-78.
22 Newton, G., Tucker, T., Dawson, J., & Currie, E. (2014). Use of lecture capture in higher education—lessons from the trenches. TechTrends: Linking Research and Practice to Improve Learning, 58(2), 32-45.
23 Owston, R., Lupshenyuk, D., & Wideman, H. (2011). Lecture capture in large undergraduate classes: Student perceptions and academic performance. Internet and Higher Education, 14(4), 262-268. doi:10.1016/j.iheduc.2011.05.006
24 Pons, D., Walker, L., Hollis, J., & Thomas, H. (2012). Evaluation of student engagement with a lecture capture system. Journal of Adult Learning Aotearoa New Zealand, 40(1), 79-91.
25 Preston, G., Phillips, R., Gosper, M., McNeill, M., Woo, K., & Green, D. (2010). Web-based lecture technologies: Highlighting the changing nature of teaching and learning. Australasian Journal of Educational Technology, 26(6), 717-728.
26 Rogers, R. R. H., & Cordell, S. (2011). An examination of higher education students' opinions of the lecture capture system Tegrity. Journal of Technology Integration in the Classroom, 3(1), 75-90.
27 Shaw, G. P., & Molnar, D. (2011). Non-native English language speakers benefit most from the use of lecture capture in medical school. Biochemistry and Molecular Biology Education, 39(6), 416-420.
28 Smith, C. M., & Sodano, T. M. (2011). Integrating lecture capture as a teaching strategy to improve student presentation skills through self-assessment. Active Learning in Higher Education, 12(3), 151-162.
29 Stroup, M. D., Pickard, M. M., & Kahler, K. E. (2012). Testing the effectiveness of lecture capture technology using prior GPA as a performance indicator. Teacher-Scholar, 4(1), 43-54.
30 Tan, K. T., Wong, E., & Kwong, T. (2011). Piloting lecture capture: An experience sharing from a Hong Kong university. In Communications in Computer and Information Science Conference Proceedings, CCIS:177, 268-279.
31 Toppin, I. N. (2011). Video lecture capture (VLC) system: A comparison of student versus faculty perceptions. Education and Information Technologies, 16(4), 383-393. doi:10.1007/s10639-010-9140-x
32 Turney, C. S. M., Robinson, D., Lee, M., & Soutar, A. (2009). Using technology to direct learning in higher education: The way forward? Active Learning in Higher Education, 10(1), 71-83. doi:10.1177/1469787408100196
33 Vajoczki, S., Watt, S., Marquis, N., Vine, M., & Liao, R. (2011). Students approach to learning and their use of lecture capture. Journal of Educational Multimedia and Hypermedia, 20(2), 195-214.
34 Veeramani, R., & Bradley, S. (2008). U-W Madison online-learning study: Insights regarding undergraduate preference for lecture capture. (http://www.uwebi.org/news/uw-online-learning.pdf)
35 von Konsky, B. R., Ivins, J., & Gribble, S. J. (2009). Lecture attendance and web based lecture technologies: A comparison of student perceptions and usage patterns. Australasian Journal of Educational Technology, 25(4), 581-595.
36 Zhu, E., & Bergom, I. (2010). Lecture capture: A guide for effective use. University of Michigan Center for Research on Learning and Teaching, Report 27. (http://www.crlt.umich.edu/publinks/CRLT%5fno27.pdf)

By Thomas W. Sloan and David A. Lewis

Thomas W. Sloan is an Associate Professor in the Manning School of Business at the University of Massachusetts Lowell. He teaches courses on management science, operations management, and supply chain management. Dr. Sloan's research interests are in the areas of operations management and using technology to enhance learning.

David A. Lewis is a Professor of Operations and Information Systems. He teaches courses on operations management, quality control, international management, and management information systems. Professor Lewis's research interests include total quality control, using technology in the classroom, and distance learning. He has been teaching online courses for 18 years.
