Tracking Human Expression Actions in Lectures
Tilmann Steinberg, Li Shen, Ling Cheng, Fillia Makedon, Dartmouth College, United States
EdMedia + Innovate Learning, Montreal, Canada. ISBN 978-1-880094-40-2. Publisher: Association for the Advancement of Computing in Education (AACE), Waynesville, NC.
Recorded multimedia presentations, in the form of lectures, conference proceedings, commercial demonstrations, or tutorials, are becoming increasingly available in diverse and often heterogeneous formats. These materials are stored in databases or digital libraries where the user can search by a variety of mechanisms, ranging from keyword search to content-based queries. Tracking human expression actions is a valuable tool for mining information because it adds significance to the materials being searched. In this paper, we focus on tracking the presenter's pointing activity during a lecture and show how this information can give the user an improved overview of the lecture, allowing quick navigation to the points the user finds most interesting. The strength of this approach lies in the ability to correlate the lecturer's pointing actions with other data streams in the recorded lecture, such as audio amplitude, gesture motion, slide presentation, or text. In the absence of complete transcripts or manually added annotations, which are expensive to generate given the quantity of recorded lectures, the level and type of the presenter's interaction with the material can be analyzed to yield likely points of interest.
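The abstract's idea of correlating pointing activity with another data stream to surface likely points of interest can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual algorithm: the fixed window length, the equal weighting of the two signals, and the function name `interest_scores` are all assumptions made for the example.

```python
def interest_scores(pointer_times, audio_amplitude, window=10.0):
    """Score fixed-length time windows of a lecture recording.

    pointer_times: timestamps (seconds) of pointer events by the presenter.
    audio_amplitude: one mean amplitude value in [0, 1] per window.
    Returns one score per window, combining normalized pointer-event
    count with audio amplitude (equal weights, an assumption).
    """
    n_windows = len(audio_amplitude)
    counts = [0] * n_windows
    for t in pointer_times:
        i = int(t // window)  # map each pointer event to its window
        if 0 <= i < n_windows:
            counts[i] += 1
    max_count = max(counts) or 1  # avoid division by zero
    return [0.5 * (c / max_count) + 0.5 * a
            for c, a in zip(counts, audio_amplitude)]

# Example: three 10-second windows; the presenter points most in the
# second window, which also has the loudest audio, so it scores highest.
scores = interest_scores([12.0, 14.5, 18.2, 25.0], [0.2, 0.9, 0.4])
best_window = scores.index(max(scores))  # → 1
```

A navigation interface could then jump the user directly to the highest-scoring windows; in practice the paper's approach correlates several streams (gesture motion, slide changes, text) rather than just the two used here.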
Steinberg, T., Shen, L., Cheng, L. & Makedon, F. (2000). Tracking Human Expression Actions in Lectures. In J. Bourdeau & R. Heller (Eds.), Proceedings of ED-MEDIA 2000--World Conference on Educational Multimedia, Hypermedia & Telecommunications (pp. 1090-1095). Montreal, Canada: Association for the Advancement of Computing in Education (AACE).
© 2000 Association for the Advancement of Computing in Education (AACE)