Faculty Mentor

Christopher Tralie


Presented during the 23rd Annual Summer Fellows Symposium, July 23, 2021, at Ursinus College.

Presented as part of the Mathematics REU program supported by the National Science Foundation (NSF), grant number 1851948.

Project Description

Ordinary videos capture a surprising amount of hidden, visually imperceptible information. For instance, videos of people's faces may capture color changes in the skin and artery motion from heartbeats, while videos of mechanical systems can capture subtle vibrations indicating imminent failure. Algorithms can extract and exaggerate these signals for visualization on top of the original videos. In particular, Eulerian magnification algorithms sidestep the need to track hidden motions directly and instead devise multiscale bandpass filters to amplify signals in local spatial regions. In this work, we extend these techniques beyond color videos to geometric video data captured by 3D depth cameras such as the Microsoft Kinect. In our framework, we can spatially amplify a "bulging of the neck" during a heartbeat or the expansion of a chest/abdomen during a breath. We then exaggerate and display these signals as evolving 3D shapes. We explore a pipeline based on an implicit surface representation, in which we reconstruct the object in 3D space using multiple camera angles captured by one camera moving around the subject. We discuss the merits, drawbacks, and challenges of this representation compared to ordinary color videos.
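The core Eulerian idea the description refers to can be sketched in a few lines: rather than tracking motion, treat each spatial location's value over time as a 1D signal, bandpass it around the frequency band of interest (e.g., resting heart rates), and add the amplified result back onto the input. The sketch below is a simplified illustration under assumed inputs, not the project's actual pipeline; in particular, the full method applies this per level of a multiscale (pyramid) decomposition, and the function name, parameters, and synthetic data here are hypothetical.

```python
import numpy as np

def eulerian_magnify(frames, fps, low_hz, high_hz, alpha):
    """Amplify subtle temporal variations in a video volume.

    frames: float array of shape (T, H, W) -- per-location signal over time
        (color intensity, or depth values from a sensor like the Kinect).
    fps: sampling rate in frames per second.
    low_hz, high_hz: temporal passband; 0.8-2.0 Hz roughly brackets
        resting heart rates.
    alpha: amplification factor for the bandpassed signal.
    """
    T = frames.shape[0]
    # Ideal temporal bandpass: zero out FFT bins outside [low_hz, high_hz].
    freqs = np.fft.rfftfreq(T, d=1.0 / fps)
    spectrum = np.fft.rfft(frames, axis=0)
    keep = (freqs >= low_hz) & (freqs <= high_hz)
    spectrum[~keep] = 0.0
    bandpassed = np.fft.irfft(spectrum, n=T, axis=0)
    # Eulerian step: add the amplified hidden signal back onto the input.
    return frames + alpha * bandpassed

# Tiny synthetic example: a faint 1 Hz "pulse" hidden in a static scene.
fps = 30
t = np.arange(90) / fps
pulse = 0.001 * np.sin(2 * np.pi * 1.0 * t)
frames = 0.5 + pulse[:, None, None] * np.ones((90, 4, 4))
out = eulerian_magnify(frames, fps, low_hz=0.8, high_hz=2.0, alpha=50.0)
```

Because the filtering happens independently at each spatial location, the same machinery applies whether the per-pixel signal is color or geometry, which is what makes the extension to depth video natural.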

Open Access

Available to all.