Time-Lapse Vision: Localizing, Calibrating, and Using Thousands of Outdoor Webcams

Dr. Nathan Jacobs, Computer Science Department, Washington University


The web has an enormous collection of live cameras that capture images of roads, beaches, cities, mountains, buildings, and forests. Over the last five years, I have worked to understand how to effectively use this massively distributed, scalable, and already existing network of cameras as a new global sensor. I created AMOS, the Archive of Many Outdoor Scenes, which has recorded imagery from 1,000 cameras every half hour for the last four years; the dataset recently passed 50 million images. The sheer number of cameras and their broad spatial distribution motivate new questions in computer vision. How can you automatically geo-locate each camera? How can you learn the 3D structure of the scene? How can you correct the color measurements from the camera? What environmental properties can you extract?

I address these questions using models of the geometry and statistics of natural outdoor scenes, together with an understanding of how image appearance varies due to transient objects, the weather, the time of day, and the season. I will present algorithms for camera calibration and scene annotation that formalize this understanding within classical linear and nonlinear optimization frameworks. Unlike most work in computer vision, which relies on single images or short videos, these algorithms use images captured over long temporal scales, ranging from minutes to years. By exploring natural cues that work over such long timescales on a broad range of scenes, I hope to deepen our understanding of the interplay between geographic location, time, and the natural causes of image appearance change. This will open new opportunities in large-scale surveillance for security and environmental monitoring.
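To give a flavor of how long-timescale natural cues can geo-locate a camera, the sketch below illustrates one of the simplest such cues: on a clear day, the brightest frame in a day of time-lapse imagery roughly marks local solar noon, and solar noon shifts by one hour per 15 degrees of longitude relative to 12:00 UTC. This is a hypothetical, simplified illustration (it ignores the equation of time and weather), not the specific algorithm presented in the talk; the function name and inputs are invented for this example.

```python
from datetime import datetime, timedelta

def estimate_longitude(frame_times_utc, mean_intensities):
    """Crude longitude estimate from the UTC time of the brightest frame.

    Assumes a clear day, so that peak mean image intensity approximates
    local solar noon. Solar noon falls at 12:00 UTC on the prime meridian
    and arrives one hour later for every 15 degrees west of it (the
    equation of time, up to ~16 minutes, is ignored here).
    """
    brightest = max(range(len(mean_intensities)),
                    key=mean_intensities.__getitem__)
    t = frame_times_utc[brightest]
    hours_utc = t.hour + t.minute / 60.0
    # Degrees east of Greenwich; negative values are west.
    return (12.0 - hours_utc) * 15.0

# Synthetic example: 48 frames at half-hour intervals, with mean
# intensity peaking at frame 34 (17:00 UTC).
times = [datetime(2024, 6, 21) + timedelta(minutes=30 * i) for i in range(48)]
intensities = [-abs(i - 34) for i in range(48)]
print(estimate_longitude(times, intensities))  # -75.0, roughly the US East Coast
```

Averaging such per-day estimates over months of imagery is what makes a noisy cue like this usable; the same principle, applied with more careful models, underlies calibration from year-long image sequences.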


Host: Professor J. Griffioen