
Seeing things differently: Robert Pless


Written by Beth Miller

Robert Pless, PhD, in the University City Loop. Pless has developed software that captures a still image from more than 24,000 public webcams worldwide every 30 minutes, including one near the Blueberry Hill restaurant and club.

Webcams have become another constant in our technological society, with views available of natural wonders, stunning vistas or bustling city blocks. But to Robert Pless, images from webcams worldwide are an opportunity to study climate and environmental change and even human behavior.

Pless, PhD, professor of computer science and engineering, is an expert in computer vision, the study of how to interpret and analyze images. He archives images from 24,000 publicly available webcams around the world to draw conclusions about the environment. These aren’t single images from each camera: undergraduate students in Pless’ lab built software, now running for seven years, that captures a still photo from each camera every half hour. What began with a few cameras has grown into a collection of 500 million images from cameras at ski resorts, beaches, national parks, intersections and highways, among other places, in an archive called AMOS, the Archive of Many Outdoor Scenes (amosweb.cse.wustl.edu). The images support research on a variety of subjects and offer an alternative to satellite imagery for studying large-scale environmental change.
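The capture pipeline described above can be sketched as a simple polling loop. This is a minimal sketch, not the actual AMOS code: the camera list, URL, and archive layout here are hypothetical.

```python
import time
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical camera list; AMOS tracks ~24,000 real webcam URLs.
CAMERAS = {
    "cam_0001": "http://example.com/webcam/current.jpg",
}

def snapshot_path(root, camera_id, ts):
    """Archive layout (assumed): one folder per camera, one JPEG per capture."""
    return Path(root) / camera_id / ts.strftime("%Y%m%d_%H%M.jpg")

def capture_once(root, cameras):
    """Grab one still from every camera; skip any camera that is offline."""
    ts = datetime.now(timezone.utc)
    for cam_id, url in cameras.items():
        dest = snapshot_path(root, cam_id, ts)
        dest.parent.mkdir(parents=True, exist_ok=True)
        try:
            urllib.request.urlretrieve(url, dest)
        except OSError:
            pass  # cameras come and go; try again next cycle

def run(root, cameras, interval_s=1800):
    """Poll every 30 minutes, as the article describes."""
    while True:
        capture_once(root, cameras)
        time.sleep(interval_s)
```

Timestamped filenames make each camera's folder a ready-made time series, which is what the analyses below rely on.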

“These cameras are below the clouds, so they can see things that satellites can’t,” Pless says. “More importantly, they see individual trees. Biological change is a global thing, but it happens at the scale of individuals.”

“As public health researchers, we have to go out and observe how people are using spaces. If I were to go to a park and tally what types of activities people are doing, they might change their behavior if they see me sitting there with a clipboard. With the cameras, people know they are there, but they aren’t changing their behavior because of them.” — J. Aaron Hipp, PhD

Satellites average what they see and can’t distinguish the green leaves of a stand of trees from the green grass of summer. So Pless and his collaborators are working to organize public webcams into a network of ground-level sensors to study phenology, the annual life cycles of plants and animals, and to produce more accurate estimates of changes in those cycles. They use only cameras that other people have installed and whose images are already available through those owners’ websites.
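A common ground-level phenology signal from webcam imagery, and one way to track an individual tree, is the green chromatic coordinate: the fraction of green in the pixels covering that tree's crown. This is a standard technique in the field, sketched here; the article does not specify that it is Pless' exact method.

```python
def green_chromatic_coordinate(pixels):
    """Fraction of total brightness contributed by the green channel.

    `pixels` is an iterable of (r, g, b) tuples sampled from a region of
    interest around one tree.  Tracking this value across a year of
    webcam frames traces leaf-out in spring and senescence in fall.
    """
    r = sum(p[0] for p in pixels)
    g = sum(p[1] for p in pixels)
    b = sum(p[2] for p in pixels)
    total = r + g + b
    return g / total if total else 0.0

# Grayish pixels (bare branches) sit near 1/3; leafy pixels rise above it.
winter = [(90, 95, 100), (80, 82, 85)]
summer = [(60, 160, 50), (70, 180, 60)]
```

Because the ratio is computed per camera and per tree, it captures exactly the individual-scale change that satellite averages wash out.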

This method, which Pless calls passive vision, relies on interpreting a series of images over time: looking at a webcam’s output over the course of a year or several years reveals many changes in the environment. To pursue this idea, Pless received a prestigious CAREER Award from the National Science Foundation in 2006.

“We can see every day how much snow is in a particular location and correlate the growth of a tree to how much snow is there, or how long it took the snow to melt, or how wet the field is or the drainage off of the streets,” Pless says. “The idea of taking all of these webcams that are out there and put up by random people for random purposes and organizing them into a resource that can be used for other things excites me.”

Robert Pless

Pless’ other university positions include director of the doctoral program and of graduate admissions, Computer Science & Engineering; Computer Science faculty representative to the Graduate Council; Engineering representative to the Undergraduate Council. Photo by Ron Klein

He can also determine where a camera is in the world by watching the images over time and interpreting the time the sun rises, the length of the day and from which direction the sun shines. By studying the pattern of brightness and darkness during the day, Pless can make 3-D models of the scene.
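The geolocation idea rests on two standard pieces of solar geometry, sketched below under simplifying assumptions (function names are illustrative, and real webcam timestamps are noisier than this): solar noon fixes longitude, and day length plus the sun's declination fixes latitude via the sunrise equation.

```python
import math

def estimate_longitude(solar_noon_utc_hours):
    """The sun is highest at 12:00 UTC on the prime meridian; every hour
    of offset observed in the images corresponds to 15 degrees of
    longitude (positive east, negative west)."""
    return (12.0 - solar_noon_utc_hours) * 15.0

def estimate_latitude(day_length_hours, solar_declination_deg):
    """Invert the sunrise equation cos(H) = -tan(lat) * tan(decl),
    where H is the half-day hour angle; day length gives H directly
    (15 degrees of hour angle per hour)."""
    H = math.radians(day_length_hours * 15.0 / 2.0)
    decl = math.radians(solar_declination_deg)
    return math.degrees(math.atan(-math.cos(H) / math.tan(decl)))
```

For example, a camera whose scene brightens around a solar noon of 18:00 UTC sits near 90 degrees west, and a 15.4-hour day at the June solstice (declination about 23.44 degrees) places it near 45 degrees north.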

Pless is also working with J. Aaron Hipp, PhD, assistant professor of social work in the Brown School, on a project seeking to understand how people use public spaces and built environments, such as parks, sidewalks, crosswalks or bike lanes.

A mutual friend who knew of Hipp’s and Pless’ research interests introduced them after a volleyball game in Forest Park. Hipp was interested in what the AMOS data could reveal about how people use public spaces.

In the fall of 2011, Pless and Hipp received a $25,000 University Research Strategic Alliance (URSA) grant to integrate Pless’ AMOS database images with Hipp’s interest in how people use public spaces. They looked back at six years’ worth of images of six intersections in Washington, D.C., where a bike lane or painted crosswalks had been added during that time frame. They focused on the number of people using the intersections for a month before the crosswalk or bike lane was added and for a month after to determine if activity increased. Analysis of the data showed there was a significant increase in walking and cycling after the crosswalks and bike lanes were added, Hipp says.

Building on that research, the two received a two-year, $275,000 grant from the National Cancer Institute that began Jan. 1, 2014, and will allow them to continue using AMOS images to study public health.

“Public health is about populations and thinking at a larger scale,” Hipp says. “Robert thinks at a much larger scale — instead of thinking about six cameras in D.C., he says that’s like thinking about six trees in a forest. He is good at challenging me to think big.”

“One of the powerful things about computer science is if you build something, you can very easily share it with the entire world, and if what you build is a tool, you can share the actual ability to do something.” — Robert Pless, PhD

Pless also works on a project called rePhoto (projectrephoto.com), an app that allows people to take repeated photographs of something over time. For example, a user could take a photo of one of many trees lining Melville Avenue near the Danforth Campus. A week later, that same person, or someone else, could take another photo of the same tree from exactly the same perspective as a previous image. The rePhoto app shows the previous picture transparently in half of the screen so that a new picture can be accurately aligned. The photos are uploaded to a database, where they can be analyzed for changes.
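The transparent-overlay alignment that rePhoto uses can be sketched as a per-pixel alpha blend. This is the generic compositing formula, not the app's actual code, and images are simplified to lists of RGB tuples.

```python
def overlay(reference, live, alpha=0.5):
    """Composite the reference photo over the live viewfinder image.

    Both images are flat lists of (r, g, b) pixel tuples; `alpha` is the
    opacity of the reference photo.  At alpha=0.5 the user sees both
    scenes at once and can shift the camera until the edges line up.
    """
    return [
        tuple(round(alpha * p + (1 - alpha) * q) for p, q in zip(ref_px, live_px))
        for ref_px, live_px in zip(reference, live)
    ]
```

Because the new photo is framed to match the old one at capture time, later pixel-wise comparison of the pair needs no registration step.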

In another application of computer vision, Pless and members of his lab used their techniques to help with a cold case in 2013. An unidentified young girl was murdered in 1983 and buried in an unmarked grave in a local cemetery. To find the body, Abby Stylianou, a research associate in Pless’ lab, took old newspaper photos from the girl’s funeral and compared them with current aerial images from the U.S. Geological Survey.

Using landmarks in the newspaper photos, such as a tree near the gravesite and a billboard that no longer stands, the team applied algorithms developed in Pless’ lab to find points in the photos and match them to points at the cemetery today, then built a model to determine where in the cemetery the original photos were taken. Based on their modeling, the St. Louis Medical Examiner found the grave 8 inches from where Stylianou said it would be.

“Building tools to let other people take advantage of their creativity leads to endpoints you’d never think of,” Pless says.



Engineering Momentum is published by the School of Engineering & Applied Science at Washington University in St. Louis. Unless otherwise noted, articles may be reprinted without permission with appropriate credit to the publication, school and university.