Surveillance cameras are all around us, from the convenience store down the street to the grandest casinos in Las Vegas.

Those cameras show what is happening, but they don’t process why it is happening. They need a person to do that. But what if we could program cameras to comprehend certain scenarios – an abandoned suitcase in an airport terminal, for example – and then relay that information wirelessly to other cameras in their network?

Christophe Bobda is making it possible.

Bobda, professor in the Department of Computer Science and Computer Engineering, is adapting existing algorithms to craft forms of “intelligence” in a new wave of smart cameras.

Professor Christophe Bobda in his lab at the University of Arkansas. | Photo by University Relations

“Current camera surveillance systems cannot provide the bandwidth required to transport increasingly higher amounts of data,” Bobda said. “And analyzing terabytes of data in real-time requires computer architectures with capability far beyond the currently available systems.”

Smart cameras address these shortcomings, Bobda said, because they can analyze video data within the camera itself, which significantly decreases the amount of data that needs to be transported.

Bobda’s research team is designing and deploying a set of collaborative, embedded and self-coordinating smart cameras to monitor indoor areas. If successful, their smart cameras could be used in everything from airports to shopping malls, schools and sports venues.

“We don’t transmit the whole image from one camera to the next because this would require an impossible bandwidth,” he said. “The camera is processing the images and finding only a couple of bytes’ worth of features to send to the next camera.”
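To make that idea concrete, here is a minimal sketch, in Python rather than the embedded code an actual smart camera would run, of how a detected object might be reduced to a handful of bytes before being passed to a neighboring camera. The thresholding step, the message layout and the field sizes are illustrative assumptions, not details of Bobda's system.

```python
import struct

import numpy as np

def extract_features(frame, threshold=200):
    """Reduce the brightest blob in a grayscale frame to a tiny
    descriptor: centroid (x, y) and pixel area. A stand-in for the
    real in-camera detection pipeline, which the article does not describe."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return int(xs.mean()), int(ys.mean()), int(xs.size)

def pack_descriptor(camera_id, x, y, area):
    """Serialize the descriptor into an 8-byte message:
    camera id (1 byte), x, y and area (2 bytes each), flags (1 byte)."""
    return struct.pack(">BHHHB", camera_id, x, y, min(area, 0xFFFF), 0)

# Simulated 480x640 grayscale frame with one bright region standing in
# for a detected object (e.g., an abandoned suitcase).
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:240, 300:360] = 255

features = extract_features(frame)
if features is not None:
    message = pack_descriptor(3, *features)
    print(f"raw frame: {frame.nbytes} bytes, message to next camera: {len(message)} bytes")
```

Run on the simulated frame above, the camera would hand its neighbor an 8-byte descriptor instead of roughly 300 kilobytes of raw pixels, which is the bandwidth saving Bobda describes.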

In one National Science Foundation-funded project, Bobda is working with industry partner R-Dex Systems, which specializes in artificial intelligence and machine learning, to design a wireless smart camera network for mobile robots in large indoor environments. The robots, known as automated guided vehicles, follow markers or wires in the floor and are most often used in industrial applications to move materials around a manufacturing facility or warehouse.

The researchers created ceiling-mounted cameras whose overlapping fields of view cover the indoor environment and provide guidance to the robots as they enter each camera's coverage area. Cameras mounted on the robots are used for immediate near-field navigation and for detecting the sudden appearance of objects or people on a track.
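The article does not describe the cameras' coordination protocol, but the idea of overlapping coverage and guidance handoff can be sketched roughly as follows; the coordinates, coverage rectangles and camera IDs are made-up examples.

```python
from dataclasses import dataclass

@dataclass
class CeilingCamera:
    """A ceiling-mounted camera described by the floor rectangle it covers.
    The rectangles are assumed to overlap so there is no gap between cameras."""
    cam_id: int
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def covers(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def guiding_cameras(cameras, robot_xy):
    """Return every camera whose coverage contains the robot; two results
    mean the robot is in an overlap zone and guidance can be handed off."""
    x, y = robot_xy
    return [c.cam_id for c in cameras if c.covers(x, y)]

# Two cameras with a 2-meter overlap along the x axis (coordinates in meters).
cameras = [CeilingCamera(1, 0, 12, 0, 10), CeilingCamera(2, 10, 22, 0, 10)]
print(guiding_cameras(cameras, (5, 4)))    # [1]    -> camera 1 guides the robot
print(guiding_cameras(cameras, (11, 4)))   # [1, 2] -> overlap zone, handoff point
print(guiding_cameras(cameras, (18, 4)))   # [2]    -> camera 2 takes over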

“The advantages of low cost, improved performance and flexibility that will result from this project will help potential customers, including users of industrial robots and forklifts, cut their costs while improving productivity and safety with automation,” Bobda said.

Another application takes place in a more intimate environment: hospital rooms. Bobda is developing a smart camera to prevent falls.

“What we want to do is use a camera as a sensor,” Bobda said. “The camera captures what the patient is doing so if the patient is sleeping too close to the edge of the bed, or trying to sit up, our camera can observe the situation and know exactly what signal to send to the nurse. Or, if the patient has dementia and is moving objects around the room that may present a danger to them, the camera can notify the nurse.”
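As a rough illustration of the "camera as a sensor" idea, a toy rule might compare the patient's tracked position against the edges of the bed and decide when to signal the nurse. The margin, coordinates and message below are hypothetical, not Bobda's actual algorithm.

```python
def fall_risk_alert(patient_center_x, bed_left, bed_right, margin=0.15):
    """Flag a patient whose detected position drifts within `margin`
    (as a fraction of bed width) of either edge of the bed.
    A stand-in for the camera's in-room decision logic."""
    width = bed_right - bed_left
    near_left = patient_center_x - bed_left < margin * width
    near_right = bed_right - patient_center_x < margin * width
    if near_left or near_right:
        return "notify_nurse: patient near edge of bed"
    return None

# Bed spans pixels 120-520 in the image; the patient is tracked at x = 150.
print(fall_risk_alert(150, bed_left=120, bed_right=520))
```

In practice the camera would feed such a rule with positions estimated from video in real time, so only the short alert message, not the video itself, would leave the room.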

Bobda co-edited the 2014 book, Distributed Embedded Smart Cameras: Architectures, Design and Applications.