Paving the way for a world with autonomous vehicles

A collaborative research project at the University of Illinois aims to help the world become better prepared for autonomous vehicles.

Written by Emily Scott

Ramakrishnan Narayanan, a graduate student in industrial engineering, has spent the majority of his time at ISE working on a project that aims to create a connected infrastructure linking autonomous cars to their surroundings.

Ramakrishnan Narayanan, graduate student at ISE.

Working with Professor Richard Sowers, Professor Daniel Work, the Department of Mathematics, and the Illinois Geometry Lab, Narayanan and his fellow researchers have created an image recognition algorithm that can identify the objects surrounding a car.

They recorded video with an on-board camera mounted on a car and collected additional data from the car's built-in computer.

Later, they processed the video through the image recognition algorithm, which identified the objects seen in the video: people, other cars, buildings, traffic lights, and more.

Their goal is to develop a product that can do this in real time.

Doing this would turn any car into a semi-autonomous vehicle.
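The article does not describe the group's software in detail, but as a rough, hypothetical sketch of the frame-by-frame detection step, running an off-the-shelf, COCO-pretrained detector over recorded dash-cam video in Python might look like this (the file name, model choice, and confidence threshold are assumptions, not the group's actual system):

```python
# Hypothetical sketch: frame-by-frame object detection on dash-cam video
# using an off-the-shelf pretrained detector (not the group's actual system).
import cv2                      # video decoding
import torch
import torchvision

# COCO-pretrained Faster R-CNN; the research group's own model is not public.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

cap = cv2.VideoCapture("dashcam.mp4")   # hypothetical input file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # BGR (OpenCV) -> RGB tensor in [0, 1], shape (3, H, W)
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        detections = model([tensor])[0]
    # Keep confident detections: people, cars, traffic lights, and so on.
    for box, label, score in zip(detections["boxes"],
                                 detections["labels"],
                                 detections["scores"]):
        if score > 0.6:
            print(int(label), [round(v.item(), 1) for v in box])
cap.release()
```

Doing the same thing on a live camera feed rather than a recorded file is what "real time" would mean in practice, which is where the engineering challenge lies.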

“We got the idea of building a system that could essentially make use of all the information available on cars today,” Narayanan says.

Today, many cars have built-in sensors and information retrieval systems that allow manufacturers to collect data on the speed and direction of travel.

The group wanted to make use of this kind of data to help autonomous driving systems make better decisions.
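The article does not say how the group read data from the car's built-in computer; one common route is the OBD-II diagnostic port. A minimal, purely illustrative polling sketch using the python-obd library might look like this:

```python
# Hypothetical sketch: polling speed and RPM from a car's OBD-II port with
# the python-obd library; the article does not specify how the group
# actually read the car's built-in computer.
import obd

connection = obd.OBD()          # auto-detects a USB/Bluetooth OBD-II adapter

speed = connection.query(obd.commands.SPEED)   # vehicle speed
rpm = connection.query(obd.commands.RPM)       # engine RPM

if not speed.is_null():
    print("Speed:", speed.value)   # a unit-tagged quantity, e.g. km/h
if not rpm.is_null():
    print("RPM:", rpm.value)
```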

“Our vision for the future was that it would take about 30 to 50 years for a majority of cars to become autonomous, if at all. But in between, we have the gap where a few of the cars are autonomous and a few are not,” Narayanan says. “So what we need at that time is for cars to talk to each other.”

Their goal became to build a system that can connect cars to each other and to the surrounding infrastructure.

“We want cars to talk to the traffic lights, to pedestrian crossings, to all of that,” Narayanan says.

Doing this could allow cities to analyze real-time traffic situations, measure pedestrian density, and more.

Large cities often conduct a pedestrian count every four to five years, which influences the timing of pedestrian crossing signals. This kind of system could allow cities to do that in real time.

“All of this ties into making your cities prepared to have a lot of autonomous cars on the street, or to have a lot of cars that are connected to each other,” Narayanan says. “The next big thing is not the autonomous car, but a car that is a lot more informed than it used to be.”

Even if only 20 percent of cars on the road are autonomous within the next ten years, cities and roadways will still need significant modification to accommodate them.

The best way to do that? Ask the car.

“You have a car that’s collecting a lot of information, and giving you a brief picture of what’s around it, so you can build and modify your cities or your infrastructure to reflect that,” Narayanan says. “The long term goal of our system is to have a connected architecture that can give you real-time information on the current traffic and pedestrian density.”  

The algorithm can identify surrounding objects such as vehicles, buildings, and people.

There are also several potential spinoff applications. For example, Narayanan says they are working on getting the algorithm to recognize bicyclists who are wearing helmets, or to recognize specific cars as they go through intersections.

“It’s all about having this entire infrastructure and your traffic talking to each other, giving each other information in real time about what the situation is,” he says.

The system could also provide insight on human behavior. Currently, many insurance companies install devices on cars to track the driver’s behavior — how fast you’re going, how hard you brake — and determine the cost of the driver’s insurance.

“This can be version two of that,” Narayanan says. “You need more in-depth information on when these things happen. Then you can tell if someone’s an overly cautious driver and is at a greater risk.”

There is still much to be explored with autonomous vehicles, and Narayanan says this project is a step into largely uncharted territory.

“This is something new that, in my knowledge, has not been done a lot before,” he says. “There really hasn’t been research on using data, not just from your cameras but from other sensors within the car, and combining that to build a bigger picture.”

However, as this research continues, he says it will become more difficult to expand the abilities of their image recognition algorithms.

Currently, the algorithms are able to identify objects after being trained with a data set of stock images.

If you want the algorithm to identify a person, for example, you feed it images of different people until it learns what a face looks like.
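As a minimal illustration of that kind of supervised training (not the group's actual pipeline), fine-tuning a pretrained image classifier on a folder of labeled stock images with PyTorch and torchvision could look something like this; the folder name, class layout, and hyperparameters are assumptions:

```python
# Hypothetical sketch: training an image classifier on a folder of labeled
# "stock" images (one sub-folder per class), as an illustration of the kind
# of supervised training described above.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Assumed layout: training_images/person/, training_images/car/, ...
dataset = datasets.ImageFolder("training_images/", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a pretrained network and replace the final layer to match
# the number of classes in the training folder.
model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Recognizing broad categories this way is the comparatively easy part; the harder step Narayanan describes next is getting the system to distinguish specific instances.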

“That’s easy to do in the current state, but what’s difficult to do is make it identify specific things,” Narayanan says. “That’s where I think we’ll have the greatest challenge.”

Coming into this project, Narayanan says he had little experience in this area of research, but his interest in autonomous cars has allowed him to become proficient.

“I’m a mechanical engineer and I was really interested in automobiles, so this whole idea of an autonomous car was really interesting to me,” he says. “I thought it was a good combination of my interests.”

He hopes he can apply what he’s learned to a career in the industry.

“I want to be able to use machine learning and big data to build models that give you bigger, better insights into making decisions,” he says. “It’s only limited by what you can think of. Machine learning is essentially a way of teaching a machine how to recognize something, and that recognition need not stop at images or something specific. It can be used in so many different applications, and I see myself working on something like that in the future.”

This story was published May 8, 2017.