Self-driving cars could account for 21 million new vehicles sold every year by 2035. Over the next decade alone, such vehicles — and vehicles with assisted-driving technology — could deliver $1 trillion in societal and consumer benefits due to their improved safety.
For autonomous vehicles to make good on that promise, they will need onboard artificial intelligence (AI) technology able to link them to highly detailed maps that reflect every change in the status of lanes, hazards, obstacles, and speed limits in real time.
Researchers at the NYU Tandon School of Engineering are making this critical machine-to-machine handshake possible. Yi Fang, a research assistant professor in the Department of Electrical and Computer Engineering and a faculty member at NYU Abu Dhabi, and Edward K. Wong, an associate professor in the NYU Tandon Department of Computer Science and Engineering, are developing a deep learning system that will allow self-driving cars to navigate, maneuver, and respond to changing road conditions by matching data from onboard sensors to information on HERE HD Live Map, a cloud-based service for automated driving. The NYU Multimedia and Visual Computing Lab, directed by Professor Fang, will house the collaborative project.
Fang and Wong recently received gift funding from HERE, a global leader in mapping and location-based services owned by Audi, BMW, Daimler, and Intel, with Tencent and NavInfo of China and GIC of Singapore also poised to become investors during 2017. NYU Tandon is one of HERE's first university research and development partners for HERE HD Live Map.
High-definition maps meant for machine-to-machine communication must be accurate to within 10 to 20 centimeters. Self-driving vehicles need to continuously update, or register, their location on these maps with an equally high degree of accuracy, according to Fang, who said the goal of the collaborative research is to enhance car-to-map precision to within 10 centimeters.
To read the full release, please visit HERE.