HOUSTON – (March 2, 2026) – Autonomous vehicles (AVs) are becoming increasingly common on roadways, but making them as safe as possible may require looking beyond the specs of the vehicles themselves to the roadway infrastructure around them.
EyeDAR, a low-power millimeter-wave radar sensor roughly the size of an orange, could provide radar-equipped AVs with critical inputs about surrounding traffic, extending the vehicles’ sensing range and enhancing their accuracy.
Placed at key points such as streetlights and intersections, these low-profile, inexpensive sensors could help ensure that AVs pick up on emerging obstacles, even when those obstacles are out of range of the vehicles’ onboard sensors or when visibility is severely limited.
Kun Woo Cho, a postdoctoral researcher at Rice University who leads the EyeDAR research project, introduced the technology at HotMobile, the International Workshop on Mobile Computing Systems and Applications, which took place in Atlanta Feb. 25-26.
“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” said Cho, who works in the lab of Ashutosh Sabharwal, Rice’s Ernest Dell Butcher Professor of Engineering and professor of electrical and computer engineering. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”
Radar systems transmit signals in a given direction, and when a signal encounters an obstacle in its path, part of it reflects back to the source, carrying information about the obstacle. However, only a small fraction of the emitted radar signal is reflected back; most of it bounces away from the source device.
In the context of self-driving vehicles, this means that a large fraction of the radar signal their sensing stack emits scatters away from the vehicle, leaving them with an incomplete view of their surroundings. Pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed.
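The geometry at work is ordinary specular reflection: the echo leaves a surface at the mirror angle, so unless the surface faces the radar nearly head-on, the reflection departs away from the emitting vehicle. A minimal sketch (illustrative only, not the team’s model) makes this concrete:

```python
import numpy as np

def reflect(incident, normal):
    """Specular reflection: r = d - 2(d.n)n, with n a unit surface normal."""
    n = normal / np.linalg.norm(normal)
    return incident - 2 * np.dot(incident, n) * n

# A radar beam leaving the vehicle along +x hits a surface tilted 30 degrees
# away from head-on; the echo departs at twice that angle off the return path.
d = np.array([1.0, 0.0])
theta = np.radians(30)
n = np.array([-np.cos(theta), np.sin(theta)])  # tilted surface normal
r = reflect(d, n)
angle_off = np.degrees(np.arccos(np.clip(np.dot(r, -d), -1, 1)))
print(round(angle_off))  # 60: the echo misses the source entirely
```

Only a surface oriented perpendicular to the beam (theta = 0) sends the echo straight back, which is why so much of a vehicle’s radar energy is lost to the roadside, where a sensor like EyeDAR can catch it.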
Thanks to its placement on roadside infrastructure such as traffic lights, stop signs or streetlights, EyeDAR can capture radar reflections that would otherwise be lost. The device’s unique structure allows it to determine the direction of reflected signals and report that information back to self-driving vehicles.
“It is like adding another set of eyes for automotive radar systems,” said Cho, who specializes in metamaterial antenna design.
EyeDAR boasts a simple, elegant design inspired by a highly efficient real-world sensor: the human eye. The device consists of two main components: a 3D-printed resin Luneburg lens, which functions like the lens of the eye, focusing incoming signals from any direction onto a focal point on the opposite surface; and an antenna array surrounding the back of the lens, which functions like a retina, detecting the signal and determining its direction.
Whereas conventional radar systems rely on large antenna arrays and complex algorithms to estimate angles, EyeDAR’s physical design does most of the computational work typically required for direction finding, one of the most power- and data-intensive tasks in radar processing.
“Our lens consists of over 8,000 uniquely shaped, extremely small elements with a varying refractive index,” Cho said.
Through an intentional distribution of these elements, the lens structure steers incoming radar signals from any direction to the corresponding spot on the antenna array. The approach has proved fruitful: In testing, EyeDAR resolved target directions more than 200 times faster than traditional radar designs.
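The focusing behavior described above follows textbook Luneburg-lens optics: the refractive index grades from the square root of 2 at the center to 1 at the rim, and a plane wave arriving from any direction focuses at the diametrically opposite point on the surface, so whichever antenna element lights up directly encodes the angle of arrival. A minimal sketch of that mapping (textbook formulas, not the team’s design files):

```python
import numpy as np

R = 1.0  # lens radius (normalized)

def luneburg_index(r):
    """Textbook Luneburg refractive index profile: n(r) = sqrt(2 - (r/R)^2)."""
    return np.sqrt(2.0 - (r / R) ** 2)

def focal_point(propagation_dir):
    """A plane wave travelling along `propagation_dir` focuses at the
    antipodal surface point R*d on the far side of the lens."""
    d = propagation_dir / np.linalg.norm(propagation_dir)
    return R * d

def direction_of_arrival(lit_element):
    """Invert the mapping: the wave came from the direction opposite
    the element it illuminated -- no array processing required."""
    return -lit_element / np.linalg.norm(lit_element)

print(round(luneburg_index(0.0), 3))  # 1.414 at the center
print(round(luneburg_index(R), 3))    # 1.0 at the rim (matches free space)

d = np.array([0.0, -1.0, 0.0])        # wave travelling downward...
p = focal_point(d)                    # ...illuminates the bottom element
source = direction_of_arrival(p)      # so the source lies overhead
```

The payoff is that the lens performs the angle estimation in the analog domain, which is why no large antenna array or heavy digital beamforming step is needed.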
Moreover, EyeDAR communicates what it sees without transmitting new signals. Instead, the sensor alternates between absorbing incoming radar waves and reflecting them back to the source radar in a form it can interpret as a sequence of 0s and 1s.
“Like blinking Morse code,” Cho said. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”
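Cho’s blinking analogy maps onto simple on-off keying: the tag modulates its reflectivity rather than transmitting, and the vehicle’s radar recovers bits by thresholding echo strength. A toy sketch of that link (the power levels and noise figures below are illustrative placeholders, not measurements from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def backscatter_tx(bits, high=1.0, low=0.1):
    """The tag does not transmit: it toggles its reflectivity, returning a
    strong echo for a '1' (reflect) and a weak one for a '0' (absorb)."""
    return np.array([high if b else low for b in bits])

def radar_rx(echo_power, noise_std=0.05):
    """The radar sees echo power plus receiver noise and recovers the bits
    by thresholding at the midpoint between the two reflectivity states."""
    noisy = echo_power + rng.normal(0.0, noise_std, size=echo_power.shape)
    threshold = (1.0 + 0.1) / 2
    return (noisy > threshold).astype(int).tolist()

message = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = radar_rx(backscatter_tx(message))
print(decoded == message)  # True: the radar reads the tag's "blinks" as bits
```

Because the tag only switches between absorbing and reflecting, the communication rides on the radar signal the vehicle is already sending, which is part of what keeps the device’s power draw so low.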
This combination of sensing and communication in a compact, inexpensive and low-power architecture makes it feasible to deploy large numbers of sensors across roadways. In the case of self-driving cars, the system promises to be especially useful in dense, high-traffic urban settings. However, the potential application space is much wider: EyeDAR could be integrated into robots, drones and wearable platforms. Networks of these sensors could also share information with one another, allowing each device to see well beyond its own range of sight.
Cho said she is particularly interested in what the system represents from a computing standpoint. As autonomous systems increasingly interact directly with people, Cho argues that intelligent physical design will have to complement artificial intelligence.
“EyeDAR is an example of what I like to call ‘analog computing,’” Cho said. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”
The research was supported in part by the National Science Foundation (2346550). The content in this press release is solely the responsibility of the authors and does not necessarily represent the official views of funding entities.
-30-
This news release can be found online at news.rice.edu.
Follow Rice News and Media Relations via Twitter @RiceUNews .
Peer-reviewed paper:
EyeDAR: A Low-Power mmWave Tag that Senses and Communicates 3D Point Clouds to Enhance Radar Perception | HotMobile, the International Workshop on Mobile Computing Systems and Applications
Authors: Kun Woo Cho, Yaxiong Xie and Ashutosh Sabharwal
Video is available at:
https://www.youtube.com/watch?v=B1SWRzGPQJQ
Video by Jared Jones/Rice University
Access associated media files:
https://rice.box.com/s/3tl7ix0n4c5gnvw81bxl3voqkyovql6n
Credit: Photos by Jared Jones/Rice University
About Rice:
Located on a 300-acre forested campus in Houston, Texas, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of architecture, business, continuing studies, engineering and computing, humanities, music, natural sciences and social sciences and is home to the Baker Institute for Public Policy. Internationally, the university maintains the Rice Global Paris Center, a hub for innovative collaboration, research and inspired teaching located in the heart of Paris. With 4,776 undergraduates and 4,104 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 7 for best-run colleges by the Princeton Review. Rice is also rated as a best value among private universities by the Wall Street Journal and is included on Forbes’ exclusive list of “New Ivies.”