NEC can be used in Medical AR Assistance and 3D Reconstruction. (Image: HKU)

A research team led by Professor Jia Pan and Professor Yifan Evan Peng of the Department of Computer Science and the Department of Electrical & Electronic Engineering, Faculty of Engineering, at the University of Hong Kong (HKU), in collaboration with a researcher at the Australian National University, has developed a neuromorphic exposure control (NEC) system that revolutionizes machine vision under extreme lighting variations. Published in Nature Communications, this biologically inspired system mimics human peripheral vision to achieve unprecedented speed and robustness in dynamically lit perception environments.

Traditional automatic exposure (AE) systems rely on iterative image feedback, creating a chicken-and-egg dilemma that fails under sudden brightness shifts (e.g., tunnels, glare). The NEC system solves this by integrating event cameras, sensors that capture per-pixel brightness changes as asynchronous "events," with a novel Trilinear Event Double Integral (TEDI) algorithm. The approach processes 130 million events per second on a single CPU, enabling edge deployment.
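To make the feedback-free idea concrete, here is a minimal sketch (not the authors' TEDI algorithm; the contrast threshold, function names, and update rule are illustrative assumptions): each event signals a signed step in log intensity at one pixel, so summing signed polarities gives a rough estimate of the net brightness change, which can drive the next exposure setting directly instead of waiting for image feedback.

```python
import numpy as np

# Per-event log-intensity step; sensor-dependent (assumed value here).
CONTRAST_THRESHOLD = 0.2

def estimate_log_intensity_change(polarities):
    """Each event is a +/- step in log intensity at one pixel.
    Summing the signed steps approximates the net brightness change."""
    return CONTRAST_THRESHOLD * float(np.sum(polarities))

def next_exposure(current_exposure_us, polarities, gain=1.0):
    """Scale exposure inversely with the estimated brightness change:
    a brighter scene (positive net change) calls for a shorter exposure."""
    delta_log_intensity = estimate_log_intensity_change(polarities) * gain
    return current_exposure_us * np.exp(-delta_log_intensity)

# Example: a burst of mostly positive events (scene got brighter),
# so the proposed exposure drops below the current 10 ms.
polarities = np.array([+1, +1, +1, -1, +1, +1])
new_exp = next_exposure(10_000.0, polarities)
```

The key property this sketch shares with NEC is that the update depends only on the event stream, not on analyzing a previously captured (and possibly already ruined) frame.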

"Like how our pupils instantly adapt to light, NEC mimics the biological synergy between retinal pathways," said first author Shijie Lin. "By fusing event streams with physical light metrics, we bypass traditional bottlenecks to deliver lighting-agnostic vision."

In tests, the team validated NEC across four mission-critical scenarios:

  1. Autonomous Driving: Improved detection accuracy (mAP +47.3 percent) when vehicles exit tunnels into blinding sunlight.
  2. Augmented Reality (AR): Achieved 11 percent higher pose-estimation accuracy (PCK) for hand tracking under surgical lights.
  3. 3D Reconstruction: Enabled continuous SLAM in overexposed environments where conventional methods fail.
  4. Medical AR Assistance: Maintained clear intraoperative visualization despite dynamic spotlight adjustments.

Professor Jia Pan said, "This breakthrough represents a significant leap in machine vision by bridging the gap between biological principles and computational efficiency. The NEC system not only addresses the limitations of traditional exposure control but also paves the way for more adaptive and resilient vision systems in real-world applications, from autonomous vehicles to medical robotics."

NEC Core: Achieve rapid and efficient exposure control by breaking loop dependency with neuromorphic events. (Image: HKU)

Peng commented, "Our collaborative work has been instrumental in pushing the boundaries of neuromorphic engineering. By leveraging event-based sensing and bio-inspired algorithms, we’ve created a system that is not only faster but also more robust under extreme conditions. This is a testament to the power of interdisciplinary research in solving diverse complex engineering challenges."

In the long term, the NEC paradigm offers a novel event-frame processing scheme that reduces the processing burden of high-resolution events and images and incorporates biologically plausible principles into the low-level control of machine vision hardware. This opens new avenues for camera design, system control, and downstream algorithms. The team's success in embodying neuromorphic synergy in various systems is a milestone that can inspire many optical, image, and neuromorphic processing pipelines, and it carries direct economic and practical implications for industry.

Here is an exclusive Tech Briefs interview, edited for length and clarity, with Lin, Pan, and Peng of the Department of Computer Science, The University of Hong Kong.

Tech Briefs: What was the biggest technical challenge you faced while developing this NEC system?

Team: The goal of the NEC system is to use event cameras to achieve ultra-high-speed light perception, enabling accurate exposure for frame imaging at high speeds. One major challenge was the massive data output from event cameras, which can produce hundreds of millions of events per second in typical scenarios. This posed significant computational challenges, especially for real-time processing on low-power edge devices. To overcome this, we developed a completely new integration framework. This framework not only compensates for the quantization errors inherent in event cameras but also achieves ultra-high computational efficiency for billions of events. This breakthrough made the NEC system possible.

NEC can be applied on moving vehicles and headsets. (Image: HKU)

Tech Briefs: Can you explain in simple terms how it works?

Team: NEC uses event cameras to quickly estimate changes in light intensity in a scene. It then adjusts the camera's exposure parameters in real time. This ensures that the camera's settings are always optimized to capture clear images.
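In code, that real-time adjustment amounts to a simple per-frame loop. The sketch below is a hypothetical illustration (constants, names, and limits are assumptions, not the published implementation): between consecutive frames, accumulate the signed event polarities, infer the brightness trend, and update the exposure before the next capture, clamped to the sensor's hardware limits.

```python
import numpy as np

# Assumed sensor limits and per-event contrast step (illustrative values).
EXPOSURE_MIN_US, EXPOSURE_MAX_US = 50.0, 30_000.0
CONTRAST_THRESHOLD = 0.2

def update_exposure(exposure_us, event_polarities):
    """One control step: events observed since the last frame drive the
    exposure for the next frame, with no image-feedback iteration."""
    # Net signed event count approximates the log-intensity change.
    delta = CONTRAST_THRESHOLD * float(np.sum(event_polarities))
    # Brighter scene -> shorter exposure; clamp to hardware limits.
    proposed = exposure_us * np.exp(-delta)
    return float(np.clip(proposed, EXPOSURE_MIN_US, EXPOSURE_MAX_US))

# Simulated inter-frame event bursts: brightening, roughly steady, dimming.
exposure = 5_000.0
for burst in ([+1] * 8, [+1] * 3 + [-1] * 2, [-1] * 10):
    exposure = update_exposure(exposure, np.array(burst))
```

Because each update uses only the events recorded since the previous frame, the loop keeps pace with abrupt lighting changes rather than converging over several badly exposed frames.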

Tech Briefs: Do you have any set plans for next steps?

Team: Our next step is to collaborate with autonomous driving companies to integrate this technology into every car camera. The goal is to enable autonomous vehicles to capture clear images even in extremely challenging lighting conditions, improving safety and saving lives.

Tech Briefs: Is there anything else you’d like to add that wasn’t touched upon?

Team: For a long time, automatic exposure control has been a relatively overlooked research area. However, it’s critically important because every camera relies on automatic exposure to adjust its settings and provide clear images. The NEC algorithm is the result of years of collective effort in this field. I’d like to take this opportunity to thank all the researchers who have worked tirelessly in this area.

Tech Briefs: Do you have any advice for researchers aiming to bring their ideas to fruition?

Team: NEC is just the beginning. I believe future exposure control algorithms will be hybrid and diverse. By leveraging the high-speed characteristics of event data, we can introduce spiking neural networks and design more sophisticated adjustment mechanisms to achieve even better control. I invite everyone to join us in this exciting journey. The code for this project has already been open-sourced, and we welcome others to use it.