Autofocus for Event Cameras
Shijie Lin
Yinqiang Zhang
Lei Yu
Bin Zhou
Xiaowei Luo
Jia Pan
[Paper]
[Appendix]
[GitHub]
[Dataset]
[Youtube]
[Bilibili]
Our event-based autofocus system consists of an event camera and a motorized varifocal lens. It leverages the proposed event-based focus measure and search method to drive the camera to the optimal focal position. When properly focused, the event camera's imaging result (b) is sharper and more informative than in the defocused cases (a) and (c).

Abstract

Focus control (FC) is crucial for cameras to capture sharp images in challenging real-world scenarios. Autofocus (AF) facilitates FC by automatically adjusting the focus settings. However, due to the lack of effective AF methods for the recently introduced event cameras, their FC still relies on naive approaches such as manual focus adjustment, leading to poor adaptation in challenging real-world conditions. In particular, the inherent differences between event and frame data in terms of sensing modality, noise, and temporal resolution pose many challenges for designing an effective AF method for event cameras. To address these challenges, we develop a novel event-based autofocus framework consisting of an event-specific focus measure called event rate (ER) and a robust search strategy called event-based golden search (EGS). To verify the performance of our method, we have collected an event-based autofocus dataset (EAD) containing well-synchronized frames, events, and focal positions in a wide variety of challenging scenes with severe lighting and motion conditions. Experiments on this dataset and in additional real-world scenarios demonstrate the superiority of our method over state-of-the-art approaches in terms of efficiency and accuracy.
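
As a rough illustration of the event rate (ER) idea, the sketch below simply counts events inside a short time window and normalizes by its duration. The function name, array layout, and toy data are our own assumptions for this page, not the released implementation or dataset format.

import numpy as np

def event_rate(timestamps, t_start, t_end):
    """Illustrative event-rate (ER) focus proxy: events per second
    inside the time window [t_start, t_end).

    timestamps : 1-D NumPy array of event timestamps in seconds
                 (assumed layout, not the released data format).
    """
    in_window = (timestamps >= t_start) & (timestamps < t_end)
    duration = max(t_end - t_start, 1e-9)  # guard against a zero-length window
    return int(in_window.sum()) / duration

# Toy usage with synthetic timestamps (seconds); near the in-focus
# position the sensor typically fires more events, hence a higher ER.
ts = np.sort(np.random.uniform(0.0, 0.1, size=5000))
print(event_rate(ts, 0.0, 0.05))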


Pipeline

(a) Our event-based autofocus system first sweeps the focal position from minimum to maximum to collect event data. (b) The system then applies the event-based golden search (EGS), in cooperation with the event-based focus measure, to find the optimal focal position, and (c) adjusts the lens accordingly. A generic search sketch is given below.
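
For intuition only, here is a minimal golden-section search over focal positions that maximizes a unimodal focus score. It is a generic sketch under our own naming (golden_search_focus, measure), not the authors' EGS implementation, which additionally handles event collection during the sweep.

# 1/phi, the golden-section ratio used to shrink the search bracket.
INV_PHI = (5 ** 0.5 - 1) / 2

def golden_search_focus(measure, lo, hi, tol=1.0):
    """Golden-section search for the focal position maximizing a
    unimodal focus score.

    measure : callable mapping a focal position (e.g. a motor step)
              to a focus score, such as the event rate measured there.
    lo, hi  : bounds of the focal position range.
    tol     : stop once the bracket is narrower than this.
    """
    a, b = float(lo), float(hi)
    c = b - INV_PHI * (b - a)   # lower interior probe
    d = a + INV_PHI * (b - a)   # upper interior probe
    fc, fd = measure(c), measure(d)
    while (b - a) > tol:
        if fc > fd:             # maximum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - INV_PHI * (b - a)
            fc = measure(c)
        else:                   # maximum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + INV_PHI * (b - a)
            fd = measure(d)
    return (a + b) / 2          # centre of the final bracket

In an actual AF loop, the measure callback would move the lens to the candidate position and return the ER computed from the events recorded there; that callback and the motor-step convention are assumptions of this sketch.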

CVPR 2022 Oral Talk


Supplementary Video


Dataset

Demo

From left to right, our system accurately focuses the camera in low light (marked by gray) in two situations, with and without scene motion.


Citation

Please cite our work if you find the dataset or our code helpful.
@inproceedings{lin2022autofocus,
	title={Autofocus for Event Cameras},
	author={Lin, Shijie and Zhang, Yinqiang and Yu, Lei and Zhou, Bin and Luo, Xiaowei and Pan, Jia},
	booktitle={The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
	year={2022},
}



Acknowledgements

This project is supported by HKSAR RGC GRF 11202119, 11207818, T42-717/20-R, HKSAR Technology Commission under the InnoHK initiative, National Natural Science Foundation of China, Grant 61871297, and the Natural Science Foundation of Hubei Province, China, Grant 2021CFB467.