Event-based Light Field Project


Time-Efficient Light-Field Acquisition Using Coded Aperture and Events

We propose a computational imaging method for time-efficient light-field acquisition that combines a coded aperture with an event-based camera. Unlike conventional coded-aperture imaging, our method applies a sequence of coding patterns during a single exposure for an image frame. The parallax information, which corresponds to the differences among the coding patterns, is recorded as events. The image frame and events, all measured in a single exposure, are jointly used to computationally reconstruct a light field. We also designed an algorithm pipeline for our method that is end-to-end trainable on the basis of deep optics and compatible with real camera hardware. We experimentally showed that our method achieves more accurate reconstruction from a single exposure than several other imaging methods. We also developed a hardware prototype with the potential to complete the measurement on the camera within 22 ms and demonstrated that light fields of real 3-D scenes can be obtained with convincing visual quality.
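The measurement model above can be illustrated with a minimal NumPy sketch. This is not the authors' pipeline; the view count, coding patterns, and event threshold below are invented for illustration. Each sub-aperture view is weighted by a time-varying aperture pattern, the sensor frame integrates all patterns over the exposure, and events fire where the log intensity changes between consecutive patterns.

```python
import numpy as np

# Illustrative sketch only: simulate how one exposure with a time-varying
# coded aperture yields a single image frame plus events.
# All sizes and the contrast threshold here are assumed, not from the paper.

rng = np.random.default_rng(0)

num_views = 4      # sub-aperture (light-field) views behind the lens
num_patterns = 3   # coding patterns applied within the single exposure
H, W = 8, 8        # toy spatial resolution
threshold = 0.15   # assumed event-camera contrast threshold (log domain)

# Toy light field: one image per sub-aperture view.
views = rng.uniform(0.2, 1.0, size=(num_views, H, W))

# One aperture-transmittance weight per view, per coding pattern.
patterns = rng.uniform(0.0, 1.0, size=(num_patterns, num_views))

# Sensor intensity under each coding pattern: weighted sum of the views.
coded = np.einsum('pv,vhw->phw', patterns, views)

# The image frame integrates all coding patterns over the exposure.
frame = coded.mean(axis=0)

# Events fire where the log intensity changes between consecutive
# patterns by more than the contrast threshold (+1 / -1 polarity).
log_diff = np.diff(np.log(coded + 1e-6), axis=0)
events = np.sign(log_diff) * (np.abs(log_diff) > threshold)

print(frame.shape, events.shape)  # (8, 8) and (2, 8, 8)
```

The parallax information lives in `events`: because each coding pattern weights the sub-aperture views differently, the inter-pattern intensity changes depend on scene depth, which is what the reconstruction algorithm exploits jointly with `frame`.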

Project Members

Shuji Habuchi (Graduate Student)

Keita Takahashi (Associate Professor)

Chihiro Tsutake (Assistant Professor)

Toshiaki Fujii (Professor)

Hajime Nagahara (Professor, Osaka University)


Shuji Habuchi, Keita Takahashi, Chihiro Tsutake, Toshiaki Fujii, Hajime Nagahara: "Time-Efficient Light-Field Acquisition Using Coded Aperture and Events", accepted to IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024. [ arxiv version ]

Supplementary Materials

Our software (Python + PyTorch) for the above paper is available. Please refer to the "ReadME.txt" file for the terms of use and usage instructions. [ Get our software ]