WildLight reconstructs object geometry and reflectance from unstructured flashlight and non-flashlight images captured under unknown environment lighting.
This paper proposes a practical photometric solution to the challenging problem of in-the-wild inverse rendering under unknown ambient lighting. Our system recovers scene geometry and reflectance using only multi-view images captured with a smartphone. The key idea is to exploit the smartphone's built-in flashlight as a minimally controlled light source and decompose image intensities into two photometric components: a static appearance corresponding to the ambient flux, plus a dynamic reflection induced by the moving flashlight. Our method does not require flash/non-flash images to be captured in pairs. Building on the success of neural light fields, we use an off-the-shelf method to capture the ambient reflections, while the flashlight component enables physically accurate photometric constraints that decouple reflectance from illumination. Compared to existing inverse rendering methods, our setup is applicable to non-darkroom environments, yet it sidesteps the inherent difficulties of explicitly solving for ambient reflections. Extensive experiments demonstrate that our method is easy to implement, casual to set up, and consistently outperforms existing in-the-wild inverse rendering techniques. Finally, our neural reconstruction can be exported as a PBR-textured triangle mesh ready for industrial renderers.
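To make the two-component decomposition concrete, below is a minimal sketch of the photometric model described above, assuming a point flashlight co-located with the camera and a simple Lambertian surface; the function and variable names (e.g. render_pixel, ambient_radiance) are illustrative only and are not the paper's released code or API.

```python
# Minimal sketch of the two-component photometric model: a pixel is composed of a
# static ambient term plus, when the flashlight is on, a physically based reflection
# from a point light co-located with the camera. Lambertian BRDF assumed for brevity.
import numpy as np

def render_pixel(ambient_radiance, albedo, normal, point, cam_pos,
                 flash_on, flash_intensity=1.0):
    """Compose a pixel as static ambient appearance + dynamic flashlight reflection."""
    color = np.asarray(ambient_radiance, dtype=float)   # static term (ambient appearance)
    if flash_on:
        to_cam = np.asarray(cam_pos, dtype=float) - np.asarray(point, dtype=float)
        dist2 = float(np.dot(to_cam, to_cam))            # inverse-square falloff
        wi = to_cam / np.sqrt(dist2)                     # light direction (co-located light)
        cos_i = max(float(np.dot(normal, wi)), 0.0)      # foreshortening term
        color = color + np.asarray(albedo, dtype=float) / np.pi * flash_intensity * cos_i / dist2
    return color
```

In the actual method, the static term is captured by a learned neural light field and the Lambertian factor is replaced by the recovered reflectance model; it is the flashlight term that supplies the physically grounded constraints used to decouple reflectance from illumination.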
Visual comparison of novel-view renderings and surface geometry against ground truth.
Visualization of reconstruction quality for three real indoor scenes.
@article{cheng2023wildlight,
  title   = {WildLight: In-the-wild Inverse Rendering with a Flashlight},
  author  = {Cheng, Ziang and Li, Junxuan and Li, Hongdong},
  journal = {arXiv preprint arXiv:2303.14190},
  year    = {2023}
}