Foreground-Aware Dense Depth Estimation for 360 Images

Qi Feng, Hubert P. H. Shum and Shigeo Morishima
Journal of WSCG - Proceedings of the 2020 International Conferences in Central Europe on Computer Graphics, Visualization and Computer Vision (WSCG), 2020



With 360 imaging devices becoming widely accessible, omnidirectional content has gained popularity in multiple fields. The ability to estimate depth from a single omnidirectional image can benefit applications such as robotics navigation and virtual reality. However, existing depth estimation approaches produce sub-optimal results on real-world omnidirectional images with dynamic foreground objects. On the one hand, capture-based methods cannot obtain the foreground due to the limitations of the scanning and stitching schemes. On the other hand, it is challenging for synthesis-based methods to generate highly realistic virtual foreground objects that are comparable to real-world ones. In this paper, we propose to augment datasets with realistic foreground objects using an image-based approach, which produces a foreground-aware photorealistic dataset for machine learning algorithms. By exploiting a novel scale-invariant RGB-D correspondence in the spherical domain, we repurpose abundant non-omnidirectional datasets to include realistic foreground objects with correct distortions. We further propose a novel auxiliary deep neural network to estimate both the depth of the omnidirectional images and the mask of the foreground objects, where the two tasks facilitate each other. A new local depth loss considers small regions of interest and ensures that their depth estimations are not smoothed out during the global gradient's optimization. We demonstrate the system using humans as the foreground due to their complexity and contextual importance, while the framework can be generalized to any other foreground objects. Experimental results demonstrate more consistent global estimations and more accurate local estimations compared with state-of-the-art methods.
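As a rough illustration of the local depth loss idea described above, the sketch below combines a global depth error with an extra term restricted to foreground pixels, so that small regions of interest are not averaged away by the global term. This is a hypothetical NumPy sketch for intuition only; the function name, the L1 error choice, and the weighting parameter `lam` are assumptions, not the paper's actual formulation.

```python
import numpy as np

def foreground_aware_depth_loss(pred, gt, fg_mask, lam=0.5):
    """Hypothetical sketch of a foreground-aware depth loss.

    pred, gt : (H, W) float arrays of predicted / ground-truth depth.
    fg_mask  : (H, W) binary array marking foreground pixels.
    lam      : assumed weight balancing the global and local terms.
    """
    # Global term: mean absolute depth error over the whole image.
    global_loss = np.abs(pred - gt).mean()

    # Local term: the same error restricted to foreground pixels,
    # so a small foreground region contributes with full weight
    # instead of being diluted by the (much larger) background.
    fg = fg_mask.astype(bool)
    local_loss = np.abs(pred[fg] - gt[fg]).mean() if fg.any() else 0.0

    return global_loss + lam * local_loss
```

Without the local term, a foreground object covering a few percent of the pixels barely moves the global average, which is exactly the smoothing effect the paper's local loss is designed to counter.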





BibTeX

@article{feng2020foreground,
 author={Feng, Qi and Shum, Hubert P. H. and Morishima, Shigeo},
 journal={Journal of WSCG},
 series={WSCG '20},
 title={Foreground-Aware Dense Depth Estimation for 360 Images},
 year={2020},
 month={5},
 volume={28},
 number={1--2},
 pages={79--88},
 doi={10.24132/JWSCG.2020.28.10},
 issn={1213-6972},
 location={Plzen, Czech Republic},
}


RIS

TY  - JOUR
AU  - Feng, Qi
AU  - Shum, Hubert P. H.
AU  - Morishima, Shigeo
T2  - Journal of WSCG
TI  - Foreground-Aware Dense Depth Estimation for 360 Images
PY  - 2020
Y1  - 5 2020
VL  - 28
IS  - 1-2
SP  - 79
EP  - 88
DO  - 10.24132/JWSCG.2020.28.10
SN  - 1213-6972
ER  - 

Plain Text

Qi Feng, Hubert P. H. Shum and Shigeo Morishima, "Foreground-Aware Dense Depth Estimation for 360 Images," Journal of WSCG, vol. 28, no. 1-2, pp. 79-88, Plzen, Czech Republic, May 2020.

Similar Research

Qi Feng, Hubert P. H. Shum and Shigeo Morishima, "360 Depth Estimation in the Wild - The Depth360 Dataset and the SegFuse Network", Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2022
Qi Feng, Hubert P. H. Shum and Shigeo Morishima, "Bi-Projection Based Foreground-Aware Omnidirectional Depth Prediction", Proceedings of the 2021 Visual Computing (VC), 2021
Qi Feng, Hubert P. H. Shum and Shigeo Morishima, "Enhancing Perception and Immersion in Pre-Captured Environments through Learning-Based Eye Height Adaptation", Proceedings of the 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2023
Li Li, Khalid N. Ismail, Hubert P. H. Shum and Toby P. Breckon, "DurLAR: A High-fidelity 128-Channel LiDAR Dataset with Panoramic Ambient and Reflectivity Imagery for Multi-Modal Autonomous Driving Applications", Proceedings of the 2021 International Conference on 3D Vision (3DV), 2021



Last updated on 4 June 2024