Omnidirectional Video Streaming Using Visual Attention-Driven Dynamic Tiling for VR

Abstract

This paper proposes a new adaptive omnidirectional video (ODV) streaming system that uses visual attention (VA) maps. The proposed method combines a novel VA-based bitrate allocation algorithm with dynamic tiling, providing enhanced virtual reality (VR) video experiences. The main contribution of this paper is the use of VA maps: (i) to distribute a given bitrate budget among a set of tiles of a given ODV, and (ii) to decide an optimal tiling structure (i.e., tile scheme) per chunk. For this, a novel objective metric is proposed: the visual attention spherical weighted (VASW) PSNR. This metric operates in the spherical domain and, by means of a VA probabilistic model, aims to capture the quality of the areas actually observed by users navigating the ODV content. We evaluate the performance of the proposed system under varying bandwidth conditions using head orientations tracked in separate user experiments. Results show that the proposed system significantly outperforms the existing tile-based streaming method.
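The following is a minimal, illustrative sketch of how a VA-weighted spherical PSNR of this kind could be computed for an equirectangular frame. It assumes the per-pixel weights combine the standard cos-latitude spherical area term with a normalized VA (saliency) map; the exact formulation used in the paper may differ, and the function name and signature are hypothetical.

```python
import numpy as np

def vasw_psnr(ref, dist, va_map, max_val=255.0):
    """Illustrative VA spherical-weighted PSNR for equirectangular luma frames.

    ref, dist : (H, W) reference and distorted frames.
    va_map    : (H, W) non-negative visual-attention (saliency) map.

    Assumption (not necessarily the paper's definition): the weight of each
    pixel is the cos(latitude) spherical-area term multiplied by the
    normalized VA map.
    """
    h, w = ref.shape
    # Spherical area weight per row: cosine of the pixel-centre latitude.
    lat = (np.arange(h) + 0.5) / h * np.pi - np.pi / 2.0
    sph_w = np.repeat(np.cos(lat)[:, None], w, axis=1)
    # Combine with the VA map and normalize the joint weights to sum to 1.
    va = np.maximum(va_map, 0.0) + 1e-12
    weights = sph_w * (va / va.sum())
    weights /= weights.sum()
    # Weighted mean squared error over the sphere, then PSNR.
    wmse = np.sum(weights * (ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / wmse)
```

With a uniform VA map this reduces to a plain spherically weighted (WS-PSNR-style) measure; a peaked VA map concentrates the error measurement on the regions users are likely to look at.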

Publication
In 2018 IEEE Visual Communications and Image Processing (VCIP)