User-Adaptive Rotational Snap-Cutting for Streamed 360° Videos
Abstract
Designing editing cuts for cinematic Virtual Reality (VR) has been under active investigation. Recently, a connection has been made between cuts in VR and adaptive streaming logic for 360° videos, with the introduction of rotational snap-cuts. Snap-cuts can benefit the user's experience both by improving the streamed quality in the field of view (FoV) and by ensuring the user sees elements important to the plot. However, snap-cuts should not be too frequent and may be avoided when they do not benefit the streamed quality. We formulate the dynamic decision problem of snap-cut triggering as a model-free Reinforcement Learning problem. We compute the optimal cut-triggering decisions offline with dynamic programming and investigate the possible gains in quality of experience compared to baselines. We design Imitation Learning-based dynamic triggering strategies and show that, knowing only the user's past motion and the video content, it is possible to outperform the baselines with no cuts and with all cuts.
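The following is a minimal illustrative sketch of how offline optimal cut-triggering decisions could be obtained by backward-induction dynamic programming, as mentioned in the abstract. All names and quantities here (the toy head-motion trace, `fov_quality`, `CUT_PENALTY`, `MIN_GAP`) are hypothetical assumptions for illustration and do not reflect the authors' actual formulation.

```python
# Hypothetical sketch: dynamic programming over cut-triggering decisions,
# assuming the user's motion trace is known offline. Not the paper's model.
from functools import lru_cache

T = 30            # number of decision steps (assumed)
CUT_PENALTY = 0.5 # discomfort cost charged per triggered cut (assumed)
MIN_GAP = 3       # minimum number of steps between two cuts (assumed)

def user_angle(t, cut_applied):
    """Assumed head-motion trace: angular distance of the FoV from the
    region of interest; a snap-cut re-centres the user (angle = 0)."""
    return 0 if cut_applied else (t * 7) % 90

def fov_quality(t, angle, cut):
    """Assumed per-step reward: higher when the FoV is close to the
    region of interest, minus a penalty if a cut is triggered."""
    error = 0 if cut else abs(angle)
    return 1.0 / (1.0 + error) - (CUT_PENALTY if cut else 0.0)

@lru_cache(maxsize=None)
def value(t, steps_since_cut):
    """Best cumulative reward achievable from step t onward, and the
    corresponding sequence of cut decisions (backward induction)."""
    if t == T:
        return 0.0, []
    angle = user_angle(t, cut_applied=False)
    # Option 1: do not trigger a cut at step t
    v_no, plan_no = value(t + 1, steps_since_cut + 1)
    best = (v_no + fov_quality(t, angle, cut=False), [False] + plan_no)
    # Option 2: trigger a cut, if the minimum gap constraint allows it
    if steps_since_cut >= MIN_GAP:
        v_cut, plan_cut = value(t + 1, 0)
        v_cut += fov_quality(t, angle, cut=True)
        if v_cut > best[0]:
            best = (v_cut, [True] + plan_cut)
    return best

if __name__ == "__main__":
    total, decisions = value(0, MIN_GAP)
    cut_steps = [t for t, c in enumerate(decisions) if c]
    print(f"optimal cumulative reward {total:.2f}, cuts at steps {cut_steps}")
```

Such an offline optimum requires the full future motion trace and therefore only serves as an upper bound; an online (e.g. Imitation Learning-based) policy would have to decide from past motion and content only.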