Streaming 360-degree videos has become increasingly popular due to the growing demand for immersive media in recent years. Companies such as YouTube, Facebook, and Netflix already offer 360-degree video streaming. To reduce the amount of data transmitted, only the part of the video the user is looking at is streamed at high resolution, which requires accurate viewport prediction. However, recent approaches to streaming 360-degree video either do not characterize user profiles or suffer low viewport prediction accuracy when no historical data is available for the user or when the user starts watching a new video. This paper proposes User Profile-Based Viewport Prediction Using Federated Learning (UVPFL), a novel approach for 360-degree real-time video streaming. UVPFL profiles users based on their head movements across different categories of videos. To achieve high viewport prediction accuracy for a new user or a user with no historical data, UVPFL bases its prediction on the viewports of similar users. In tests on 360-degree real-time video streaming, UVPFL achieved an accuracy of up to 86% for the first seven seconds of playback and an average accuracy of up to 96% over the full length of playback. UVPFL outperformed three state-of-the-art viewport prediction solutions by 1.12% to 64.9% for a 1-second prediction horizon.
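The cold-start idea described above, predicting a new user's viewport from the viewports of similar users, can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the function names, the cosine-similarity measure over head-movement trajectories, and the top-k averaging are all illustrative assumptions.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length head-movement trajectories."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def predict_viewport(new_traj, profiles, k=2):
    """Predict the next viewport center (e.g., yaw angle in degrees) for a
    user with little or no history by averaging the next viewports of the
    k profiled users whose trajectories are most similar (illustrative)."""
    ranked = sorted(
        profiles,
        key=lambda p: cosine_similarity(new_traj, p["trajectory"]),
        reverse=True,
    )
    top = ranked[:k]
    return sum(p["next_viewport"] for p in top) / len(top)


# Hypothetical profiles: two users moving similarly to the new user,
# one moving in the opposite direction.
profiles = [
    {"trajectory": [10, 20, 30], "next_viewport": 40.0},
    {"trajectory": [12, 22, 32], "next_viewport": 44.0},
    {"trajectory": [-10, -20, -30], "next_viewport": -40.0},
]
prediction = predict_viewport([11, 21, 31], profiles, k=2)
```

In this toy example the two aligned profiles dominate, so the prediction is their average; a real system would operate on full 3-DoF head-orientation traces, per video category, with model updates aggregated via federated learning.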