Streaming and Presentation Architectures for Extended Video Streams
Abstract
Mobile devices capable of capturing multimedia streams (audio and video) can now also produce numerous associated data (GPS, gyroscope, etc.). Exploiting this extra data in real time, across different application scenarios, raises the issue of integrating it into existing broadcasting architectures. These architectures are designed to handle audiovisual data; they should be extended to meet the requirements of the extra data (processing time, bandwidth, visualization techniques, etc.) and be able to adapt to their environment (network, device) accordingly. Most applications that exploit such data are developed in an ad hoc way, without taking all of its characteristics into account, and require complex maintenance and update efforts whenever new data types or new network/device support are introduced. Our study focuses on optimizing existing distribution and presentation architectures for audiovisual (AV) content in order to find an efficient way to represent the associated data.