HbbTV: Multi-View and Multi-Device TV
(*Note: YouTube is the owner of the multimedia content used in this work. This is a four-camera-angle video of a concert by singer Madilyn Bailey, recorded at a YouTube music night in Los Angeles in 2015.)
This video shows a demo of an HbbTV-compatible testbed that augments broadcast TV content with complementary content delivered via broadband. In this case, the director-controlled scenes of a show are delivered via DVB-T and consumed on a connected TV, while additional views of the same show are delivered via broadband (e.g., MPEG-DASH, HLS, RTP) and can be consumed either on the same main screen (PiP, mosaic...) or on companion devices (e.g., tablets, smartphones...). The media playout across the involved devices is accurately synchronized. This use case allows personalizing and enriching the TV watching experience.
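Inter-device synchronization of this kind is typically achieved by anchoring every playout to a shared wall clock (in HbbTV 2.x this role is played by DVB-CSS-style clock distribution). The following is a minimal, hypothetical sketch of the idea, not the testbed's actual implementation: given a (wall-clock time, media time) anchor from the master device, a companion can compute where its own playout should be and how far it has drifted.

```python
import time

def expected_position(anchor_wallclock: float, anchor_media_time: float,
                      rate: float = 1.0, now: float = None) -> float:
    """Map a shared wall-clock instant to a media-timeline position.

    anchor_wallclock:  wall-clock time (s) at which the master playout
                       was at anchor_media_time.
    rate:              playback speed (1.0 = normal).
    """
    if now is None:
        now = time.time()
    return anchor_media_time + (now - anchor_wallclock) * rate

def sync_offset(local_position: float, anchor_wallclock: float,
                anchor_media_time: float, now: float = None) -> float:
    """Seconds the local player is ahead (+) or behind (-) the master."""
    return local_position - expected_position(
        anchor_wallclock, anchor_media_time, 1.0, now)
```

A companion device would periodically evaluate `sync_offset` and correct drift by seeking or by slightly adjusting its playback rate.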
HbbTV-compatible Multi-View & Multi-Device TV Scenario (see the note above):
SmartTV + Multi-touch
This video shows a demo of a prototype that enables interaction between a Smart TV and a multi-touch screen. It allows the exchange of media files, as well as screen extension and duplication.
Interactive Media Wall Projection
This video shows a demo of an interactive control system that allows creating windows of different sizes on free areas of a projection space (e.g., the wall of a living room), each of which can run a different application (e.g., audio/video player, DVB player, web browser, photo/document viewer, notebook...). The system can be controlled via a mouse, a Wii Remote and voice commands. It allows selecting the audio output card on which the sound of each window is played out, so different users can consume different contents through their own headphones, avoiding audio mixing issues.
Shared Media Consumption, Interaction and Collaboration
This video shows a demo of a web-based platform for shared media consumption, social interaction and remote collaboration between geographically distributed users.
By using this platform, users can create or join on-going sessions to concurrently consume the same media content with other remote users in a synchronized manner. In addition, social interaction is provided by sharing the navigation control commands of the media player and by integrating text chat channels and a multi-party conferencing system (using WebRTC). The platform also provides collaboration tools, such as the exchange of files and a shared board (with transparent background) over the video window. Additionally, two social presence mechanisms have been added to stimulate the participation of external users in on-going sessions.
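Sharing the player's control commands essentially means broadcasting each action (play, pause, seek) to every session member, who applies it to their local player. The sketch below illustrates the idea with a hypothetical JSON message format; it is not the platform's actual protocol.

```python
import json
import time

def make_control_message(action: str, media_time: float, sender: str) -> str:
    """Serialize a shared player control command (hypothetical format)."""
    assert action in ("play", "pause", "seek")
    return json.dumps({"action": action, "mediaTime": media_time,
                       "sender": sender, "sentAt": time.time()})

class SessionPlayerState:
    """Minimal local player state, kept consistent by applying
    the control commands received from the session."""
    def __init__(self):
        self.playing = False
        self.position = 0.0

    def apply(self, message: str) -> None:
        cmd = json.loads(message)
        self.position = cmd["mediaTime"]
        if cmd["action"] == "play":
            self.playing = True
        elif cmd["action"] == "pause":
            self.playing = False
        # "seek" only repositions; it keeps the playing/paused state
```

In a real deployment these messages would travel over the session's signaling channel (e.g., WebSockets), and late joiners would receive the current state on entry.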
End-to-End DASH Platform
We have developed an end-to-end Dynamic Adaptive Streaming over HTTP (DASH) platform (using open-source components) that includes and allows the configuration of all the required steps along the end-to-end media delivery chain, from the encoding, segmentation and storage of the media content at the server side, to the delivery and adaptive consumption of the media content at the client side.
This demo video shows the encoding and segmentation of the DASH content, as well as the creation of the Media Presentation Description (MPD), at the server side.
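The MPD is an XML manifest that lists the available representations of the content so the client can choose among them. For illustration only (this is not the platform's tooling), a minimal static MPD skeleton can be produced with standard XML libraries; the element and attribute names follow the DASH schema, while the helper function and its parameters are hypothetical:

```python
import xml.etree.ElementTree as ET

def build_minimal_mpd(duration: str, representations) -> str:
    """Build a minimal static MPEG-DASH MPD (illustrative skeleton only).

    duration:        ISO 8601 duration, e.g. "PT30S".
    representations: iterable of (id, bandwidth_bps, width, height).
    """
    mpd = ET.Element("MPD", {
        "xmlns": "urn:mpeg:dash:schema:mpd:2011",
        "type": "static",
        "mediaPresentationDuration": duration,
        "profiles": "urn:mpeg:dash:profile:isoff-on-demand:2011",
    })
    period = ET.SubElement(mpd, "Period")
    aset = ET.SubElement(period, "AdaptationSet", mimeType="video/mp4")
    for rep_id, bw, w, h in representations:
        ET.SubElement(aset, "Representation", id=rep_id,
                      bandwidth=str(bw), width=str(w), height=str(h))
    return ET.tostring(mpd, encoding="unicode")
```

A production manifest would additionally carry segment timing/addressing information (e.g., SegmentTemplate) and codec parameters per representation.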
A key component of our platform is the DASH client, developed using the GStreamer framework. It includes a module with a novel adaptive algorithm for switching between the available representations (i.e., qualities) of the media content at the server side, based on the available bandwidth and on internal conditions and features of the client (such as buffer occupancy, battery level, charging state and CPU load). It also includes modules to simulate specific values for these parameters, and to visualize them, together with the selected quality, in real time.
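To make the decision logic concrete, here is a simplified, hypothetical stand-in for such an algorithm (the actual module's heuristics and thresholds are not reproduced here): cap the usable bandwidth with a safety margin, become more conservative when the buffer is low, and cap the quality when running on a depleted battery or under high CPU load.

```python
def select_representation(bitrates_bps, bandwidth_bps, buffer_level_s,
                          battery_pct, on_charger, cpu_load_pct,
                          safety=0.8, min_buffer_s=5.0):
    """Pick the highest representation the client can sustain.

    All thresholds are illustrative assumptions, not the platform's values.
    """
    bitrates = sorted(bitrates_bps)
    usable = bandwidth_bps * safety
    if buffer_level_s < min_buffer_s:
        usable *= 0.5              # halve the target while rebuffering risk is high
    candidates = [b for b in bitrates if b <= usable] or bitrates[:1]
    choice = candidates[-1]
    # On a low battery (unplugged) or an overloaded CPU, cap at a mid quality
    if (not on_charger and battery_pct < 20) or cpu_load_pct > 90:
        choice = min(choice, bitrates[len(bitrates) // 2])
    return choice
```

The same structure extends naturally to smoother strategies (hysteresis between switches, exponentially weighted bandwidth estimates, etc.).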
This demo video shows the functionalities and capabilities of the client side of our end-to-end DASH platform.
Subtitles Customization and Synchronization
These videos show demos of our platform for the customization and synchronization of subtitles in multi-screen scenarios.
VR Multi-Player Games
This video playlist shows some examples of the VR multi-player games we have developed.
This video shows an example of a Motion Capture (MOCAP) session, in which the movements and gestures of a user are captured and mapped, in real-time, to an avatar in a virtual reality environment (e.g., a video-game).
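The core of real-time retargeting is mapping each captured joint from the performer's skeleton into the avatar's skeleton. As a toy sketch only (assuming both skeletons share the same orientation; the function, its parameters and the uniform scale are illustrative, not our MoCap system's actual pipeline):

```python
def retarget_joint(capture_pos, capture_root, avatar_root, scale):
    """Map a captured joint position into the avatar's space.

    capture_pos:  (x, y, z) of the joint in the capture space.
    capture_root: (x, y, z) of the performer's root (e.g., pelvis).
    avatar_root:  (x, y, z) of the avatar's root in the virtual scene.
    scale:        uniform ratio between avatar and performer proportions.
    """
    return tuple(a + (c - r) * scale
                 for c, r, a in zip(capture_pos, capture_root, avatar_root))
```

Production retargeting works on joint rotations rather than raw positions (to preserve bone lengths), but the root-relative mapping above conveys the basic idea.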
This is just an example of a dataset that can be created using our MoCap System.
This video shows a demo of how an interactive application can be controlled via gestures by using a Microsoft Kinect device.
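Gesture control of this kind is usually built on top of the skeleton joint positions the Kinect reports per frame. As a minimal hypothetical sketch (the threshold, the coordinate normalization and the function itself are illustrative assumptions, not the demo's actual recognizer), a horizontal swipe can be detected from the hand's recent x-coordinates:

```python
def detect_swipe(hand_x_positions, threshold=0.4):
    """Classify a short horizontal hand trajectory as a swipe.

    hand_x_positions: recent normalized x-coordinates of the hand joint,
                      oldest first.
    Returns "swipe_right", "swipe_left", or None.
    """
    if len(hand_x_positions) < 2:
        return None
    dx = hand_x_positions[-1] - hand_x_positions[0]
    if dx > threshold:
        return "swipe_right"
    if dx < -threshold:
        return "swipe_left"
    return None
```

The application would map such events to UI actions (e.g., next/previous item); robust recognizers additionally check gesture duration and vertical stability to reject accidental movements.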