Title : Real-time transmission of multimedia contents for their visualization in an immersive environment system



Project Lead : Marco Pappalardo
From : Software Engineering Italia Srl (Italy)

Dates : from 2014-04-11 to 2014-11-28

Description :

Motivation and objectives :
The project will study, implement and test a solution for a highly immersive real-time video conferencing system using the advanced visualization infrastructures provided by i2CAT, in collaboration with the MOVING group of the Universitat Politècnica de Catalunya (UPC) and Software Engineering Italia Srl (hereafter Swing:It). Swing:It leads the VESPA project (see www.progettovespa.it), which aims to create a Virtual Environment for Neuro-Psychiatry through a Virtual Room based on a CAVE system and is investigating the best solutions available for visualization and videoconferencing in a CAVE.

The TNA project has been structured to find a viable solution for the users, allowing them to capture, transmit, synchronize and encode HD video using a CAVE system as the visualization device. The initial use case proposed for this TNA is a basic scenario, mainly based on the demonstration of:
- visualization of multiple real-time, high-quality videos in CAVEs;
- software for low-latency videoconferencing;
- evaluation of network requirements and performance;
- interactivity between users (nodes).

The final objective of this project is to deliver a point-to-point (1:1), high-quality, low-latency video conferencing system, with an innovative integration facilitating simultaneous actions in a single CAVE (e.g. a doctor exploring a 3D model of a human body while discussing what is being seen with another doctor in a distant hospital, or a patient running a cognitive task in fully immersive virtual reality while a doctor or technician provides online support). Later scenarios will aim, among other use cases, at advanced videoconferencing (VC) systems interconnecting multiple CAVEs spread across Europe.

With this objective, this TNA proposes to merge technologies from the VESPA project, VISIONAIR and UPC's MOVING group to obtain new functionality: multi-streaming of real-time video through a highly immersive visualization system, with low-latency HD video and optimized bandwidth usage. To reach this goal, the TNA will test the network, the capacity to send and receive compressed video (ensuring interactivity), the ability to send multiple video channels, and the visualization of the resulting decompressed images in UPC's CAVE system. The final result should allow combining the local video stream visualization with an external stream used for videoconferencing in real time.

The project envisages three main tasks, which will create a common framework for the work foreseen: global architecture design; implementation and integration of the different developments; and testing and validation of the overall system. All these tasks, described below in the work plan, combine individual and joint actions among the three partners. The results obtained will be a significant step towards a final solution and will offer an innovative perspective on the use of the chosen technologies, laying new groundwork for immersive visualization technologies. For all these reasons, this collaboration (made possible by the VISIONAIR project) will face challenges that, considering present-day network capabilities and anticipating the features that future networks will provide, are relevant and of potential impact for several industrial and societal actors.
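
To make the target integration more tangible, the short sketch below illustrates (outside the MOVING framework, using plain OpenGL and GLUT) the basic operation the CAVE side has to perform: uploading two independently received RGB video frames as textures and presenting a local stream next to a remote conferencing stream. It is an illustrative sketch only; the frame sizes, colours and all function names in it (placeholderFrame, makeStreamTexture, and so on) are assumptions introduced here and are not part of the project's deliverables.

#include <GL/glut.h>
#include <cstdint>
#include <vector>

static GLuint texLocal = 0, texRemote = 0;
static const int W = 640, H = 360;   // per-stream frame size used for this sketch

// Stand-in for a decoded RGB frame: a flat colour buffer.
static std::vector<uint8_t> placeholderFrame(uint8_t r, uint8_t g, uint8_t b) {
    std::vector<uint8_t> buf(W * H * 3);
    for (int i = 0; i < W * H; ++i) {
        buf[3 * i] = r; buf[3 * i + 1] = g; buf[3 * i + 2] = b;
    }
    return buf;
}

// Allocate texture storage once; per-frame updates then use glTexSubImage2D.
static GLuint makeStreamTexture() {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, W, H, 0, GL_RGB, GL_UNSIGNED_BYTE, nullptr);
    return tex;
}

// In the real system these frames would come from the videoconferencing pipeline
// (capture -> encode -> transmit -> receive -> decode); here they are static
// placeholders so the sketch stays self-contained.
static void updateStreams() {
    std::vector<uint8_t> local  = placeholderFrame(40, 120, 220);   // "local" feed
    std::vector<uint8_t> remote = placeholderFrame(220, 120, 40);   // "remote" conference feed
    glBindTexture(GL_TEXTURE_2D, texLocal);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H, GL_RGB, GL_UNSIGNED_BYTE, local.data());
    glBindTexture(GL_TEXTURE_2D, texRemote);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H, GL_RGB, GL_UNSIGNED_BYTE, remote.data());
}

static void drawQuad(float x0, float x1) {
    glBegin(GL_QUADS);
    glTexCoord2f(0, 1); glVertex2f(x0, -0.8f);
    glTexCoord2f(1, 1); glVertex2f(x1, -0.8f);
    glTexCoord2f(1, 0); glVertex2f(x1,  0.8f);
    glTexCoord2f(0, 0); glVertex2f(x0,  0.8f);
    glEnd();
}

static void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_TEXTURE_2D);
    updateStreams();
    glBindTexture(GL_TEXTURE_2D, texLocal);
    drawQuad(-0.95f, -0.05f);   // left panel: local stream
    glBindTexture(GL_TEXTURE_2D, texRemote);
    drawQuad(0.05f, 0.95f);     // right panel: remote videoconferencing stream
    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(1280, 480);
    glutCreateWindow("Two-stream compositing sketch");
    texLocal = makeStreamTexture();
    texRemote = makeStreamTexture();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

In the actual system, the placeholder buffers would be replaced by decoded frames arriving from the videoconferencing pipeline, and the two flat quads by surfaces of the virtual room rendered in the CAVE.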

Teams :
The consortium establishing the VESPA project involves the University of Catania (IT), IRCCS Oasi (clinical partner) and three IT SMEs (Software Engineering Italia, Korec and Xenia Progetti).

Dates :
starting date : 02 July, 2014
ending date : 17 July, 2014

Facilities descriptions :
http://visionair-browser.g-scop.grenoble-inp.fr/visionair/Browser/Catalogs/DML.SP.html

Recordings & Results :
Summary (Publishable): The project studied, implemented and tested a solution for a highly immersive real-time video conferencing system using the advanced visualization infrastructures provided by i2CAT, in collaboration with the MOVING group of the Universitat Politècnica de Catalunya and Software Engineering Italia Srl (hereafter Swing:It), which leads the VESPA project (see www.progettovespa.it) aimed at creating a Virtual Environment for Neuro-Psychiatry through a Virtual Room based on a CAVE system. The TNA project was structured to find a viable solution for the users, allowing them to capture, transmit, synchronize and encode HD video using a CAVE system as the visualization device.

Conclusions :
Three main research and development steps took place:
- Analysis of possible implementations for the integration of UltraGrid with the MOVING framework, taking into account that UltraGrid can carry out the whole videoconferencing pipeline: capture, encoding, transmission, reception, decoding and display of frames. The last step, display, had to be integrated with the MOVING framework so that received and decoded frames (treated as RGB textures) could be shown inside the VR environment (CAVE).
- Study and testing of different technologies for sharing data (frames) between processes, taking into account synchronization (reading and writing the same data buffer from both processes), concurrency (ensuring data consistency) and performance (keeping up with the incoming frame rate).
- Selection of the technology to implement the UltraGrid display interface for the MOVING framework integration. The technologies analyzed for creating an IPC (Inter-Process Communication) channel were:
  o Named pipes: discarded because they did not provide sufficiently robust inter-process communication or development flexibility.
  o Shared memory: selected because it allows a communication protocol that ensures data consistency and keeps up with the incoming frame rate.

To ensure maximum compatibility, a simple class was created that can be used by, and ported to, any other framework or application, with the aim of loading and displaying the frames received from UltraGrid in any external process. It uses the POSIX (XSI extension) standardized shared-memory API (with UNIX and Windows support).

Deployment and testing inside the current CAVE infrastructure was not possible, because compatible deployment environments were not available during this step of the TNA and there was not enough time to solve this issue, as it would require re-deploying and testing the whole CAVE infrastructure. This final test was therefore proposed to be carried out outside the TNA scope, in order to demonstrate the solution in a real use case.

Finally, testing was carried out successfully by simulating the environment through the main test implemented in the shared-memory class. This test prints the first bytes of the piece of shared memory that was read and shows the incoming data/frame rate. A specific end-to-end test, capturing from a webcam (UltraGrid V4L2 capturer), encoding frames, transmitting, receiving, decoding and displaying them through the implemented display (a virtual display that implements the shared-memory class for sharing frames between processes), was run and passed successfully.
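
As an illustration of this approach, the listing below sketches what such a shared-memory frame exchange between the UltraGrid virtual display and an external renderer could look like. This is a minimal reconstruction under stated assumptions, not the actual class produced in the TNA: the name SharedFrameBuffer, the header layout and the seqlock-style sequence counter are hypothetical, and the XSI (System V) calls shmget/shmat are used here in line with the POSIX XSI reference above (a Windows build would need an equivalent file-mapping API, which is not shown).

#include <sys/ipc.h>
#include <sys/shm.h>
#include <atomic>
#include <cstdint>
#include <cstring>
#include <stdexcept>

// Layout of the shared segment: a small header followed by one raw RGB frame.
// Assumes std::atomic<uint64_t> is lock-free (true on common desktop platforms).
struct FrameHeader {
    std::atomic<uint64_t> seq;   // odd while a frame is being written, even when complete
    uint32_t width;
    uint32_t height;
    uint32_t bytes_per_pixel;
};

class SharedFrameBuffer {
public:
    // key: System V IPC key agreed upon by both processes (e.g. obtained via ftok()).
    // create: true in the writer (UltraGrid "virtual display"), false in the reader.
    SharedFrameBuffer(key_t key, uint32_t w, uint32_t h, uint32_t bpp, bool create)
        : frame_bytes_(static_cast<size_t>(w) * h * bpp) {
        const size_t total = sizeof(FrameHeader) + frame_bytes_;
        shmid_ = shmget(key, total, create ? (IPC_CREAT | 0666) : 0666);
        if (shmid_ == -1) throw std::runtime_error("shmget failed");
        void* addr = shmat(shmid_, nullptr, 0);
        if (addr == reinterpret_cast<void*>(-1)) throw std::runtime_error("shmat failed");
        header_ = static_cast<FrameHeader*>(addr);
        pixels_ = reinterpret_cast<uint8_t*>(header_ + 1);
        if (create) {                 // a segment created with IPC_CREAT is zero-filled
            header_->width = w;
            header_->height = h;
            header_->bytes_per_pixel = bpp;
        }
    }

    ~SharedFrameBuffer() { shmdt(header_); }

    // Writer side: publish one decoded frame received from UltraGrid.
    void write_frame(const uint8_t* rgb) {
        header_->seq.fetch_add(1);                 // becomes odd: write in progress
        std::memcpy(pixels_, rgb, frame_bytes_);
        header_->seq.fetch_add(1);                 // becomes even: frame complete
    }

    // Reader side: copy the latest complete frame if a new one is available.
    // Returns false when there is nothing new or the copy raced with the writer.
    bool read_frame(uint8_t* out, uint64_t& last_seq) {
        const uint64_t before = header_->seq.load();
        if (before == last_seq || (before & 1)) return false;
        std::memcpy(out, pixels_, frame_bytes_);
        if (header_->seq.load() != before) return false;  // overwritten mid-copy; try again
        last_seq = before;
        return true;
    }

    size_t frame_bytes() const { return frame_bytes_; }

private:
    int shmid_ = -1;
    FrameHeader* header_ = nullptr;
    uint8_t* pixels_ = nullptr;
    size_t frame_bytes_;
};

A reader-side check along the lines of the test described above (printing the first bytes of each frame read from shared memory and the observed frame rate), compiled together with the class, could then be:

#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const key_t kKey = 0x56455350;                 // hypothetical key agreed with the writer
    // The writer process must have created the segment before this runs.
    SharedFrameBuffer shm(kKey, 1280, 720, 3, /*create=*/false);
    std::vector<uint8_t> frame(shm.frame_bytes());
    uint64_t last_seq = 0, frames = 0;
    const auto start = std::chrono::steady_clock::now();
    while (true) {
        if (shm.read_frame(frame.data(), last_seq)) {
            ++frames;
            const double secs = std::chrono::duration<double>(
                std::chrono::steady_clock::now() - start).count();
            std::printf("frame %llu: %02x %02x %02x ... (%.1f fps)\n",
                        static_cast<unsigned long long>(frames),
                        static_cast<unsigned>(frame[0]), static_cast<unsigned>(frame[1]),
                        static_cast<unsigned>(frame[2]),
                        secs > 0.0 ? frames / secs : 0.0);
        } else {
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
    }
}

In this sketch the sequence counter doubles as the consistency protocol: the writer makes it odd while a frame is being copied in and even once the frame is complete, so the reader can detect and discard torn reads without ever blocking the capture/decode path.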




Project Images :

Fig7.jpg








VISIONAIR / Grenoble INP / 46 avenue Felix Viallet / F-38 031 Grenoble cedex 1 / FRANCE
Project funded by the European Commission under grant agreement 262044