HEAT

Hybrid Extended reAliTy

Start date: 01-06-2024
End date: 31-05-2027

Budget: € 6 993 207.50

Areas: Media Technologies

Description:

HEAT (Hybrid Extended reAliTy) paves the way for next-generation distributed experiences by addressing the major challenges behind experiences that, until now, could only be imagined: being realistically immersed (holo-ported) within captured, omnidirectional, navigable, hyper-realistic 3D spaces, feeling their atmosphere, and sharing these experiences with others, regardless of location.

HEAT aims to integrate immersive media technologies, such as point cloud/holographic imaging, multi-sensorial media, and Social Virtual/eXtended Reality (VR/XR), into a multi-user, feedback-enabled communication system that enables the construction of compelling, context-aware and embodied experiences for innovative hybrid XR applications. In these applications, remote users can experience a real captured environment through immersive VR, while in-presence users can visualise and interact with the holograms of remote users integrated into the real environment through holographic rendering.

The system will facilitate the exploitation of agile (multi-sensory) 3D data acquisition techniques, enhancing performance while reducing technology costs. It will create a scalable communication pipeline that encapsulates different media, from classical audio-video to multi-sensorial to holographic video, or combinations thereof, providing means for efficient and scalable encoding, processing, storage, (real-time) streaming and rendering. The project will deliver designed and fully tested scenarios in real-world environments for enhanced XR experiences: blended learning, a modern theatre, a music festival and an opera show. All pilot actions will ensure that GDPR and ethics requirements are addressed for end users (privacy and ethics by design methodology).
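Purely as an illustration, and not part of HEAT's actual specification, the kind of multi-media encapsulation described above could be modelled along these lines. All class and field names below are assumptions made for the sketch:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class MediaKind(Enum):
    """Media types the pipeline is described as encapsulating."""
    AUDIO = auto()
    VIDEO = auto()
    POINT_CLOUD = auto()   # holographic / volumetric video
    HAPTIC = auto()        # multi-sensorial: touch
    OLFACTORY = auto()     # multi-sensorial: smell


@dataclass
class MediaUnit:
    """One timestamped unit of encoded media, ready for streaming."""
    kind: MediaKind
    pts_ms: int            # presentation timestamp in milliseconds
    payload: bytes         # encoded data (codec-specific)


@dataclass
class HybridStream:
    """A container multiplexing heterogeneous media into one session."""
    units: List[MediaUnit] = field(default_factory=list)

    def mux(self, unit: MediaUnit) -> None:
        self.units.append(unit)

    def demux(self, kind: MediaKind) -> List[MediaUnit]:
        """Return all units of one media type, in presentation order."""
        return sorted(
            (u for u in self.units if u.kind == kind),
            key=lambda u: u.pts_ms,
        )
```

Shared presentation timestamps across media types are what would let a receiver render video, holograms and multi-sensory cues coherently, whatever combination a session carries.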

Within the project, the i2CAT Foundation serves as Technical Coordinator and leads Work Package WP3 – Technological Components and Architectural Integration, whose main objective is the design and development of the innovative technological components required for the realisation of the project, and their integration into the proposed end-to-end architecture, from media capture/retrieval to media presentation, including a wide range of multi-modal content and interaction methods. Within WP3, i2CAT will also lead two tasks: T3.2, which will focus on service and media orchestration as well as on adaptive low-latency delivery, and T3.5, which will focus on iterative testing and integration activities. The key WP3 technological outcome by i2CAT will be an evolved holographic communications platform, enabling a truly natural and rich interaction between remote users connected via hybrid XR modes (i.e., in-presence users interacting with the real environment via AR/XR mode, and remote users connecting via VR mode, immersed in virtual replicas of the real environment).
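As a hedged illustration of what adaptive low-latency delivery can involve, the sketch below picks the richest volumetric representation that fits the measured network throughput, keeping headroom so send queues stay short. The quality ladder, function names and numbers are assumptions, not the project's actual design:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class QualityLevel:
    """One encoded representation of a volumetric (point cloud) stream."""
    name: str
    points_per_frame: int
    bitrate_kbps: int


def pick_level(levels: List[QualityLevel],
               throughput_kbps: float,
               headroom: float = 0.8) -> Optional[QualityLevel]:
    """Pick the richest representation that fits the measured throughput.

    `headroom` keeps a safety margin so queues stay short and the
    end-to-end latency stays low.
    """
    affordable = [lv for lv in levels
                  if lv.bitrate_kbps <= throughput_kbps * headroom]
    if not affordable:
        return None  # a caller could then fall back to audio-only
    return max(affordable, key=lambda lv: lv.bitrate_kbps)


# Hypothetical quality ladder and a 40 Mbps throughput estimate:
ladder = [
    QualityLevel("low", 100_000, 8_000),
    QualityLevel("mid", 400_000, 25_000),
    QualityLevel("high", 1_000_000, 60_000),
]
print(pick_level(ladder, throughput_kbps=40_000).name)  # prints "mid"
```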

Within the project, i2CAT will also take part in other Work Packages. Specifically, the Catalan research centre will participate in Work Package WP2 – Requirements, Scenarios, Platform Architecture and QoE Methodology by leading Task 2.2 – Platform Design and Architecture. Within this task, researchers will specify the overall platform architecture, with the modularity required to adapt it to the envisioned scenarios. The task will also encompass a thorough analysis to maximise interoperability with existing open standards, current network infrastructure, and off-the-shelf hardware and software solutions applicable and/or relevant to the extensible platform to be developed. The architecture specification will also include a comprehensive description of the hardware and software components, associated modules, and required interfaces.
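To make the modularity goal concrete, here is a minimal, hypothetical sketch of how pipeline components could share a common interface so that each scenario assembles its own chain; none of these class names come from the project:

```python
from abc import ABC, abstractmethod
from typing import List


class PipelineStage(ABC):
    """Common interface so capture, processing, delivery and rendering
    modules can be swapped per scenario without changing the pipeline."""

    @abstractmethod
    def process(self, data: bytes) -> bytes:
        ...


class Capture(PipelineStage):
    def process(self, data: bytes) -> bytes:
        return data  # e.g., read a frame from a volumetric camera


class Encode(PipelineStage):
    def process(self, data: bytes) -> bytes:
        return data  # e.g., compress a point-cloud frame


def run_pipeline(stages: List[PipelineStage], data: bytes) -> bytes:
    """Chain stages in order; each scenario assembles its own list."""
    for stage in stages:
        data = stage.process(data)
    return data


result = run_pipeline([Capture(), Encode()], b"raw-frame")
```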

i2CAT will also participate in Work Package WP4 – Pilots, Content Production and Evaluation, playing a key role in the pilot planning and execution activities, which will adopt i2CAT's technological contributions.

HEAT will consider four distinct pilot cases as examples of deploying the developed technologies in real scenarios. The pilots will be used to validate the proposed system’s applicability and effectiveness through testing and assessment.

  1. XREL Pilot: A lecture, that is, a classroom/laboratory environment including a teacher and laboratory equipment, will be captured using volumetric cameras, microphones, and multi-sensory acquisition devices. The captured environment (i.e., the live lecture) will be dynamically processed and delivered to remote students via XR headsets and multi-sensory actuators (e.g., olfaction dispensers, haptic gloves/vests). The remote students will be captured at their site using available devices (e.g., 360°/plenoptic/RGB-D cameras or stereo microphones from smartphones or dedicated cameras), and their holograms will be teleported into the classroom, which will be equipped with holographic displays so that teachers and remote students can socially interact and share visual social cues.
  2. XR Theatre Pilot: In a modern theatre performance, the environment (i.e., scenography, stage and performers) will be captured with volumetric cameras, microphones and multi-sensory acquisition devices. Remote audiences will experience the environment together with the in-presence audience through XR headsets and multi-sensory actuators (e.g., olfaction dispensers, haptic gloves/vests), for example being seated in a specific position upon request. The remote audience will be captured at their site using available devices (e.g., 360°/plenoptic cameras or stereo microphones from smartphones or dedicated cameras), and their holograms will be teleported into the theatre via 3D holographic displays and/or XR headsets.
  3. XR Blues Pilot: The performance of a blues/rock band in a club will be captured using volumetric cameras and multi-sensory acquisition devices. Sound will be dynamically captured from the stage mixer and synchronised with the visual part (an illustrative synchronisation sketch follows this list). The remote audience will experience the environment with the in-presence audience, using XR headsets and multi-sensory actuators (e.g., olfaction dispensers, haptic gloves/vests). The remote audience will be captured at their site using available devices (e.g., 360°/plenoptic cameras or stereo microphones from smartphones or dedicated cameras), and their holograms will be teleported into the club via 3D holographic displays.
  4. XR Opera Pilot: In an opera performance, the scenography and performers will be captured with volumetric cameras, microphones, and multi-sensory acquisition devices. Remote audiences will experience the environment together with the in-presence audience through XR headsets and multi-sensory actuators (e.g., olfaction dispensers, haptic gloves/vests), for example being seated in a specific position upon request. The remote audience will be captured at their site using available devices (e.g., 360°/plenoptic cameras or stereo microphones from smartphones, or dedicated RGB-D cameras), and their holograms will be teleported into the theatre and rendered via 3D holographic displays. In addition, multi-sensory feedback (e.g., haptics, heart rate) from remote users will be acquired, and the possibility of dynamically rendering it to the performers via wearable devices will be investigated.
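As referenced in the XR Blues pilot above, here is a minimal sketch of timestamp-based audio/visual alignment. The function, tolerance and example timestamps are illustrative assumptions, not the project's actual synchronisation mechanism:

```python
from bisect import bisect_left
from typing import List, Tuple


def align(frames_pts: List[int], audio_pts: List[int],
          tolerance_ms: int = 20) -> List[Tuple[int, int]]:
    """Pair each visual frame with the closest audio timestamp.

    Both lists hold presentation timestamps in milliseconds and must be
    sorted. Pairs further apart than `tolerance_ms` are dropped, which
    is one simple policy for lip-sync-style alignment.
    """
    pairs = []
    for pts in frames_pts:
        i = bisect_left(audio_pts, pts)
        candidates = audio_pts[max(0, i - 1):i + 1]
        if not candidates:
            continue
        best = min(candidates, key=lambda a: abs(a - pts))
        if abs(best - pts) <= tolerance_ms:
            pairs.append((pts, best))
    return pairs


# 30 fps visual frames against ~21 ms audio chunks:
video = [0, 33, 66, 100]
audio = [0, 21, 42, 64, 85, 106]
print(align(video, audio))  # [(0, 0), (33, 42), (66, 64), (100, 106)]
```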

Estimated impact:

The project foresees the following wider impacts:

  • Scientific impact: Consolidation, integration and dissemination of technological solutions, methods, standards and paradigms for the creation of hybrid Social XR experiences.
  • Economic impact: Opening a new market for services delivering Social XR experiences to remote audiences.
  • Societal impact: Adoption of new technologies in the field of XR media delivery, allowing audiences to realistically experience important cultural events and enabling realistic social interaction among remote users for leisure and education purposes.

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union. Neither the European Union nor the granting authority can be held responsible for them.

The HEAT project has received funding from the European Commission's Horizon Europe programme under grant agreement number 101135637.