HoloMIT

A photorealistic holoportation solution

Technical challenge

Holoportation refers to the representation of real users and objects as volumetric video in VR/AR applications, such as those used in the Metaverse.

A complete holoportation solution must cover the capture, transmission and representation of the subject in the VR/AR environment. Our technology is a complete holoportation solution based on the point cloud data format, prioritizing natural content over mesh-based representations, and it successfully addresses the main technical issues of this paradigm: data handling, scalability and the limited resources available at the client side.


Technical solution

This solution has been designed to enable a first generation of holoportation systems, paying special attention to a low-cost capture process and to compatibility with different means of interaction (HMDs, glasses, or web applications).

The capture process combines one or more volumetric cameras that provide RGB+Depth streams. After a calibration procedure, the system maps the different images into world coordinates, fusing them into a single point cloud object ready for holoportation. As volumetric cameras improve and their cost decreases, our system is able to take advantage of them and improve the quality.
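As a rough illustration of this fusion step, the sketch below back-projects each depth map through the camera intrinsics and moves the resulting points into world coordinates with the calibrated extrinsics. It is a minimal Python/NumPy sketch under assumed names and array shapes, not the actual HoloMIT capture API.

    # Illustrative sketch only: fuse RGB+Depth frames from calibrated cameras
    # into a single colored point cloud. Names and shapes are hypothetical.
    import numpy as np

    def depth_to_world_points(depth, rgb, K, T_world_cam):
        """Back-project one RGB+Depth frame into world-space colored points.

        depth       : (H, W) depth in meters (0 = no measurement)
        rgb         : (H, W, 3) colors
        K           : (3, 3) camera intrinsics from calibration
        T_world_cam : (4, 4) camera-to-world extrinsics from calibration
        """
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        valid = depth > 0

        z = depth[valid]
        x = (u[valid] - K[0, 2]) * z / K[0, 0]   # (u - cx) * z / fx
        y = (v[valid] - K[1, 2]) * z / K[1, 1]   # (v - cy) * z / fy

        pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)   # (N, 4)
        pts_world = (T_world_cam @ pts_cam.T).T[:, :3]           # (N, 3)
        return pts_world, rgb[valid]

    def fuse_cameras(frames):
        """Concatenate the per-camera clouds into one holoportation-ready cloud."""
        points, colors = zip(*(depth_to_world_points(*f) for f in frames))
        return np.vstack(points), np.vstack(colors)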

For transmission, the point cloud streams are compressed to save bandwidth. We have improved this part further so that viewers save additional bandwidth by downloading only the relevant parts of a point cloud and skipping invisible tiles (point-of-view awareness). Each tile is also downsampled into different Levels of Detail, letting the end user make the most of the available bandwidth (i.e. more detail for closer objects and less for the farthest ones).
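The sketch below shows one way such per-tile selection could work: tiles behind the viewer are skipped and the remaining ones get a Level of Detail based on their distance. It is a simplified sketch; the tile layout, distance thresholds and LOD indices are assumptions for illustration, and a production system would also apply frustum and occlusion tests.

    # Illustrative sketch only: per-tile view culling and LOD selection.
    # Tile layout, thresholds and LOD indices are hypothetical.
    import numpy as np

    def select_tiles(tile_centers, viewer_pos, view_dir,
                     lod_distances=(1.0, 3.0, 6.0)):
        """Return (tile_index, lod) pairs for the tiles worth downloading.

        tile_centers : (T, 3) world-space centers of the point cloud tiles
        viewer_pos   : (3,) viewer position
        view_dir     : (3,) normalized viewing direction
        lod_distances: distance cut-offs; closer tiles get denser LODs
        """
        to_tile = tile_centers - viewer_pos
        dist = np.linalg.norm(to_tile, axis=1)

        # Point-of-view awareness: skip tiles behind the viewer (invisible tiles).
        visible = (to_tile @ view_dir) > 0.0

        selection = []
        for idx in np.flatnonzero(visible):
            # LOD 0 = full detail; a larger index means a sparser version of the tile.
            lod = int(np.searchsorted(lod_distances, dist[idx]))
            selection.append((idx, lod))
        return selection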

Also, for multi-user holoportation, a multiconference unit concept has been developed as a cloud-based component that reduces the number of streams in the system from N×(N-1) to 2×N. Combining this feature with Level of Detail and Field of View awareness reduces the overall computational cost, the required bandwidth and the end-to-end latency.
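To make the saving concrete: in a full-mesh session every participant sends its stream to every other participant, whereas with the cloud component each participant uploads one stream and downloads one combined stream. The snippet below simply evaluates the two formulas above; with 10 users, for instance, that is 90 streams versus 20.

    # Illustrative arithmetic only: stream counts with and without the cloud unit.
    def full_mesh_streams(n_users: int) -> int:
        # Every user sends its capture directly to every other user: N x (N - 1).
        return n_users * (n_users - 1)

    def cloud_unit_streams(n_users: int) -> int:
        # Every user sends one stream up and receives one combined stream down: 2 x N.
        return 2 * n_users

    for n in (2, 5, 10, 20):
        print(n, full_mesh_streams(n), cloud_unit_streams(n))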

To display the holoported subject, the stream is decompressed and synchronized, achieving a consistent representation in the VR/AR application in terms of 3D model, size, position, orientation and timing.
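As a minimal sketch of this synchronization and placement step, the class below buffers decompressed frames by capture timestamp, releases the frame matching the renderer clock, and applies an assumed scale and position so the subject appears consistently in the scene (orientation handling is omitted for brevity). The class and parameter names are hypothetical, not the HoloMIT playback API.

    # Illustrative sketch only: timestamp-based playout of decompressed point
    # cloud frames, with a simple placement transform. Names are hypothetical.
    import bisect
    import numpy as np

    class PointCloudPlayout:
        def __init__(self, scale=1.0, position=(0.0, 0.0, 0.0)):
            self.timestamps = []   # sorted capture timestamps (seconds)
            self.frames = {}       # timestamp -> (points, colors)
            self.scale = scale
            self.position = np.asarray(position)

        def push(self, timestamp, points, colors):
            """Store a decompressed frame, keeping timestamps ordered."""
            bisect.insort(self.timestamps, timestamp)
            self.frames[timestamp] = (points, colors)

        def frame_for(self, render_time):
            """Return the newest frame not later than the renderer clock,
            already scaled and positioned for the VR/AR scene."""
            i = bisect.bisect_right(self.timestamps, render_time) - 1
            if i < 0:
                return None   # nothing to show yet
            points, colors = self.frames[self.timestamps[i]]
            return points * self.scale + self.position, colors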


Applications

The main use cases are VR/AR applications where holoporting a real person can make the difference. We have raised interest in corporate events, training, and the showcasing of products that are difficult to present physically and in real time.

i2CAT has successfully demonstrated the technology in several proof-of-concept demonstrations at Mobile World Congress 2021, Smart City Expo World Congress 2021 and Interihotel 2021.


Tech transfer opportunities

We are looking for licensees in the field of VR/AR:

  • metaverse owners considering a second generation of interaction, moving from synthetic avatars to realistic avatars
  • metaverse owners that want to be prepared for interaction among a high number of users and are willing to save computational resources and bandwidth
  • developers of applications that want to add interactivity (holoportation) to their scenarios

For more information, please, contact [email protected].


Media