Media Technologies

The Media Technologies Research Group transforms cutting-edge academic research into interactive, immersive experiences for users across sectors. We research volumetric capture systems that produce lifelike 3D representations of people and environments in real time, develop holographic communication platforms, and integrate extended reality (XR) features that blend physical and digital worlds. Our work also includes streaming high-quality 360° video to enable immersive storytelling and remote participation, and leveraging remote rendering technologies to deliver complex interactive experiences seamlessly through the web. Our multidisciplinary team of researchers, engineers, and designers collaborates with industry and academic partners to ensure innovations are validated through user-centric pilots, scientific publications, and successful technology transfer. By bridging the gap between theoretical advances and market-ready applications, MediaTech drives forward the adoption of next-generation media solutions that enhance collaboration, training, remote presence, and entertainment.

Research lines

  • Media Networks: This research line combines research and software development for synchronized, low-latency media delivery across diverse formats and platforms. With expertise in protocols such as NTP, PTP, WebRTC, and MPEG-DASH, the team enhances network performance and real-time content processing using scalable cloud and edge computing infrastructures. Its innovations support live production and interactive events by enabling adaptive delivery architectures, XR session management, and advanced media synchronization (a minimal synchronization sketch follows this list). Key research areas include 6G Network-as-a-Service (NaaS) APIs, remote and cloud-based rendering, and media distribution technologies tailored for lightweight devices and variable network conditions.
  • Media Processing & Optimization: This research line pioneers advanced techniques for volumetric video compression and distribution, aimed at enabling real-time AR/VR experiences across a wide range of devices. Specializing in GPU-accelerated encoding, the team has developed real-time streaming solutions for dense point clouds and 3D Gaussian Splatting, significantly reducing bandwidth and storage requirements (a simplified point cloud quantization sketch follows this list). These technologies are crucial for high-quality 3D content creation, digital twins, and volumetric streaming. Their work directly impacts sectors such as industry, healthcare, education, and entertainment by making immersive content more efficient and accessible.
  • Media Capture & Reconstruction: This research line develops high-fidelity, multi-camera volumetric capture systems using RGB-D setups for real-time and offline human reconstruction. Its GPU-accelerated solutions enable applications such as holoportation, immersive telepresence, and volumetric recordings (a depth back-projection sketch follows this list). The team explores cutting-edge methods such as Gaussian Splatting to enhance realism and scalability in human and environment reconstruction. These innovations support next-generation applications in immersive storytelling, virtual collaboration, and interactive experiences, advancing the capabilities of volumetric media in real-world scenarios.
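
The sketch below is a minimal, hypothetical illustration of the inter-stream synchronization idea mentioned in the Media Networks line: frames from several streams carry a shared wall-clock capture timestamp (as provided by NTP or PTP) and are released for playout only once every stream has reached a common presentation instant. The Frame structure, stream identifiers, and the fixed playout delay are assumptions made for this example, not the group's actual software.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        stream_id: str
        capture_ts: float  # seconds on a shared wall clock (e.g. NTP/PTP-disciplined)
        payload: bytes

    class SyncBuffer:
        """Buffers frames per stream and releases a synchronized set only when
        every stream has a frame captured at or before the target instant."""

        def __init__(self, playout_delay: float = 0.150):
            self.playout_delay = playout_delay  # safety margin (seconds) for late arrivals
            self.queues = {}  # stream_id -> list of buffered Frames

        def push(self, frame: Frame) -> None:
            self.queues.setdefault(frame.stream_id, []).append(frame)

        def pop_synchronized(self, now: float):
            # Present media a fixed delay behind the wall clock so frames from
            # slower network paths can still arrive before their presentation time.
            target = now - self.playout_delay
            ready = {}
            for sid, queue in self.queues.items():
                due = [f for f in queue if f.capture_ts <= target]
                if not due:
                    return None  # at least one stream is not ready yet
                ready[sid] = max(due, key=lambda f: f.capture_ts)
            # Drop everything at or before the frames about to be presented.
            for sid, frame in ready.items():
                self.queues[sid] = [f for f in self.queues[sid] if f.capture_ts > frame.capture_ts]
            return ready

A real deployment would estimate jitter and adapt the playout delay per stream; the fixed 150 ms margin here is only a placeholder.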
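
As a rough illustration of where bandwidth savings in point cloud compression come from, the following sketch applies uniform quantization onto a voxel grid and drops duplicate points. It is a didactic stand-in, not the group's GPU-accelerated codecs, and the voxel size is an arbitrary assumption.

    import numpy as np

    def quantize_point_cloud(points: np.ndarray, voxel_size: float = 0.01):
        """points: (N, 3) float array in metres. Returns integer voxel indices plus
        the grid origin and voxel size needed to rebuild an approximate cloud."""
        origin = points.min(axis=0)
        indices = np.floor((points - origin) / voxel_size).astype(np.int32)
        # Points falling into the same voxel collapse to one entry; for dense
        # captures this de-duplication provides most of the rate reduction.
        indices = np.unique(indices, axis=0)
        return indices, origin, voxel_size

    def dequantize_point_cloud(indices: np.ndarray, origin: np.ndarray, voxel_size: float) -> np.ndarray:
        # Reconstruct each surviving point at its voxel centre.
        return origin + (indices.astype(np.float32) + 0.5) * voxel_size

Production codecs additionally entropy-code the indices (e.g. via octree traversal) and compress colour and other attributes; the sketch only shows the geometry quantization step.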
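
The core geometric step behind multi-camera RGB-D reconstruction can be sketched as follows: each depth map is back-projected through its pinhole intrinsics and transformed into a shared world frame by its calibrated extrinsic, after which the per-camera clouds are merged. The function names, array shapes, and calibration format are assumptions for this illustration.

    import numpy as np

    def backproject_depth(depth: np.ndarray, fx: float, fy: float,
                          cx: float, cy: float, cam_to_world: np.ndarray) -> np.ndarray:
        """depth: (H, W) array in metres; cam_to_world: 4x4 extrinsic matrix.
        Returns an (M, 3) point cloud in world coordinates (invalid pixels skipped)."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        valid = depth > 0
        z = depth[valid]
        x = (u[valid] - cx) * z / fx
        y = (v[valid] - cy) * z / fy
        pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)  # homogeneous camera coords
        pts_world = (cam_to_world @ pts_cam.T).T[:, :3]
        return pts_world

    def fuse_cameras(frames):
        """frames: iterable of (depth, fx, fy, cx, cy, cam_to_world) tuples."""
        return np.concatenate([backproject_depth(*f) for f in frames], axis=0)

Real pipelines follow this step with filtering, surface fusion (e.g. TSDF integration or Gaussian Splatting optimization) and texturing, typically on the GPU.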

Innovation lines

  • Smart city: Large-scale point cloud and Gaussian Splatting compression for landscape visualization
  • Real-time multi-user holoportation
  • Immersive video
  • Volumetric video
  • Scalable communications
  • Point Cloud Compression
  • Gaussian Splatting Compression
  • Stream Synchronization
  • Multi-camera 3D reconstruction
  • Supersampling and denoising
  • Quality of Experience
  • Adaptive and low-latency streaming

Group leader

Sergi Fernandez

Publications

Energy-aware Joint Orchestration of 5G and Robots: Experimental Testbed and Field Validation

M. Groshev, L. Zanzi, C. Delgado, X. Li, A. d. l. Oliva and X. Costa-Pérez, “Energy-Aware Joint Orchestration of 5G and Robots: Experimental Testbed and Field Validation,” in IEEE Transactions on Network and Service Management, vol. 22, no. 4, pp. 3046-3059, Aug. 2025, doi: 10.1109/TNSM.2025.3555126.

Quantum Computing in the RAN with Qu4Fec: Closing Gaps Towards Quantum-based FEC processors

Nikolaos Apostolakis, Marta Sierra-Obea, Marco Gramaglia, Jose A. Ayala-Romero, Andres Garcia-Saavedra, Marco Fiore, Albert Banchs, and Xavier Costa-Perez. 2025. Quantum Computing in the RAN with Qu4Fec: Closing Gaps Towards Quantum-based FEC processors. Proc. ACM Meas. Anal. Comput. Syst. 9, 2, Article 36 (June 2025), 25 pages. https://doi.org/10.1145/3727128

Curved Apertures for Customized Wave Trajectories: Beyond Flat Aperture Limitations

J. M. Canals, F. Devoti, V. Sciancalepore, M. D. Renzo and X. Costa-Pérez, “Curved Apertures for Customized Wave Trajectories: Beyond Flat Aperture Limitations,” in IEEE Wireless Communications Letters (2025).
