Europe tests a musical metaverse: inside MUSMET’s EIC‑funded push for real‑time, immersive performance
- MUSMET is a four‑year EIC Pathfinder project to prototype real‑time, networked music performance and audience experiences.
- The consortium targets three breakthroughs across socio‑cognitive research, low‑latency tech platforms, and new concert formats.
- Funded with EUR 3 million awarded in 2024, the project runs from 2025 to 2029 under Horizon Europe, coordinated by the University of Trento.
- Live demonstrations have appeared on the Somnium Space Musical Metaverse platform, but large‑scale adoption faces technical, legal and market hurdles.
- Claims of a paradigm shift hinge on solving sub‑20 ms latency, robust synchronization, privacy‑preserving bio‑signal use, and clear business models.
World Creativity and Innovation Day 2026 meets a European musical metaverse experiment
Observed annually on 21 April, World Creativity and Innovation Day is designed to highlight creativity as a driver of cultural vitality, scientific progress and economic growth. In that context, MUSMET, an EIC Pathfinder project, positions itself as a European attempt to rethink how musicians and audiences create, connect and experience performance in shared virtual and mixed reality spaces. The promise is ambitious. The practical path to impact is still to be demonstrated.
What MUSMET is building
MUSMET, short for Musical Metaverse Made in Europe, is coordinated by the University of Trento and brings together an interdisciplinary group of European universities and SMEs. Named partners in public materials include KTH Royal Institute of Technology and Politecnico di Torino, among others. The consortium aims to lay scientific and technological foundations for a new class of musical interfaces and immersive concert formats that go beyond today's tools.
| Item | Detail | Source |
| --- | --- | --- |
| Grant ID | 101184379 | CORDIS |
| Programme | Horizon Europe, EIC Pathfinder Open | CORDIS |
| EU contribution | EUR 3 000 000 | CORDIS |
| EC signature date | 13 December 2024 | CORDIS |
| Project duration | 1 February 2025 to 31 January 2029 | CORDIS |
| Coordinator | University of Trento, Italy | EIC article and CORDIS |
| Consortium size | 15 organisations in 9 countries | MUSMET website |
| Policy trackers | Digital 100%, AI 40%, Climate 0% | CORDIS |
| Keywords | music, metaverse, XR, networking, embedded systems, machine learning | CORDIS |
The three claimed breakthroughs
1. Socio-cognitive strand
MUSMET plans to map the needs and concerns of contemporary musicians and audiences using collaborative design and neuro-physiological measurements. In practice, this often includes methods like eye tracking, galvanic skin response, heart rate variability or EEG to infer engagement, cognitive load or synchrony between performers and audiences. These approaches can expose how immersion, spatial audio or haptic cues influence musical interaction. The upside is better interface design grounded in evidence. The risk is over-interpreting noisy bio-signals or gathering sensitive data without clear, minimal and privacy-preserving protocols.
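To make the bio-signal side concrete, here is a minimal sketch of one widely used time-domain heart rate variability metric, RMSSD, computed from RR intervals (the gaps between heartbeats). The sensor values are hypothetical, and this is an illustration of the general class of measurement, not MUSMET's actual pipeline.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals,
    a common time-domain heart rate variability (HRV) metric."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms) from a wearable sensor
rr = [812, 790, 805, 828, 840, 801]
print(round(rmssd(rr), 1))  # 24.1
```

Even a metric this simple illustrates the privacy point: raw RR intervals are identifiable physiological data, whereas an aggregate like RMSSD computed on-device discloses far less.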
2. Technological strand
The consortium is developing concert platforms and devices that exchange information over ultra-reliable, low-latency wireless networks with strong privacy and security. Achieving this typically involves 5G URLLC profiles or carefully engineered Wi-Fi 6 and 7 links, edge computing to keep processing near users, and precise time synchronization. It also requires jitter control and robust packet loss concealment to avoid artifacts that break musical timing. Security and privacy constraints add design pressure, particularly when bio-sensing is involved or when performances cross jurisdictional boundaries.
| Use case | Latency budget | Notes |
| --- | --- | --- |
| Co-located acoustic ensemble | Near-zero network latency | Physical acoustics dominate timing |
| Remote duo or small ensemble | < 20–30 ms round-trip audio | Higher latency requires adaptation or tolerance of tempo drift |
| XR motion-to-photon | < 10–20 ms | To reduce motion sickness and preserve immersion |
| Typical home internet RTT | 40–80 ms or higher | Varies with ISP, routing and congestion |
| Pro remote rehearsal with optimized links | 20–30 ms achievable | Requires edge servers and tuned routes |
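The budgets above can be sanity-checked with simple arithmetic: each leg of a networked duo pays for network transit plus codec, jitter buffer and device I/O delays. The sketch below uses illustrative component figures, not MUSMET measurements, to show how quickly a 20–30 ms round-trip budget is consumed.

```python
def round_trip_audio_latency_ms(one_way_network_ms, codec_ms,
                                jitter_buffer_ms, device_io_ms):
    """Estimate total round-trip audio latency for a remote duo.
    Each direction pays codec + jitter buffer + device I/O on top of
    network transit; all figures here are illustrative assumptions."""
    one_way = one_way_network_ms + codec_ms + jitter_buffer_ms + device_io_ms
    return 2 * one_way

# Illustrative numbers: tuned edge route, low-latency codec, small buffers
total = round_trip_audio_latency_ms(one_way_network_ms=5, codec_ms=2.5,
                                    jitter_buffer_ms=3, device_io_ms=2)
print(total)  # 25.0 -> near the upper end of the 20-30 ms ensemble budget
```

Note how little headroom remains: with only 5 ms of one-way network transit, processing overheads already push the total to 25 ms, which is why commodity home links with 40–80 ms RTT cannot meet the ensemble budget regardless of software tuning.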
3. Musical formats strand
Insights from the socio-cognitive and technical work inform new concert formats designed for immersive, distributed environments. That includes audience co-creation elements, spatialized sound stages and performer presence across multiple locations. Success depends not just on engineering but on convincing artists and venues that these formats add artistic value rather than complexity.
Demonstrations so far
Public demonstrations have included live performances on the Somnium Space Musical Metaverse platform, with recordings available on the project’s channel. These demos show early-stage feasibility and offer a taste of interactive, spatialized performance. They also highlight a dependency on third-party platforms. MUSMET’s stated aim is to build scientific and technological building blocks, not to ship a consumer product. That distinction matters when judging near-term impact on the European creative sector.
Why this matters beyond music
If MUSMET can demonstrate dependable low-latency, privacy-aware collaboration at scale, the same stack could support other time-sensitive co-creation tasks such as distributed design sprints, remote rehearsals in theatre and dance, live sports analysis overlays or collaborative prototyping in engineering and education. That aligns with the Commission’s broader interest in Web 4.0 and virtual worlds. The caveat is that Europe’s XR and metaverse market remains fragmented, with inconsistent device support, uneven broadband quality and a lack of dominant European platforms. Translating lab breakthroughs into resilient, interoperable products will require standards work and industry uptake well beyond a single project.
Risks, open questions and how to judge progress
Technical bottlenecks
Achieving sub-20 ms round-trip audio along with sub-20 ms motion-to-photon visuals for multiple distributed participants is difficult over commodity networks. Even with 5G or tuned Wi-Fi, real-world jitter, last-mile variability and cross-border routing can break performance. Time synchronization across devices, consistent 3D audio rendering, and device heterogeneity add further complexity.
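The time-synchronization problem mentioned above is classically attacked with a four-timestamp exchange, as in NTP or PTP. The following sketch shows the standard offset and delay estimate under the usual symmetric-path assumption; path asymmetry, which is common on cross-border routes, directly biases the offset.

```python
def ntp_offset_and_delay(t1, t2, t3, t4):
    """Classic NTP-style clock estimate from four timestamps (ms):
    t1 client send, t2 server receive, t3 server send, t4 client receive.
    Assumes a roughly symmetric path; asymmetry biases the offset."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # server clock minus client clock
    delay = (t4 - t1) - (t3 - t2)          # round-trip network delay
    return offset, delay

# Illustrative exchange: server clock 10 ms ahead, 8 ms one-way each direction
offset, delay = ntp_offset_and_delay(t1=100, t2=118, t3=119, t4=117)
print(offset, delay)  # 10.0 16
```

A 16 ms round-trip delay with an unknown asymmetry of even a few milliseconds translates into a few milliseconds of clock error, which is audible as timing slop in a distributed ensemble and is one reason hardware-assisted timing (PTP/gPTP) matters for this class of application.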
Human and cultural factors
The value proposition for musicians and audiences is not guaranteed. Artists already navigate complex digital ecosystems with uneven monetization. New formats must respect rehearsal workflows, offer robust technical support and ensure accessibility for audiences with varying bandwidth, devices and abilities.
Privacy and data protection
The project references strong privacy and security. That is essential if neuro-physiological measurements are collected, since such data can be sensitive or even biometric. Compliance with GDPR, purpose limitation, data minimization and on-device or edge processing will be crucial. Independent ethics oversight and transparent consent models are necessary to maintain trust.
Business model and sustainability
Metaverse-era concerts need clear revenue sharing among artists, platforms and rights holders. Energy-intensive XR rendering and network use also raise sustainability questions. Notably, the project’s policy trackers on CORDIS list 0% for climate action and biodiversity, which suggests environmental impact is not a core focus. Future scaling should factor in energy efficiency and greener delivery paths.
Standards and interoperability
Lasting impact will depend on contributions to open standards and interoperable formats. Relevant layers range from WebRTC and RTP for media transport to PTP or gPTP for timing, OpenXR for device interfaces, MIDI 2.0 for musical control data, and emerging spatial audio interchange formats. Without open interfaces, the risk is isolated demos that do not translate across platforms or devices.
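As a small illustration of the media-transport layer named above, RTP stamps each packet with a sample count at the codec clock rate, and receivers map that to a local playout instant. The sketch below shows the basic arithmetic for 48 kHz audio; timestamp wrap-around and clock-rate negotiation are ignored, and the buffer figure is an assumption.

```python
RTP_CLOCK_HZ = 48_000  # typical RTP clock rate for wideband audio (e.g. Opus)

def playout_time_ms(rtp_timestamp, first_rtp_timestamp,
                    base_playout_ms, jitter_buffer_ms):
    """Map an RTP media timestamp to a local playout instant (ms).
    RTP timestamps count samples at the codec clock rate; this sketch
    ignores 32-bit wrap-around and negotiated clock rates."""
    elapsed_samples = rtp_timestamp - first_rtp_timestamp
    media_ms = elapsed_samples * 1000 / RTP_CLOCK_HZ
    return base_playout_ms + media_ms + jitter_buffer_ms

# A packet 960 samples (one 20 ms frame at 48 kHz) after the first one
t = playout_time_ms(rtp_timestamp=960, first_rtp_timestamp=0,
                    base_playout_ms=0, jitter_buffer_ms=5)
print(t)  # 25.0
```

The jitter buffer term is the tunable trade-off: enlarging it absorbs network jitter but adds directly to the end-to-end latency budget discussed earlier.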
How MUSMET fits into the EU innovation landscape
Europe has signaled interest in creative and cultural industries as economic drivers while also promoting a human-centric digital transition. MUSMET’s framing aligns with that agenda by combining AI, XR and networking with participatory design. The EIC’s support indicates willingness to explore risky ideas that may inform future EU positions on virtual worlds, digital sovereignty and cultural access. Whether it moves the needle will depend on rigorous open deliverables, documented performance gains and visible adoption by artists and venues beyond the consortium.
Timeline and what to expect next
The grant was signed in December 2024, with activities starting in February 2025 and running through January 2029. According to project channels, MUSMET is engaging the community through technical workshops co-located with IEEE events, public concerts and published videos, and will release public deliverables, software and datasets as work matures. Stakeholders should look for quantifiable latency targets met in the field, demonstrable privacy-by-design approaches to bio-sensing and concrete contributions to standards or open tooling.
| Period | Indicative milestones | What to watch |
| --- | --- | --- |
| 2025 | Consortium ramp-up, initial pilots and measurement protocols | Ethics frameworks, early latency benchmarks, open repos |
| 2026 | Live demos on third-party platforms and lab POCs | Stability under real audience load, device diversity support |
| 2027 | Prototype concert platforms and custom devices | Interoperability, contributions to standards bodies |
| 2028 | Larger trials with distributed performers and audiences | Independent evaluations, accessibility and inclusion metrics |
| 2029 | Final validation and dissemination | Transfer plans, follow-on funding, adoption by external artists |
Project at a glance and where to explore
Backed by €3 million in EIC Pathfinder funding awarded in 2024, MUSMET is a European-led effort to explore the future of digital creative experience. Curious what it looks and sounds like? The project points to live performances on Somnium Space’s Musical Metaverse and to recordings on its channel. For documentation and formal details, the CORDIS project page lists funding, timeline and trackers, while the MUSMET website hosts public deliverables, videos and event announcements.

