Europe tests a musical metaverse: inside MUSMET’s EIC‑funded push for real‑time, immersive performance

Brussels, 21 April 2026
Summary
  • MUSMET is a four‑year EIC Pathfinder project to prototype real‑time, networked music performance and audience experiences.
  • The consortium targets three breakthroughs across socio‑cognitive research, low‑latency tech platforms, and new concert formats.
  • Funded with EUR 3 million awarded in 2024, the project runs from 2025 to 2029 under Horizon Europe, coordinated by the University of Trento.
  • Live demonstrations have appeared on the Somnium Space Musical Metaverse platform, but large‑scale adoption faces technical, legal and market hurdles.
  • Claims of a paradigm shift hinge on solving sub‑20 ms latency, robust synchronization, privacy‑preserving bio‑signal use, and clear business models.

World Creativity and Innovation Day 2026 meets a European musical metaverse experiment

Observed annually on 21 April, World Creativity and Innovation Day is designed to highlight creativity as a driver of cultural vitality, scientific progress and economic growth. In that context, MUSMET, an EIC Pathfinder project, positions itself as a European attempt to rethink how musicians and audiences create, connect and experience performance in shared virtual and mixed reality spaces. The promise is ambitious. The practical path to impact is still to be demonstrated.

What MUSMET is building

MUSMET, short for Musical Metaverse Made in Europe, is coordinated by the University of Trento and brings together an interdisciplinary group of European universities and SMEs. Named partners in public materials include KTH Royal Institute of Technology and Politecnico di Torino among others. The consortium aims to lay scientific and technological foundations for a new class of musical interfaces and immersive concert formats that go beyond today’s tools.

MUSMET in brief: The project seeks to combine human-computer interaction, engineering, cognition and music to enable distributed performance and audience participation in virtual or mixed reality. It emphasizes real-time collaboration over ultra-reliable, low-latency wireless networks with strict privacy and security requirements, and it explores how these capabilities could generalize to other metaverse-style collaborative activities.
| Item | Detail | Source |
| --- | --- | --- |
| Grant ID | 101184379 | CORDIS |
| Programme | Horizon Europe, EIC Pathfinder Open | CORDIS |
| EU contribution | EUR 3 000 000 | CORDIS |
| EC signature date | 13 December 2024 | CORDIS |
| Project duration | 1 February 2025 to 31 January 2029 | CORDIS |
| Coordinator | University of Trento, Italy | EIC article and CORDIS |
| Consortium size | 15 organisations in 9 countries | MUSMET website |
| Policy trackers | Digital 100%, AI 40%, Climate 0% | CORDIS |
| Keywords | music, metaverse, XR, networking, embedded systems, machine learning | CORDIS |

The three claimed breakthroughs

1. Socio-cognitive strand

MUSMET plans to map the needs and concerns of contemporary musicians and audiences using collaborative design and neuro-physiological measurements. In practice, this often includes methods like eye tracking, galvanic skin response, heart rate variability or EEG to infer engagement, cognitive load or synchrony between performers and audiences. These approaches can expose how immersion, spatial audio or haptic cues influence musical interaction. The upside is better interface design grounded in evidence. The risk is over-interpreting noisy bio-signals or gathering sensitive data without clear, minimal and privacy-preserving protocols.
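To make the measurement side concrete, here is a minimal sketch, not MUSMET's actual pipeline, of RMSSD, a standard heart-rate-variability metric that engagement studies often derive from wearable RR-interval data. The interval values below are hypothetical.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals.

    A standard heart-rate-variability metric; higher values are often
    read as greater parasympathetic activity, though inferring engagement
    from noisy wearable data remains contested.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms) from a short recording
print(round(rmssd([800, 810, 790, 805, 795]), 1))  # 14.4
```

Even a clean metric like this says little on its own; the privacy question is whether such derived values can be computed on-device so raw physiological traces never leave the participant.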

2. Technological strand

The consortium is developing concert platforms and devices that exchange information over ultra-reliable, low-latency wireless networks with strong privacy and security. Achieving this typically involves 5G URLLC profiles or carefully engineered Wi-Fi 6 and 7 links, edge computing to keep processing near users, and precise time synchronization. It also requires jitter control and robust packet loss concealment to avoid artifacts that break musical timing. Security and privacy constraints add design pressure, particularly when bio-sensing is involved or when performances cross jurisdictional boundaries.
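One concrete piece of the jitter-control problem is sizing a playout buffer. The sketch below is a generic heuristic, not MUSMET's algorithm: buffer for the mean packet inter-arrival time plus a safety margin of a few standard deviations, trading a little added latency for fewer audible dropouts. The arrival times are illustrative.

```python
import statistics

def jitter_buffer_ms(arrival_deltas_ms, safety_sigmas=2):
    """Size a playout buffer from observed inter-arrival variation.

    Common heuristic: absorb the mean spacing plus a few standard
    deviations of jitter. Bigger buffers mean fewer dropouts but
    more latency, which is the core tension for live music.
    """
    mean = statistics.mean(arrival_deltas_ms)
    sigma = statistics.pstdev(arrival_deltas_ms)
    return mean + safety_sigmas * sigma

# Hypothetical inter-arrival times (ms) for nominal 5 ms audio packets
deltas = [5.0, 5.2, 4.8, 6.5, 5.1, 4.9, 5.3, 7.0]
print(round(jitter_buffer_ms(deltas), 1))  # 7.0
```

Adaptive versions shrink or grow this buffer continuously, which is one reason edge servers help: shorter, steadier paths mean smaller buffers.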

What counts as low latency in music: For synchronous ensemble performance, musicians generally perceive round-trip audio delays above roughly 20 to 30 milliseconds as disruptive, with single-digit millisecond latency considered ideal. XR adds visual motion-to-photon budgets that also need to stay below about 20 milliseconds to prevent discomfort. Hitting both consistently over variable public networks is technically challenging, which is why projects like MUSMET lean on controlled network paths, edge servers and custom codecs.
| Use case | Latency budget | Notes |
| --- | --- | --- |
| Co-located acoustic ensemble | Near-zero network latency | Physical acoustics dominate timing |
| Remote duo or small ensemble | < 20–30 ms round-trip audio | Higher requires adaptation or tempo drift tolerance |
| XR motion-to-photon | < 10–20 ms | To reduce motion sickness and preserve immersion |
| Typical home internet RTT | 40–80 ms or higher | Varies with ISP, routing, congestion |
| Pro remote rehearsal with optimized links | 20–30 ms achievable | Requires edge servers and tuned routes |
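Geography alone consumes much of these budgets before any processing happens. A back-of-the-envelope sketch, assuming light in optical fibre covers roughly 200 km per millisecond (about two thirds of c); the city pair and distance are illustrative, and real fibre routes are longer than great-circle paths.

```python
FIBRE_KM_PER_MS = 200  # approximate propagation speed of light in fibre

def min_rtt_ms(route_km):
    """Lower bound on round-trip time from propagation delay alone."""
    return 2 * route_km / FIBRE_KM_PER_MS

# Trento to Stockholm is roughly 1900 km great-circle (illustrative figure)
print(min_rtt_ms(1900))  # 19.0 ms before any switching, codec or buffer delay
```

This is why pan-European ensembles sit at the edge of the 20–30 ms budget even on ideal links, and why MUSMET-style systems lean on edge servers to shorten paths.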

3. Musical formats strand

Insights from the socio-cognitive and technical work inform new concert formats designed for immersive, distributed environments. That includes audience co-creation elements, spatialized sound stages and performer presence across multiple locations. Success depends not just on engineering but on convincing artists and venues that these formats add artistic value rather than complexity.

Demonstrations so far

Public demonstrations have included live performances on the Somnium Space Musical Metaverse platform, with recordings available on the project’s channel. These demos show early-stage feasibility and offer a taste of interactive, spatialized performance. They also highlight a dependency on third-party platforms. MUSMET’s stated aim is to build scientific and technological building blocks, not to ship a consumer product. That distinction matters when judging near-term impact on the European creative sector.

Somnium Space in context: Somnium Space is a commercial social VR environment used for events and performances. It offers accessible tooling but is not a public European research infrastructure. Building demonstrations there can speed iteration and audience testing, but longer-term objectives in Europe include interoperable, standards-based environments to avoid lock-in.

Why this matters beyond music

If MUSMET can demonstrate dependable low-latency, privacy-aware collaboration at scale, the same stack could support other time-sensitive co-creation tasks such as distributed design sprints, remote rehearsals in theatre and dance, live sports analysis overlays or collaborative prototyping in engineering and education. That aligns with the Commission’s broader interest in Web 4.0 and virtual worlds. The caveat is that Europe’s XR and metaverse market remains fragmented, with inconsistent device support, uneven broadband quality and a lack of dominant European platforms. Translating lab breakthroughs into resilient, interoperable products will require standards work and industry uptake well beyond a single project.

EIC Pathfinder explained: Pathfinder funds early-stage, high-risk research with breakthrough potential. Grants often support interdisciplinary teams through exploratory studies and first proof-of-concept prototypes. Many projects do not become products directly. Instead, they aim to de-risk ideas enough to attract follow-on funding, sometimes through the EIC Transition scheme or private investment.

Risks, open questions and how to judge progress

Technical bottlenecks

Achieving sub-20 ms round-trip audio along with sub-20 ms motion-to-photon visuals for multiple distributed participants is difficult over commodity networks. Even with 5G or tuned Wi-Fi, real-world jitter, last-mile variability and cross-border routing can break performance. Time synchronization across devices, consistent 3D audio rendering, and device heterogeneity add further complexity.
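Clock synchronization itself is tractable in principle; the hard part is that real paths are asymmetric and jittery. The classic NTP-style offset estimate (RFC 5905) illustrates the mechanism with four timestamps; the values below are hypothetical.

```python
def ntp_offset_ms(t1, t2, t3, t4):
    """Classic NTP clock-offset estimate.

    t1: client send, t2: server receive, t3: server send,
    t4: client receive, each in ms on its own local clock.
    Assumes symmetric network delay, which real cross-border
    routes often violate -- the main source of residual error.
    """
    return ((t2 - t1) + (t3 - t4)) / 2

# Hypothetical exchange: server clock runs 12 ms ahead of the client,
# with 5 ms one-way delay and 1 ms server processing time
print(ntp_offset_ms(t1=1000.0, t2=1017.0, t3=1018.0, t4=1011.0))  # 12.0
```

PTP/gPTP tighten this with hardware timestamping on controlled networks, but distributed audiences on home connections rarely have that luxury.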

Human and cultural factors

The value proposition for musicians and audiences is not guaranteed. Artists already navigate complex digital ecosystems with uneven monetization. New formats must respect rehearsal workflows, offer robust technical support and ensure accessibility for audiences with varying bandwidth, devices and abilities.

Privacy and data protection

The project references strong privacy and security. That is essential if neuro-physiological measurements are collected, since such data can be sensitive or even biometric. Compliance with GDPR, purpose limitation, data minimization and on-device or edge processing will be crucial. Independent ethics oversight and transparent consent models are necessary to maintain trust.

Business model and sustainability

Metaverse-era concerts need clear revenue sharing among artists, platforms and rights holders. Energy-intensive XR rendering and network use also raise sustainability questions. Notably, the project’s policy trackers on CORDIS list 0% for climate action and biodiversity, which suggests environmental impact is not a core focus. Future scaling should factor in energy efficiency and greener delivery paths.

Standards and interoperability

Lasting impact will depend on contributions to open standards and interoperable formats. Relevant layers range from WebRTC and RTP for media transport to PTP or gPTP for timing, OpenXR for device interfaces, MIDI 2.0 for musical control data, and emerging spatial audio interchange formats. Without open interfaces, the risk is isolated demos that do not translate across platforms or devices.
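As a small illustration of the RTP layer in that stack: RTP timestamps count media-clock ticks rather than wall time, so receivers convert tick deltas to milliseconds to schedule playout and detect gaps. A sketch assuming the common 48 kHz audio clock and 10 ms packets:

```python
CLOCK_RATE_HZ = 48_000  # a typical RTP media clock rate for wideband audio

def ticks_to_ms(tick_delta):
    """Convert an RTP timestamp delta to milliseconds of media time."""
    return 1000 * tick_delta / CLOCK_RATE_HZ

# Consecutive 10 ms packets at 48 kHz differ by 480 ticks
print(ticks_to_ms(480))   # 10.0 -> expected spacing, nothing missing
print(ticks_to_ms(1920))  # 40.0 -> a 30 ms hole to conceal or wait out
```

Agreeing on details like clock rates and timestamp semantics across vendors is precisely the kind of interoperability work that determines whether demos translate between platforms.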

How MUSMET fits into the EU innovation landscape

Europe has signaled interest in creative and cultural industries as economic drivers while also promoting a human-centric digital transition. MUSMET’s framing aligns with that agenda by combining AI, XR and networking with participatory design. The EIC’s support indicates willingness to explore risky ideas that may inform future EU positions on virtual worlds, digital sovereignty and cultural access. Whether it moves the needle will depend on rigorous open deliverables, documented performance gains and visible adoption by artists and venues beyond the consortium.

Timeline and what to expect next

The grant was signed in December 2024, with activities starting in February 2025 and running through January 2029. According to project channels, MUSMET is engaging the community through technical workshops co-located with IEEE events, public concerts and recorded video sessions, and will publish public deliverables, software and datasets as work matures. Stakeholders should look for quantifiable latency targets met in the field, demonstrable privacy-by-design approaches to bio-sensing and concrete contributions to standards or open tooling.

| Period | Indicative milestones | What to watch |
| --- | --- | --- |
| 2025 | Consortium ramp-up, initial pilots and measurement protocols | Ethics frameworks, early latency benchmarks, open repos |
| 2026 | Live demos on third-party platforms and lab POCs | Stability under real audience load, device diversity support |
| 2027 | Prototype concert platforms and custom devices | Interoperability, contributions to standards bodies |
| 2028 | Larger trials with distributed performers and audiences | Independent evaluations, accessibility and inclusion metrics |
| 2029 | Final validation and dissemination | Transfer plans, follow-on funding, adoption by external artists |

Project at a glance and where to explore

Backed by €3 million in EIC Pathfinder funding awarded in 2024, MUSMET is a European-led effort to explore the future of digital creative experience. Curious what it looks and sounds like? The project points to live performances on Somnium Space’s Musical Metaverse and to recordings on its channel. For documentation and formal details, the CORDIS project page lists funding, timeline and trackers, while the MUSMET website hosts public deliverables, videos and event announcements.