NDI has kickstarted its plan to enhance NDI metadata capabilities, unlocking more control in live production and broadcast workflows.
Newly updated documentation details how to use bidirectional XML metadata exchange between NDI senders and receivers to enable new workflows, remote control, and smarter automation for live events and broadcast environments.
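As a rough illustration of the kind of XML payload such an exchange might carry, a sender can build its metadata with any standard XML library and a receiver can parse it back. This is a minimal sketch only: the `tally` element and its attributes are hypothetical examples, not names from the NDI metadata documentation.

```python
# Illustrative sketch of an XML metadata payload for a bidirectional
# sender/receiver exchange. The element and attribute names below
# ("tally", "on_program", "on_preview") are hypothetical, not part of
# any official NDI metadata standard.
import xml.etree.ElementTree as ET

def build_metadata(on_program: bool, on_preview: bool) -> str:
    """Serialize a hypothetical tally-state message as an XML string."""
    root = ET.Element("tally", {
        "on_program": str(on_program).lower(),
        "on_preview": str(on_preview).lower(),
    })
    return ET.tostring(root, encoding="unicode")

def parse_metadata(xml_text: str) -> dict:
    """Parse the XML back into a simple dict on the receiving side."""
    root = ET.fromstring(xml_text)
    return {"tag": root.tag, **root.attrib}

payload = build_metadata(on_program=True, on_preview=False)
print(payload)                  # <tally on_program="true" on_preview="false" />
print(parse_metadata(payload))  # {'tag': 'tally', 'on_program': 'true', 'on_preview': 'false'}
```

In a real workflow the serialized string would be handed to the NDI SDK's metadata-send call rather than printed; the XML round-trip itself is the part shown here.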
The new release also introduces four new official standard metadata elements, developed in collaboration with brands in the NDI Ecosystem: support for the broadcast standards CEA-708 and SCTE-104 (with ToolsOnAir), the audio standard MIDI (with LAMA), and the lighting control standard DMX (with SalrayWorks).
Traditionally, live events require separate teams to manage video, lighting, and audio, which can be complex and costly. This is particularly true for smaller organisations such as churches or schools running sporting events. Using NDI-based setups to transport metadata enables seamless remote management of each component within a single, unified workflow. For instance, integrating NDI with lighting systems allows real-time, remote adjustments directly from video preview screens, eliminating the need for a dedicated specialist. This transforms live production environments, empowering broadcasters to handle complex operations with professional-grade automation.
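To make the lighting example concrete, the sketch below packs a handful of DMX channel levels into an XML string that could travel as stream metadata. The `dmx` element, its attributes, and the channel encoding are illustrative assumptions, not the official DMX metadata element defined in the NDI documentation.

```python
# Hypothetical sketch: encoding DMX channel levels as an XML metadata
# string that could be carried alongside video in a stream.
# The <dmx>/<ch> element names and attributes are illustrative only;
# the official DMX metadata standard lives in the NDI documentation.
import xml.etree.ElementTree as ET

def dmx_to_xml(universe: int, levels: dict[int, int]) -> str:
    """Encode a sparse set of DMX channel levels (channels 1-512,
    values 0-255) as an XML fragment."""
    root = ET.Element("dmx", {"universe": str(universe)})
    for channel, level in sorted(levels.items()):
        if not (1 <= channel <= 512 and 0 <= level <= 255):
            raise ValueError(f"bad DMX value: channel {channel}={level}")
        ET.SubElement(root, "ch", {"n": str(channel), "v": str(level)})
    return ET.tostring(root, encoding="unicode")

# A lighting cue: dim channel 1 to half intensity, channel 2 to full.
print(dmx_to_xml(universe=0, levels={1: 128, 2: 255}))
```

An operator-facing tool would generate such a cue from the video preview screen and let the receiving end drive the fixtures, which is the single-workflow control the passage above describes.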
These partnerships inspired the creation of a new initiative, the Metadata Lab. Any brands and developers that want to follow in the footsteps of companies like LAMA can now submit their metadata proposal at ndi.video/metadata-lab. The NDI technical team will then review proposals and work with creators to test and refine each submission. When ready, NDI will publish it as an official NDI Metadata standard in the ever-growing documentation. As new metadata integrations expand across the ecosystem, broadcasters and production teams will gain more flexibility and control, unlocking new possibilities for the future of live production.
“NDI metadata enables us to integrate camera tracking data within the same stream where audio and video are transported. This ensures that tracking data remains synchronized with the video frame, regardless of transmission delays,” said Markus Rainer, Software Architect at Vizrt. “At Vizrt, our PTZ cameras (pictured above) already embed tracking data directly, while other solutions utilize the NDI Advanced SDK to embed externally delivered tracking data before transmission. By centralizing metadata documentation, NDI provides a unified resource for managing metadata extensions. This allows us to seamlessly reference existing solutions, reducing redundancy.”