# Multisource Playback
Dolby.io supports ingesting Multisource Streams and rendering multiple audio and video streams for building Multi-view and Audio Multiplexing experiences.
To get started building multi-stream experiences it's important to understand how Dolby.io handles multisource playback. In this guide we'll outline:
- How to manage source selection
- How to project feeds
- How to forward specific media layers
- How to dynamically manage viewer tracks
## Managing Source Selection

### Multisource Broadcasting

To manage multiple sources, you must first have a Multisource Stream broadcasting.
Dolby.io streaming supports scalable WebRTC streaming thanks to a "smart" cascading node system that manages the peer-to-peer connections. These nodes are key to understanding how to manage multiple sources for playback and can be divided into two types:
- Publisher Nodes: These nodes manage the ingest of multiple sources during the broadcast. They can then forward these feeds to the CDN for the Viewer node to manage.
- Viewer Nodes: Viewer nodes are created depending on the quantity and location of viewers, allowing Dolby.io to support large-scale global streams. When rendering streams in your app or platform, you can communicate with the viewer node to negotiate what feeds to project and simulcast layers to receive.
When the Publisher node has a feed ready to be passed to a Viewer node, it triggers a `broadcastEvent`. You can listen for this event by adding an event listener to the `millicast.View` object:
```javascript
const tokenGenerator = () =>
  millicast.Director.getSubscriber({
    streamName: streamName,
    streamAccountId: streamAccountId,
  });

const viewer = new millicast.View(streamName, tokenGenerator);
viewer.on("broadcastEvent", async (event) => {
  console.log("broadcastEvent", event);
});
```
A `broadcastEvent` triggers each time a feed is added to the multisource broadcast, so several broadcast events can fire at the start of a stream or over its course as feeds are added or removed. As outlined in Multisource Streams in the Broadcast guides, each stream must be distinguished by a unique source ID. As each `broadcastEvent` triggers, you can use the feed's source ID to decide which broadcasts to render for the end user.
Here is an example of an `active` `broadcastEvent`; note the `sourceId`:
```json
{
  "name": "active",
  "data": {
    "streamId": "accountId/streamName",
    "sourceId": "uniqueSourceID",
    "tracks": [
      {
        "trackId": "audio0",
        "media": "audio"
      },
      {
        "trackId": "video0",
        "media": "video"
      }
    ]
  }
}
```
An `active` `broadcastEvent` includes the `sourceId` of the stream, as well as the `trackId` and `media` type of each track, all of which are used to establish a connection for playback. These values can be stored by appending them to an array or by adding them to another useful data structure such as a `Map`.
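For instance, the `active` and `inactive` events can be folded into a `Map` keyed by `sourceId`. This is a minimal sketch, assuming the event payload matches the `active` example above; the `updateSources` helper name is illustrative, not part of the SDK:

```javascript
// Track currently-active sources keyed by sourceId.
// Folds "active"/"inactive" broadcastEvents into the map.
function updateSources(sources, event) {
  const { sourceId, tracks } = event.data;
  if (event.name === "active") {
    // Remember each track's trackId/media pair for later projection.
    sources.set(sourceId, tracks);
  } else if (event.name === "inactive") {
    sources.delete(sourceId);
  }
  return sources;
}

const sources = new Map();
updateSources(sources, {
  name: "active",
  data: {
    sourceId: "uniqueSourceID",
    tracks: [
      { trackId: "audio0", media: "audio" },
      { trackId: "video0", media: "video" },
    ],
  },
});
console.log(sources.get("uniqueSourceID").length); // 2
```

In a real viewer you would call a helper like this from inside the `broadcastEvent` listener shown earlier, so the map always reflects the feeds currently available for projection.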
With the feed data stored, we can now learn how to project the feeds so the viewer can see them.
## Project Feeds

Once a feed has been published to the stream, you can project it using the viewer. The `viewer.project` function maps a feed onto a track, signaling to the CDN that you are ready to receive data over a peer connection. Once a feed is mapped to a track, it can be rendered natively or in a browser.
```javascript
viewer.project("uniqueSourceID", [
  {
    trackId: "audio0",
    mediaId: audioTransceiver.mid,
    media: "audio",
  },
  {
    trackId: "video0",
    mediaId: videoTransceiver.mid,
    media: "video",
  },
]);
```
The project function allows you to project only audio, only video, or both, for each of multiple published sources. These sources can be projected as they are published (as each `broadcastEvent` triggers) or all at once after they have all arrived.
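The mapping array passed to `viewer.project` can be built from the tracks announced in a `broadcastEvent` plus the mids of your local transceivers. This is an illustrative sketch; the `buildProjectionMapping` helper and the `mids` lookup (media kind to transceiver mid) are assumptions, not SDK APIs:

```javascript
// Build the array expected by viewer.project() from announced tracks
// and a lookup of media kind -> local transceiver mid.
function buildProjectionMapping(tracks, mids) {
  return tracks.map(({ trackId, media }) => ({
    trackId,
    mediaId: mids[media],
    media,
  }));
}

const mapping = buildProjectionMapping(
  [
    { trackId: "audio0", media: "audio" },
    { trackId: "video0", media: "video" },
  ],
  // transceiver.mid values assigned after SDP negotiation
  { audio: "0", video: "1" }
);
console.log(mapping);
// [ { trackId: "audio0", mediaId: "0", media: "audio" },
//   { trackId: "video0", mediaId: "1", media: "video" } ]

// The result is what you would pass to the real call:
// viewer.project("uniqueSourceID", mapping);
```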
The viewer also supports an `unproject` function, which signals to the CDN that you want to stop receiving media from that source:

```javascript
viewer.unproject([videoTransceiver.mid]);
```
## Media Layer Forwarding

### Simulcast or SVC layer forwarding
By default, the Dolby.io Real-time Streaming server chooses the best Simulcast or SVC layer to forward to the viewer based on the bandwidth estimation it calculates.
In addition to selecting the origin source for the media, you can choose the specific Simulcast and/or SVC layer for each video track delivered by the Dolby.io Real-time Streaming server. You can do so either by specifying the `layer` attribute on the `project` command or by using the `select` command for the main video track:
```javascript
viewer.project("mysource", [
  {
    trackId: "video0",
    mediaId: videoTransceiver.mid,
    layer: {
      encodingId: "L",
      temporalLayerId: 1,
    },
  },
]);
```
The layer information available for each video source is provided periodically by the `layers` event. To switch back to automatic layer selection, send a `project` or `select` command with empty layer information.
### Track limits for viewer

Dolby.io Real-time Streaming does not limit the number of tracks a viewer can receive, but it does cap the total bitrate per viewer at 12 Mbps across all media tracks. Configure the Simulcast/SVC bitrates of all sources carefully in your applications so that viewer sessions can receive the desired number of video tracks.
## Dynamic Viewer Track

The `addRemoteTrack` method in our JavaScript SDK lets you add new tracks on demand on the viewer side. The method performs a local renegotiation and fires the `track` event with the added track and transceiver.
```javascript
// Add a remote track and wait until the SDP offer/answer completes
// and a mid is assigned to the transceiver.
const transceiver = await viewer.addRemoteTrack("video", [new MediaStream()]);
// Get the mid of the newly created remote track.
const mediaId = transceiver.mid;
```
After the renegotiation completes, you can use the new transceiver's `mid` attribute in the projection or layer-selection methods to receive media from any source on the new track.
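Putting the two steps together, a newly announced source can be given its own track on demand and then projected onto it. This is a sketch only: `projectOnNewTrack` is an illustrative helper, and the `MediaStream` class and stub viewer below are stand-ins so the flow can run outside a browser without the millicast SDK:

```javascript
// Stand-in for the browser's MediaStream so this sketch runs outside a browser.
class MediaStream {}

// Negotiate a fresh remote video track, then project a source onto it.
async function projectOnNewTrack(viewer, sourceId, trackId) {
  // Resolves once the SDP offer/answer completes and a mid is assigned.
  const transceiver = await viewer.addRemoteTrack("video", [new MediaStream()]);
  await viewer.project(sourceId, [
    { trackId, mediaId: transceiver.mid, media: "video" },
  ]);
  return transceiver.mid;
}

// Stub recording calls, standing in for the millicast.View instance.
const stubViewer = {
  projected: [],
  async addRemoteTrack(media, streams) {
    return { mid: "2" };
  },
  async project(sourceId, mapping) {
    this.projected.push({ sourceId, mapping });
  },
};

projectOnNewTrack(stubViewer, "uniqueSourceID", "video0").then((mid) => {
  console.log(mid); // "2"
});
```

With a real `millicast.View` instance in place of the stub, the same helper adds a renegotiated track and signals the CDN to start delivering that source's video on it.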