Multi-view
Multi-view lets you ingest and render multiple Dolby.io real-time video and audio streams simultaneously inside a browser or native mobile application. Once rendered, you can switch seamlessly between streams, controlling how you view the content. By giving viewers control over the content, broadcasters can enable real-time experiences and engagement that leave viewers wanting more.
To create a multi-view experience, you must capture multiple video or audio feeds and broadcast them as a multisource stream. Once you are broadcasting a multisource stream, you can view it using the Dolby.io Millicast viewer app or by building your own multi-view application.
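For the publishing side, here is a minimal sketch, assuming the Millicast JavaScript SDK (imported here as `millicast.esm.js`, as in the viewer code later in this guide) and using placeholder values for the publish token, stream name, and source id. Each feed is published under the same stream name with a distinct `sourceId`:

import { Director, Publish } from './millicast.esm.js'

// Placeholder credentials: use your own publish token and stream name
const tokenGenerator = () => Director.getPublisher('PUBLISH_TOKEN', 'STREAM_NAME')
const publisher = new Publish('STREAM_NAME', tokenGenerator)

// Capture a feed and publish it as one source of the multisource stream;
// 'camera2' is a hypothetical source id
const mediaStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true })
await publisher.connect({ mediaStream, sourceId: 'camera2' })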
Multi-view with the Dolby.io Viewer
Once you have created a multisource stream, you can open the stream viewer from the Dolby.io dashboard or by navigating to:
https://viewer.millicast.com?streamId=[YOUR_ACCOUNT_ID]/[YOUR_STREAM_NAME]
Once you join, the bottom right gear icon flashes a notification prompting you to enable multi-view. Enable it to begin viewing the streams.
Alternatively, you can force the viewer to open in multi-view by including the `&multisource=true` flag in the URL:
https://viewer.millicast.com?streamId=[YOUR_ACCOUNT_ID]/[YOUR_STREAM_NAME]&multisource=true
Creating a Multi-view Web Application
Project setup
For this project, you need only three files:
- This initial `index.html` file:
<html>
  <head>
    <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700" type="text/css">
    <script src="multi.js" type="module"></script>
    <style>
      body {
        background: #e2e1e0;
        text-align: center;
        margin: 0px;
        padding: 0px;
        color: #555;
        font-family: 'Roboto';
      }
      video {
        width: 100%;
        height: 100%;
      }
      #remoteVideos {
        display: grid;
        gap: 1rem;
        grid-template-columns: repeat(3, 1fr);
        margin: 1rem;
      }
    </style>
  </head>
  <body>
    <h1>Multiview Example</h1>
    <div id="remoteVideos"></div>
  </body>
</html>
- An empty JavaScript file called `multi.js`
- The Millicast SDK JavaScript file; for simplicity, no JavaScript framework is used. This file can be downloaded here.
Your project structure should look like this:
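Assuming the SDK file is saved as `millicast.esm.js` (the name used when importing it later), the structure is:

.
├── index.html
├── millicast.esm.js
└── multi.js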
Getting Started
Understanding the initial code
The code provided in the `index.html` file contains a basic configuration:
- It contains some basic CSS that displays the videos in a grid.
- It imports the `multi.js` file, which is built up in the following sections.
- It has a basic body with a `<div id="remoteVideos"></div>`, where the multiple videos will be shown.
The actual logic of the app goes in the empty `multi.js` file.
How Multisource works
Using the streaming service, a publisher can publish multiple sources under the same token and stream name (learn more here), and a viewer app can subscribe to and handle these multiple sources.
Whenever the publisher publishes a new source or stops a source, the signaling server emits an event through a WebSocket. The Millicast SDK handles this connection and re-emits the event as a `broadcastEvent`.
In order to watch multiple sources using the same Viewer instance, use the `addRemoteTrack` method provided by the SDK. This method creates a new transceiver with an associated `MediaStream`; calling the `project` method with the newly created transceiver is what routes a source's media into that `MediaStream`.
This app creates a View instance, connects to the stream, and listens for the `broadcastEvent` event.
Whenever an `active` broadcast event is received, a new transceiver is created using `addRemoteTrack`, the source is projected into the `MediaStream`, and a video tag is created with the `MediaStream` as its `srcObject`.
Whenever an `inactive` broadcast event is received, the source is unprojected and the video tag is removed.
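For reference, here is a sketch of the event object the handler receives. Only the fields used in this tutorial are shown; the real payload may carry additional data such as track metadata:

// Sketch of an event passed to the "broadcastEvent" handler
const exampleEvent = {
  name: "active", // or "inactive"
  data: {
    sourceId: "camera2" // undefined for the main (default) source
  }
}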
Creating a View instance and connecting to the stream
Inside `multi.js`:
- Import the SDK and create the `View` instance that will subscribe to the stream, using the `accountId` and `streamName`:
import { Director, View } from './millicast.esm.js';
// Config data
const accountId = "ACCOUNT_ID"
const streamName = "STREAM_NAME"
// Create a new viewer instance
const tokenGenerator = () => Director.getSubscriber(streamName, accountId)
const viewer = new View(streamName, tokenGenerator)
- Create two data structures: a `Set` for storing the sources received, and a `Map` from a source id to its transceiver media ids. These data structures will be used later:
const sources = new Set()
// This will store a mapping: sourceId => transceiver media ids
const sourceIdTransceiversMap = new Map()
- Connect to the stream when the page is loaded, subscribing to the `active` and `inactive` events:
document.addEventListener("DOMContentLoaded", async () => {
  try {
    await viewer.connect({
      events: ['active', 'inactive']
    });
  } catch (e) {
    console.log('Connection failed, handle error', e)
    viewer.reconnect()
  }
})
At this point, if you run the app using `npx serve`, you will not see anything different yet. The viewer connects to the stream, but nothing is rendered.
Listen for the active broadcast event
- Before the `document.addEventListener...` call, listen for the `broadcastEvent`:
// Listen for broadcast events
viewer.on("broadcastEvent", (event) => {
  // Get the event name and data
  const { name, data } = event
  switch (name) {
    case "active": {
      // If the sourceId is undefined, it is the main source
      const sourceId = data.sourceId || "main";
      // Store the source id in our sources Set
      sources.add(sourceId)
      // Defined in the next step: creates a new transceiver and projects the new source
      addRemoteTrackAndProject(data.sourceId)
      break;
    }
  }
})
- Define the `addRemoteTrackAndProject` function:
const addRemoteTrackAndProject = async (sourceId) => {
  // Create the MediaStream and the transceivers
  const mediaStream = new MediaStream()
  const videoTransceiver = await viewer.addRemoteTrack("video", [mediaStream])
  // Optionally, we can also add audio
  const audioTransceiver = await viewer.addRemoteTrack("audio", [mediaStream])
  // Add the sourceId -> transceiver media ids pair to the Map
  sourceIdTransceiversMap.set(sourceId || "main", { videoMediaId: videoTransceiver.mid, audioMediaId: audioTransceiver.mid })
  // Defined in the next step: renders a new video tag into the HTML using the mediaStream as a srcObject
  createVideoElement(mediaStream, sourceId)
  // Finally, project the new source into the transceivers
  await viewer.project(sourceId, [{
    trackId: "video",
    mediaId: videoTransceiver.mid,
    media: "video"
  },
  // Optionally, we can also project audio
  {
    trackId: "audio",
    mediaId: audioTransceiver.mid,
    media: "audio"
  }])
}
- Implement the `createVideoElement` function:
const createVideoElement = (mediaStream, sourceId) => {
  const video = document.createElement("video")
  // remoteVideos is already created in the HTML
  const remoteVideos = document.getElementById('remoteVideos')
  video.id = sourceId || "main"
  video.srcObject = mediaStream
  video.autoplay = true
  // Mute the video so autoplay always works; this can be removed (https://developer.chrome.com/blog/autoplay/#new-behaviors)
  video.muted = true
  remoteVideos.appendChild(video)
}
When the app is running, the active sources are displayed. When a new source is published, a new video appears on the screen.
However, when a source is stopped, the video does not disappear. You must implement the `inactive` case of the `broadcastEvent` listener previously created.
Listen for the inactive broadcast event
You must remove the videos whenever a source stops by implementing the `inactive` case:
- In the `broadcastEvent` listener that was created (`viewer.on("broadcastEvent"...`), handle the `inactive` case:
//...
switch (name) {
  case "active": {
    //...
  }
  case "inactive": {
    const sourceId = data.sourceId || "main"
    // Delete the source id from the sources Set
    sources.delete(sourceId)
    // Defined in the next step: unprojects the source and removes the associated video tag
    unprojectAndRemoveVideo(sourceId)
    break;
  }
}
- Implement the `unprojectAndRemoveVideo` function:
const unprojectAndRemoveVideo = async (sourceId) => {
  // Get the transceivers associated with the source id
  const sourceTransceivers = sourceIdTransceiversMap.get(sourceId)
  // Unproject the source from the transceivers
  await viewer.unproject([sourceTransceivers.videoMediaId, sourceTransceivers.audioMediaId])
  // Delete the video from the DOM
  const video = document.getElementById(sourceId)
  document.getElementById("remoteVideos").removeChild(video)
}
Final Result
Now you have a fully working multi-view app using the Millicast SDK. Since only plain JavaScript was used, you can use this example in any of your favorite JavaScript frameworks.
- The full code can be found here
Limitations of Multi-view
Dolby.io Real-time Streaming does not limit the number of tracks that a viewer can receive; however, it limits the aggregate bitrate of all tracks to 12 Mbps. The pinned source is prioritized and allowed to exceed the 12 Mbps limit, and the other tracks share any remaining available bandwidth. The source with a null `sourceId` is pinned by default. You can change the pinned source by using the `pinnedSourceId` attribute in the `View.connect` command. You should configure the Simulcast/SVC bitrate of each source so that a viewer can receive the desired number of video tracks in the viewer session while remaining under the aggregate bitrate limit.
| Example | Bandwidth Allocation |
| --- | --- |
| A 4 Mbps pinned track and four simulcast tracks | 4 Mbps is allocated to the pinned track, and the other simulcast tracks receive 2 Mbps each. |
| A 4 Mbps pinned track and two 2 Mbps tracks | The overall bitrate is under the 12 Mbps limit. |
| A 12 Mbps pinned track and four simulcast tracks | 12 Mbps is allocated to the pinned track, and the other tracks receive no bandwidth. |
| A 10 Mbps pinned track and two 2 Mbps tracks | 10 Mbps is allocated to the pinned track, and there is only space for one additional track. |
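As an illustration, here is a minimal sketch of pinning a specific source when connecting, where `'camera2'` is a hypothetical source id:

await viewer.connect({
  events: ['active', 'inactive'],
  // Prioritize this source's bandwidth; by default the source with a null sourceId is pinned
  pinnedSourceId: 'camera2'
})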