Create a Multi-view Web Application
Overview
This guide will help you create a basic web app that subscribes to a stream and dynamically creates and deletes video tags as sources are published or stopped under that stream.
Note: This guide will show a basic implementation. A more detailed explanation on how multisource streams work can be found here.
This app is created using JavaScript, HTML/CSS, and the Millicast SDK.
Limitations
Dolby.io Real-time Streaming does not limit the number of tracks a viewer can receive, but it does limit the aggregate bitrate of all tracks to 12 Mbps. The pinned source is given priority and is allowed to exceed the 12 Mbps limit on its own; the other tracks share whatever bandwidth remains. The source with a null sourceId is pinned by default. You can change the pinned source by using the pinnedSourceId attribute in the View.connect command. You should configure the Simulcast/SVC bitrate of each source so that a viewer can receive the desired number of video tracks in the viewer session while staying under the aggregate bitrate limit.
| Example | Bandwidth Allocation |
|---|---|
| A 4 Mbps pinned track and four simulcast tracks | 4 Mbps is allocated to the pinned track; the other simulcast tracks receive 2 Mbps each. |
| A 4 Mbps pinned track and two 2 Mbps tracks | The overall bitrate is under the 12 Mbps limit, so every track receives its full bitrate. |
| A 12 Mbps pinned track and four simulcast tracks | 12 Mbps is allocated to the pinned track; the other tracks receive no bandwidth. |
| A 10 Mbps pinned track and two 2 Mbps tracks | 10 Mbps is allocated to the pinned track; there is only enough bandwidth left for one additional 2 Mbps track. |
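The allocation rule behind these examples can be sketched as a small helper. This is only an illustration of the arithmetic for the simulcast case (where the remaining tracks split the leftover budget evenly); the function name and shape are our own, as the actual allocation happens server-side and is not exposed by the SDK:

```javascript
// Illustrative sketch of the allocation rule: the pinned track is served
// first (and may consume the whole cap on its own), then the remaining
// budget is split evenly across the other simulcast tracks.
const AGGREGATE_LIMIT_MBPS = 12

function allocateBandwidth(pinnedMbps, otherTrackCount) {
  // The pinned source always gets its full bitrate.
  const remaining = Math.max(0, AGGREGATE_LIMIT_MBPS - pinnedMbps)
  // The non-pinned tracks share whatever budget is left.
  const perTrack = otherTrackCount > 0 ? remaining / otherTrackCount : 0
  return { pinned: pinnedMbps, perOtherTrack: perTrack }
}

// First table row: a 4 Mbps pinned track and four simulcast tracks.
console.log(allocateBandwidth(4, 4)) // { pinned: 4, perOtherTrack: 2 }
// Third table row: a 12 Mbps pinned track leaves nothing for the rest.
console.log(allocateBandwidth(12, 4)) // { pinned: 12, perOtherTrack: 0 }
```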
Project setup
For this we will need only three files:
- The initial HTML file, `index.html`:
```html
<html>
  <head>
    <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700" type="text/css">
    <script src="multi.js" type="module"></script>
    <style>
      body {
        background: #e2e1e0;
        text-align: center;
        margin: 0px;
        padding: 0px;
        color: #555;
        font-family: 'Roboto';
      }
      video {
        width: 100%;
        height: 100%;
      }
      #remoteVideos {
        display: grid;
        gap: 1rem;
        grid-template-columns: repeat(3, 1fr);
        margin: 1rem;
      }
    </style>
  </head>
  <body>
    <h1>Multiview Example</h1>
    <div id="remoteVideos"></div>
  </body>
</html>
```
- An empty JavaScript file called `multi.js`
- The Millicast SDK `js` file; for simplicity we are not going to use any JavaScript framework. This file can be downloaded here.
Your project structure should contain these three files side by side: `index.html`, `multi.js`, and the Millicast SDK file (`millicast.esm.js`).
Getting Started
Understanding the initial code
The code we provide for the `html` file contains a basic configuration:
- Some basic CSS that displays the videos in a grid
- An import of the `multi.js` file, which we will work on in the following sections
- A basic body with a `<div id="remoteVideos"></div>`, where the multiple videos will be shown
For the actual logic of the app we are going to use the empty `multi.js` file.
How Multisource works
Using the streaming service, a publisher can publish multiple sources under the same token and stream name (learn more here), and the viewer app needs to handle these multiple sources.
Whenever the publisher publishes a new source or stops a source, the signaling server emits an event through a WebSocket. The Millicast SDK handles this for you and surfaces it as a broadcastEvent.
Lastly, in order to watch multiple sources using the same View instance, we need to use a method provided by the SDK called addRemoteTrack. This method creates a new transceiver with an associated MediaStream; calling the project method with the newly created transceiver is what makes stream data flow into that MediaStream.
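The event flow described above can be sketched with a tiny dispatcher. The `{ name, data }` event shape and the `data.sourceId` field match what the SDK delivers; `handleBroadcastEvent` itself is our own illustrative helper, not an SDK function:

```javascript
// Illustrative dispatcher for SDK broadcast events. The SDK emits events
// shaped like { name, data }, where data.sourceId identifies the source
// (null/undefined for the main source).
function handleBroadcastEvent(event, handlers) {
  const { name, data } = event
  // Normalize the main source's null sourceId to a stable key.
  const sourceId = data.sourceId || 'main'
  if (name === 'active') handlers.onActive(sourceId)
  if (name === 'inactive') handlers.onInactive(sourceId)
}

// Example: track active sources in a Set, as the guide does later.
const active = new Set()
handleBroadcastEvent(
  { name: 'active', data: { sourceId: null } },
  { onActive: (id) => active.add(id), onInactive: (id) => active.delete(id) }
)
console.log([...active]) // [ 'main' ]
```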
With this app we will:
- Create a View instance and connect to the stream
- Listen for the `broadcastEvent` event
- Whenever we receive an `active` broadcast event, create a new transceiver using `addRemoteTrack`, project the source into the `MediaStream`, and create a video tag with the `MediaStream` as its `srcObject`
- Whenever we receive an `inactive` broadcast event, `unproject` the source and delete the video tag
Creating a View instance and connecting to the stream
Inside `multi.js`:
- Import the SDK and create the View instance using the accountId and streamName of the stream we are going to subscribe to:
```javascript
import { Director, View } from './millicast.esm.js';

// Config data
const accountId = "ACCOUNT_ID"
const streamName = "STREAM_NAME"

// Create a new viewer instance
const tokenGenerator = () => Director.getSubscriber(streamName, accountId)
const viewer = new View(streamName, tokenGenerator)
```
- Create two data structures: a `Set` for storing the sources we receive, and a `Map` from source id to transceiver media ids. We will use these data structures later on:
```javascript
const sources = new Set()
// This will store a mapping: sourceId => transceiver media ids
const sourceIdTransceiversMap = new Map()
```
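As a standalone sketch of how these two structures will work together in later steps (the helper name is ours, and the two structures are re-declared here so the snippet runs on its own):

```javascript
// Re-declared here so this sketch is self-contained.
const sources = new Set()
const sourceIdTransceiversMap = new Map()

// Register a source and its transceiver media ids (mids). Returns the
// normalized key so callers can reuse it, e.g. as the video element id.
function registerSource(sourceId, videoMediaId, audioMediaId) {
  const key = sourceId || 'main'
  sources.add(key)
  sourceIdTransceiversMap.set(key, { videoMediaId, audioMediaId })
  return key
}

registerSource(null, '0', '1')
console.log(sourceIdTransceiversMap.get('main')) // { videoMediaId: '0', audioMediaId: '1' }
```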
- Connect to the stream, subscribing to the `active` and `inactive` events when the page is loaded:
```javascript
document.addEventListener("DOMContentLoaded", async () => {
  try {
    await viewer.connect({
      events: ['active', 'inactive']
    });
  } catch (e) {
    console.log('Connection failed, handle error', e)
    viewer.reconnect()
  }
})
```
For now, if you run the app using `npx serve`, you will not see anything different: the viewer tries to connect to the stream, but nothing is rendered yet.
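The catch block above simply calls `viewer.reconnect()` once. In a real app you might retry with a backoff delay between attempts; a minimal sketch of one possible delay schedule (the policy and function name are our own, not something the SDK mandates):

```javascript
// Exponential backoff schedule for reconnect attempts (our own policy):
// 1s, 2s, 4s, ... capped at 30s.
function reconnectDelayMs(attempt, baseMs = 1000, capMs = 30000) {
  return Math.min(capMs, baseMs * 2 ** attempt)
}

// Usage sketch against the viewer from the guide:
// try { await viewer.connect({ events: ['active', 'inactive'] }) }
// catch (e) {
//   await new Promise(r => setTimeout(r, reconnectDelayMs(attempt)))
//   viewer.reconnect()
// }
console.log([0, 1, 2, 5].map(a => reconnectDelayMs(a))) // [ 1000, 2000, 4000, 30000 ]
```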
Listen for the active broadcast event
- Before the `document.addEventListener...` call, listen for the `broadcastEvent`:
```javascript
// Listen for broadcast events
viewer.on("broadcastEvent", (event) => {
  // Get event name and data
  const {name, data} = event
  switch (name) {
    case "active": {
      // If the sourceId is undefined it means it's the main source
      const sourceId = data.sourceId || "main";
      // We store the source id in our sources Set
      sources.add(sourceId)
      // We need to define this function; it will create a new transceiver and project the new source
      addRemoteTrackAndProject(data.sourceId)
      break;
    }
  }
})
```
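One edge case worth guarding against: a repeated `active` event for a source we are already rendering would otherwise create a duplicate transceiver and video tag. A sketch of such a guard using the `sources` Set (the helper name is ours; the Set is re-declared so the snippet is self-contained):

```javascript
// Re-declared here so this sketch is self-contained.
const sources = new Set()

// Returns true only the first time a source becomes active, so the
// caller can skip addRemoteTrackAndProject for duplicate events.
function markActive(sourceId) {
  const key = sourceId || 'main'
  if (sources.has(key)) return false
  sources.add(key)
  return true
}

console.log(markActive('camera2')) // true  -> create transceiver + video
console.log(markActive('camera2')) // false -> already rendered, skip
```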
- Define the `addRemoteTrackAndProject` function:
```javascript
const addRemoteTrackAndProject = async (sourceId) => {
  // Create a MediaStream and the transceivers
  const mediaStream = new MediaStream()
  const videoTransceiver = await viewer.addRemoteTrack("video", [mediaStream])
  // Optionally we can also add audio
  const audioTransceiver = await viewer.addRemoteTrack("audio", [mediaStream])
  // Add the sourceId -> transceiver pair to the Map
  sourceIdTransceiversMap.set(sourceId || "main", { videoMediaId: videoTransceiver.mid, audioMediaId: audioTransceiver.mid })
  // We need to define this function; it will render a new video tag into the HTML using the mediaStream as a srcObject
  createVideoElement(mediaStream, sourceId)
  // Finally we project the new source into the transceivers
  await viewer.project(sourceId, [{
    trackId: "video",
    mediaId: videoTransceiver.mid,
    media: "video"
  },
  // Optionally we also project audio
  {
    trackId: "audio",
    mediaId: audioTransceiver.mid,
    media: "audio"
  }])
}
```
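The mapping array passed to `viewer.project` can also be built with a small helper, which makes the audio-is-optional choice explicit. The `{ trackId, mediaId, media }` entry shape is the one used in the snippet above; the helper itself is our own sketch:

```javascript
// Build the projection mapping for viewer.project from transceiver
// media ids (mids). Audio is optional, mirroring the guide's snippet.
function buildProjectionMapping(videoMid, audioMid) {
  const mapping = [{ trackId: 'video', mediaId: videoMid, media: 'video' }]
  if (audioMid != null) {
    mapping.push({ trackId: 'audio', mediaId: audioMid, media: 'audio' })
  }
  return mapping
}

console.log(buildProjectionMapping('2', '3').length) // 2 (video + audio)
console.log(buildProjectionMapping('2').length)      // 1 (video only)
```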
- Finally, implement the last function we need, `createVideoElement`:
```javascript
const createVideoElement = (mediaStream, sourceId) => {
  const video = document.createElement("video")
  // remoteVideos is already created in the HTML
  const remoteVideos = document.getElementById('remoteVideos')
  video.id = sourceId || "main"
  video.srcObject = mediaStream
  video.autoplay = true
  // We mute the video so autoplay always works; this can be removed (https://developer.chrome.com/blog/autoplay/#new-behaviors)
  video.muted = true
  remoteVideos.appendChild(video)
}
```
Now if we run the app we can see the active sources, and whenever a new source is published a new video appears on the screen!
However, whenever a source is stopped, its video doesn't disappear. We still need to implement the `inactive` case of the `broadcastEvent` listener we created.
Listen for the inactive broadcast event
Lastly, we need to remove the videos whenever a source stops. We can do this by implementing the `inactive` case:
- In the `broadcastEvent` listener we created (`viewer.on("broadcastEvent"...`), handle the `inactive` case:
```javascript
//...
switch (name) {
  case "active": {
    //...
  }
  case "inactive": {
    const sourceId = data.sourceId || "main"
    // Delete the source id from the sources Set
    sources.delete(sourceId)
    // We need to define this function; it will unproject the source and remove the associated video tag
    unprojectAndRemoveVideo(sourceId)
    break;
  }
}
```
- Implement the `unprojectAndRemoveVideo` function:
```javascript
const unprojectAndRemoveVideo = async (sourceId) => {
  // Get the transceivers associated with the source id
  const sourceTransceivers = sourceIdTransceiversMap.get(sourceId)
  // Unproject the sources of the transceivers
  await viewer.unproject([sourceTransceivers.videoMediaId, sourceTransceivers.audioMediaId])
  // Delete the video from the DOM
  const video = document.getElementById(sourceId)
  document.getElementById("remoteVideos").removeChild(video)
}
```
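The bookkeeping side of the teardown can likewise be sketched as a pure helper over the Map: it returns the mids that should be passed to `viewer.unproject` and forgets the source. The helper name is ours, and the Map is re-declared with sample data so the snippet runs on its own:

```javascript
// Re-declared with sample data so this sketch is self-contained.
const sourceIdTransceiversMap = new Map([
  ['camera2', { videoMediaId: '2', audioMediaId: '3' }]
])

// Remove a source's entry and return the media ids that should be
// unprojected; returns [] if the source was never registered.
function forgetSource(sourceId) {
  const entry = sourceIdTransceiversMap.get(sourceId)
  if (!entry) return []
  sourceIdTransceiversMap.delete(sourceId)
  return [entry.videoMediaId, entry.audioMediaId]
}

console.log(forgetSource('camera2')) // [ '2', '3' ]
console.log(sourceIdTransceiversMap.size) // 0
```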
Final Result
Now we have a fully working multi-view app using the Millicast SDK. Since we used only plain JavaScript, you can adapt this example to any of your favorite JavaScript frameworks.
- The full code can be found here