Multi-view lets you ingest and render multiple real-time video and audio streams simultaneously inside a browser or a native mobile application. Once rendered, you can switch seamlessly between streams, allowing you to control how you view the content. By giving viewers control over the content, broadcasters can enable real-time experiences and engagement that leave viewers wanting more.

To create a multi-view experience, you must capture multiple video or audio feeds and then broadcast them as a multisource stream. Once you are broadcasting a multisource stream, you can view it using the Millicast viewer app or by building your own multi-view application.
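If you are publishing with the Millicast JavaScript SDK, the multisource side amounts to connecting several publishers with the same stream name but distinct sourceId values. The following is a minimal sketch using the SDK's Director and Publish APIs; the publishing token, stream name, and sourceId are placeholders:

import { Director, Publish } from '@millicast/sdk'

// Placeholder credentials; use your own publishing token and stream name
const tokenGenerator = () => Director.getPublisher('PUBLISHING_TOKEN', 'STREAM_NAME')
const publisher = new Publish('STREAM_NAME', tokenGenerator)

const startPublishing = async () => {
    // Capture a local camera and microphone feed
    const mediaStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    // Each concurrent publisher uses the same stream name but a distinct sourceId
    await publisher.connect({ mediaStream, sourceId: 'camera-1' })
}

startPublishing()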

Multi-view with the Viewer

Once you have created a multisource stream, you can open the stream viewer from the dashboard or by navigating to: [YOUR_ACCOUNT_ID]/[YOUR_STREAM_NAME]

Once you join, the bottom right gear icon flashes a notification prompting you to enable multi-view. Enable it to begin viewing the streams.

Alternatively, you can force the viewer to open in multi-view by including the &multisource=true flag in the URL: [YOUR_ACCOUNT_ID]/[YOUR_STREAM_NAME]&multisource=true

Creating a Multi-view Web Application

Project setup

For this project, you need only three files:

  • This initial HTML file index.html:
    <!DOCTYPE html>
    <html>
      <head>
        <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700" type="text/css">
        <script src="multi.js" type="module"></script>
        <style>
          body {
            background: #e2e1e0;
            text-align: center;
            margin: 0px;
            padding: 0px;
            color: #555;
            font-family: 'Roboto';
          }
          video {
            width: 100%;
            height: 100%;
          }
          #remoteVideos {
            display: grid;
            gap: 1rem;
            grid-template-columns: repeat(3, 1fr);
            margin: 1rem;
          }
        </style>
      </head>
      <body>
        <h1>Multiview Example</h1>
        <div id="remoteVideos"></div>
      </body>
    </html>
  • An empty JavaScript file called multi.js
  • The Millicast SDK JavaScript file, saved as millicast.esm.js (the name used by the import in multi.js); for simplicity, no JavaScript framework is used. This file can be downloaded here.
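Alternatively, if your project already uses a package manager, the SDK is also published to npm (as @millicast/sdk at the time of writing); after installing it with npm install @millicast/sdk, the import in the following sections would change to:

import { Director, View } from '@millicast/sdk'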

Your project structure should look like this:
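.
├── index.html
├── millicast.esm.js
└── multi.js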

Getting Started

Understanding the initial code

The code provided for the HTML file contains a basic configuration:

  • It contains some basic CSS that displays the videos in a grid.
  • It imports the multi.js file, which is built out in the following sections.
  • It has a basic body with a <div id="remoteVideos"></div>, where the multiple videos will be shown.

For the actual logic of the app, the empty multi.js file is used.

How Multisource works

Using the streaming service, a publisher can publish multiple sources under the same token and stream name (learn more here), and a viewer app can handle these multiple sources.

Whenever the publisher publishes a new source or stops a source, the signaling server emits an event through a WebSocket. The Millicast SDK handles all of this and surfaces it as a broadcastEvent.

In order to watch multiple sources using the same Viewer instance, use the addRemoteTrack method provided by the SDK. This method creates a new transceiver with an associated MediaStream; calling the project method with the newly created transceiver's media ID is what fills that MediaStream with a source's stream data.

In this app, you create a View instance, connect to the stream, and listen for the broadcastEvent event.

Whenever an active broadcast event is received, a new transceiver is created using addRemoteTrack, the source is projected into the MediaStream, and a video tag is created with the MediaStream as a srcObject.

Whenever an inactive broadcast event is received, the source is unprojected and the video tag is deleted.
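Putting that together, the event flow the following sections implement looks roughly like this (addRemoteTrackAndProject and unprojectAndRemoveVideo are the helper functions defined later in this guide):

viewer.on('broadcastEvent', async ({ name, data }) => {
    const sourceId = data.sourceId || 'main'
    if (name === 'active') {
        // New source: create transceivers, project the source, and render a video tag
        await addRemoteTrackAndProject(data.sourceId)
    } else if (name === 'inactive') {
        // Source stopped: unproject it and remove its video tag
        await unprojectAndRemoveVideo(sourceId)
    }
})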

Creating a View instance and connecting to the stream

Inside multi.js:

  1. Import the SDK and create the View instance using the accountId and streamName:
import { Director, View } from './millicast.esm.js';
// Config data
const accountId     = "ACCOUNT_ID"
const streamName    = "STREAM_NAME"
// Create a new viewer instance
const tokenGenerator = () => Director.getSubscriber(streamName, accountId)
const viewer = new View(streamName, tokenGenerator)
  2. Create two data structures: a Set for storing the sources received, and a Map from each source ID to its transceiver media IDs. These data structures will be used later.
const sources = new Set()
// This will store a mapping: sourceId => transceiver media ids
const sourceIdTransceiversMap = new Map()
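// Illustrative example: after a source "cam2" is added, the Map could hold
// an entry like "cam2" => { videoMediaId: "2", audioMediaId: "3" }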
  3. Connect to the stream, subscribing to the active and inactive events when the page is loaded:
document.addEventListener("DOMContentLoaded", async () => {
    try {
        await viewer.connect({
            events: ['active', 'inactive']
        })
    } catch (e) {
        console.log('Connection failed, handle error', e)
    }
})

At this point, if you run the app using npx serve, you will not see anything different: the viewer connects to the stream, but nothing is rendered yet.

Listen for the active broadcast event

  1. Before the document.addEventListener call, listen for the broadcastEvent:
// Listen for broadcast events
viewer.on("broadcastEvent", (event) => {
    // Get event name and data
    const {name, data} = event
    switch (name) {
        case "active": {
            // If the sourceId is undefined it means it's the main source
            const sourceId = data.sourceId || "main";
            // We store the source id in our sources Set
            sources.add(sourceId)
            // We need to define this function; it will create a new transceiver and project the new source
            addRemoteTrackAndProject(data.sourceId)
            break
        }
    }
})
  2. Define the addRemoteTrackAndProject function:
const addRemoteTrackAndProject = async (sourceId) => {
    // Create a MediaStream and the transceivers
    const mediaStream = new MediaStream()
    const videoTransceiver = await viewer.addRemoteTrack("video", [mediaStream])
    // Optionally we can also add audio
    const audioTransceiver = await viewer.addRemoteTrack("audio", [mediaStream])
    // Add the sourceId -> transceiver media IDs pair to the Map
    sourceIdTransceiversMap.set(sourceId || "main", { videoMediaId: videoTransceiver.mid, audioMediaId: audioTransceiver.mid })
    // We need to define this function; it will render a new video tag into the HTML using the mediaStream as a srcObject
    createVideoElement(mediaStream, sourceId)
    // Finally we project the new source into the transceivers
    await viewer.project(sourceId, [{
        trackId: "video",
        mediaId: videoTransceiver.mid,
        media: "video"
    }, {
        // Optionally we also project audio
        trackId: "audio",
        mediaId: audioTransceiver.mid,
        media: "audio"
    }])
}
  3. Implement the createVideoElement function:
const createVideoElement = (mediaStream, sourceId) => {
    const video = document.createElement("video")
    // remoteVideos is already created in the HTML
    const remoteVideos = document.getElementById('remoteVideos')
    // Use the source id as the element id so we can find the video later
    video.id = sourceId || "main"
    video.srcObject = mediaStream
    video.autoplay = true
    // We mute the video so autoplay always works (this can be removed)
    video.muted = true
    remoteVideos.appendChild(video)
}

When the app is running, the active sources are displayed. When a new source is published, a new video appears on the screen.

However, when a source is stopped, the video does not disappear. You must implement the inactive case of the broadcastEvent listener previously created.

Listen for the inactive broadcast event

You must remove the videos whenever a source stops by implementing the inactive case:

  1. In the broadcastEvent listener created earlier (viewer.on("broadcastEvent"...), handle the inactive case:
switch (name) {
    case "active":
        // ... (handled above)
        break
    case "inactive": {
        const sourceId = data.sourceId || "main"
        // Delete the source id from the sources Set
        sources.delete(sourceId)
        // unprojectAndRemoveVideo (defined next) unprojects the source and removes its video tag
        unprojectAndRemoveVideo(sourceId)
        break
    }
}
  2. Implement the unprojectAndRemoveVideo function:
const unprojectAndRemoveVideo = async (sourceId) => {
    // Get the transceivers associated with the source id
    const sourceTransceivers = sourceIdTransceiversMap.get(sourceId)
    // Unproject the sources of the transceivers
    await viewer.unproject([sourceTransceivers.videoMediaId, sourceTransceivers.audioMediaId])
    // Delete the video from the DOM
    const video = document.getElementById(sourceId)
    video.remove()
}

Final Result

Now you have a fully working multi-view app using the Millicast SDK. Since only plain JavaScript was used, you can adapt this example to any of your favorite JavaScript frameworks.

  • The full code can be found here

Limitations of Multi-view

Real-time Streaming does not limit the number of tracks that a viewer can receive; however, it limits the aggregate bitrate of all tracks to 12 Mbps. The pinned source is prioritized and allowed to exceed the 12 Mbps limit, and the other tracks share any remaining available bandwidth. The source with a null sourceId is pinned by default. You can change the pinned source by using the pinnedSourceId attribute in the View.connect command. You should configure the Simulcast/SVC bitrate of each source so that a viewer can receive the desired number of video tracks in the viewer session while remaining under the aggregate bitrate limit.

Example bandwidth allocations:

  • A 4 Mbps pinned track and four simulcast tracks: 4 Mbps is allocated to the pinned track, and the other simulcast tracks receive 2 Mbps each.
  • A 4 Mbps pinned track and two 2 Mbps tracks: the overall bitrate is under the 12 Mbps limit, so every track receives its full bitrate.
  • A 12 Mbps pinned track and four simulcast tracks: 12 Mbps is allocated to the pinned track, and the other tracks receive no bandwidth.
  • A 10 Mbps pinned track and two 2 Mbps tracks: 10 Mbps is allocated to the pinned track, and there is only space for one additional track.
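For example, a viewer that wants a specific source prioritized can pass pinnedSourceId when connecting. This is a sketch based on the connect call used earlier in this guide, where 'camera-1' is a placeholder sourceId:

await viewer.connect({
    events: ['active', 'inactive'],
    // Prioritize this source; it may exceed the shared 12 Mbps budget
    pinnedSourceId: 'camera-1'
})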