The JavaScript SDK lets you connect, capture, publish, and subscribe to streams using the Dolby.io Streaming Platform.

Getting started

Importing the library

You can import the SDK through the JSDelivr CDN or via npm.

Import as script

Add this tag to your document's <head>:

<script src="https://cdn.jsdelivr.net/npm/@millicast/sdk/dist/millicast.umd.min.js"></script>

📘

For the latest version of the SDK and the URL to use, check the JSDelivr CDN.

Besides UMD, you can also import the SDK in other formats, such as ESM.

When the page loads and you import the script as UMD, the SDK modules become available through the window.millicast variable.

Example of getting browser capabilities in UMD format:

const capabilities = window.millicast.PeerConnection.getCapabilities('video');
console.log(capabilities);

If you choose to import the script as ESM, you can use the modules directly with the import statement.

Example of getting browser capabilities in ESM format:

import { PeerConnection } from 'https://cdn.jsdelivr.net/npm/@millicast/sdk/dist/millicast.esm.js';

const capabilities = PeerConnection.getCapabilities('video');
console.log(capabilities);

Install as a dependency

Alternatively, you can install the package using the npm package manager:

npm i --save @millicast/sdk

Then use it with the import statement.

Example of getting browser capabilities using the SDK module:

import { PeerConnection } from '@millicast/sdk';

const capabilities = PeerConnection.getCapabilities('video');
console.log(capabilities);

Publish a stream

📘

You will need to find or create a stream name with a token in your Dolby.io dashboard. You can do that by following this link.

The main module to publish a stream is the Publish module.

Instantiate the Publish module

To create a new publisher, you have to instantiate the Publish class:

import { Director, Publish } from '@millicast/sdk';

const tokenGenerator = () => Director.getPublisher({
  token: 'my-publishing-token',
  streamName: 'my-stream-name',
});
const publisher = new Publish('my-stream-name', tokenGenerator);

In order to create an instance of the Publish class, you have to pass two parameters.

  • The first is the stream name ('my-stream-name'), which sets where you want to publish your stream.
  • The second is a callback (tokenGenerator) that retrieves the connection data for the stream you want to publish to. It is also used to reconnect automatically if you lose the connection while streaming. Inside the callback you have to set the stream name and the publishing token.

More information here:

Get your media information

You should get the media (camera/microphone) you want to stream with. To do that, the SDK uses a MediaStream object.

In this example, we get the default camera and microphone using the browser's getUserMedia() API:

const mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

This prompts the user for permission to access the camera and microphone and, if the user accepts, stores the resulting media in the mediaStream variable.
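
If you want to stream from a specific device rather than the defaults, you can combine getUserMedia() with the standard enumerateDevices() browser API. The device label below ('My External Camera') is a hypothetical placeholder, and note that labels are only populated after the user has granted media permissions. A minimal sketch:

// List the available capture devices and look for a specific camera.
// 'My External Camera' is an illustrative device label.
const devices = await navigator.mediaDevices.enumerateDevices();
const camera = devices.find(
  (device) => device.kind === 'videoinput' && device.label === 'My External Camera'
);

// Request that camera by deviceId, falling back to the default camera if it is not found
const mediaStream = await navigator.mediaDevices.getUserMedia({
  audio: true,
  video: camera ? { deviceId: { exact: camera.deviceId } } : true,
});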

Initialize the stream

After all previous steps are completed, you can now initialize the stream using the connect() method inside the Publish module.

Using the publisher instance and mediaStream, call the connect() method:

// Publishing Options
const publishOptions = {
  mediaStream: mediaStream,
};

// Start publishing a stream
try {
  await publisher.connect(publishOptions);
} catch (e) {
  console.error('Connection failed, handle error', e);
}

Once the connect() promise resolves, the stream is initialized correctly; otherwise it throws an error.

The connect() method can optionally receive more options, all of which are described in the connect() method documentation.

For example:

  • If you want to start your stream with a bitrate limit, you can use the bandwidth option.
  • If your stream token in Dolby.io Real-time Streaming has recording enabled, you can enable it with the record option. Once you have finished your stream, you can find the recording in the Recordings section of the Dashboard.
  • You can start a stream without audio or video by setting the disableAudio or disableVideo option, respectively.
  • You can select which codec you want to stream with using the codec option. To find out which codecs your browser supports, use getCapabilities(). A sketch combining several of these options follows the capabilities output below.
import { PeerConnection } from '@millicast/sdk';

const capabilities = PeerConnection.getCapabilities('video');
console.log(capabilities);

// Output:
// [
//     {
//         "codec": "vp8",
//         "mimeType": "video/VP8",
//     },
//     {
//         "codec": "vp9",
//         "mimeType": "video/VP9",
//     },
//     {
//         "codec": "h264",
//         "mimeType": "video/H264"
//     },
//     {
//         "codec": "av1",
//         "mimeType": "video/AV1",
//     }
// ]
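
Putting several of these options together, a connect() call might look like the following sketch. The option names come from the list above; the values (the bitrate limit and codec choice) are only illustrative.

// Publishing options combining several of the options described above (illustrative values)
const publishOptions = {
  mediaStream: mediaStream,
  bandwidth: 2000,      // Limit the stream to 2000 kbps (0 means no limit)
  record: true,         // Requires a publishing token with recording enabled
  disableAudio: false,  // Set to true to publish without audio
  codec: 'h264',        // Should be one of the codecs reported by getCapabilities()
};

try {
  await publisher.connect(publishOptions);
} catch (e) {
  console.error('Connection failed, handle error', e);
}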

More information here:

Managing your active stream

📘

The next samples use the publisher variable we created in the previous steps. Please follow the tutorial.

The Publish instance internally uses most of the classes described in the documentation.
If a method is not static (a static method can be accessed without a Publish instance), you access it through the Publish instance.

Example:

publisher.signaling // Corresponds to the Signaling module.
publisher.webRTCPeer // Corresponds to the PeerConnection module.

Most of the important methods are located directly in the Publish module, but some are not, such as updating the bitrate or getting the WebRTC stats.

Change bitrate

During an active stream, you can limit the maximum bitrate in kbps.
To do that, use the updateBitrate() method of your Publish instance, which is located on the webRTCPeer attribute.

Example:

await publisher.webRTCPeer.updateBitrate(2000); // Set the active stream with 2000 kbps limit.
await publisher.webRTCPeer.updateBitrate(0); // Set the active stream with no limit.

Stop stream

If you want to stop an active stream, you can use the stop() method of your Publish instance.

Example:

publisher.stop();

Screensharing

You can enable screen sharing from a browser as part of the broadcast. If you want to share your screen and your webcam at the same time, you must enable multisource, since you are broadcasting multiple streams (a sketch of this is shown at the end of this section). Alternatively, you can share your screen using OBS.

For screen sharing browser support, see this article on Browser Compatibility for Media Devices.

To enable screensharing, first you must define displayMediaOptions:

const displayMediaOptions = {
  video: {
    displaySurface: 'window',
  },
  audio: false,
};

Next, call the getDisplayMedia function. For more information, see getDisplayMedia.

const screenCapture = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions);

Set screenCapture as the mediaStream in the broadcast options:

const broadcastOptions = {
  mediaStream: screenCapture,
};

The following example creates a publisher object with the token and stream name, and then connects using the publisher object and broadcastOptions.

import { Director, Publish } from '@millicast/sdk';

const tokenGenerator = () => Director.getPublisher({
  token: 'Publishing Token',
  streamName: 'Stream Name',
});

const publisher = new Publish('Stream Name', tokenGenerator);

try {
  await publisher.connect(broadcastOptions);
} catch (e) {
  console.error('Connection failed, handle error', e);
}
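
To broadcast your webcam and your screen at the same time (the multisource case mentioned at the start of this section), each source needs its own publishing connection identified by a source ID. The sketch below assumes multisource is enabled on your publishing token and that the connect() option for the source identifier is sourceId (check the connect() documentation); the 'camera' and 'screen' values are illustrative.

import { Director, Publish } from '@millicast/sdk';

const tokenGenerator = () => Director.getPublisher({
  token: 'Publishing Token',
  streamName: 'Stream Name',
});

// One Publish connection per source, each identified by its own sourceId
const cameraPublisher = new Publish('Stream Name', tokenGenerator);
const screenPublisher = new Publish('Stream Name', tokenGenerator);

const cameraStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const screenStream = await navigator.mediaDevices.getDisplayMedia(displayMediaOptions); // defined earlier in this section

try {
  await cameraPublisher.connect({ mediaStream: cameraStream, sourceId: 'camera' });
  await screenPublisher.connect({ mediaStream: screenStream, sourceId: 'screen' });
} catch (e) {
  console.error('Connection failed, handle error', e);
}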

Viewer

📘

In order to connect to a stream, you'll need the account ID and stream name of the broadcast. This information must be provided by the publish owner. For more information, see the Publishing API.

The main module to view a stream is the View module.

Instantiate the View module

Creating a new View instance:

import { Director, View } from '@millicast/sdk';

// Create callback to generate a new token
const tokenGenerator = () => Director.getSubscriber({
  streamName: 'publish-stream-name',
  streamAccountId: 'publish-account-id',
  // Optional: this token is only needed when subscribing to a secure stream
  // and should be provided by the publish owner.
  subscriberToken: 'subscriber-token',
});

// Create Millicast instance
const millicastView = new View('publish-stream-name', tokenGenerator);

Like when creating a Publish instance, you need two parameters to create a View instance, and there are also two optional parameters.

  • The first is the stream name ('publish-stream-name'), which sets the stream you want to connect to.
  • The second is a callback (tokenGenerator) that retrieves the connection data for the stream you want to connect to. It is also used to reconnect automatically if you lose the connection while viewing. Inside the callback you have to set the stream name, the account ID and, for secure streams, the subscriber token.
  • (Optional) mediaElement: the HTML media element where you want to mount the stream, for example a video element (see the sketch after this list).
  • (Optional) autoReconnect: the default value is true, which enables automatic reconnection to the stream.
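
As a sketch of the optional parameters, you could mount the stream directly on an existing video element and keep auto reconnect enabled, assuming the constructor takes the parameters in the order listed above (the element id 'my-video' is illustrative):

import { Director, View } from '@millicast/sdk';

const tokenGenerator = () => Director.getSubscriber({
  streamName: 'publish-stream-name',
  streamAccountId: 'publish-account-id',
});

// Mount the incoming stream on an existing <video> element and keep autoReconnect enabled
const videoElement = document.getElementById('my-video'); // illustrative element id
const millicastView = new View('publish-stream-name', tokenGenerator, videoElement, true);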

More information here:

Track event

If the mediaElement parameter is not specified when creating a View instance, a track event containing the media track is emitted when the stream starts.

millicastView.on('track', (event) => {
  addStreamToYourVideoTag(event.streams[0]); // Manage the track event.
});
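
The addStreamToYourVideoTag() call above is a placeholder. A minimal way to implement it, attaching the incoming MediaStream to a standard video element (the element id 'my-video' is illustrative), could be:

// Hypothetical implementation of addStreamToYourVideoTag():
// attach the incoming MediaStream to a <video> element and start playback.
function addStreamToYourVideoTag(stream) {
  const videoElement = document.getElementById('my-video'); // illustrative element id
  videoElement.srcObject = stream;
  videoElement.autoplay = true;
  videoElement.play();
}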

Connecting to a stream

After the previous steps are done, you can connect to the stream using the connect() method of the View instance.

try {
  // `options` is optional; see the connect() method and the Broadcast events section below
  await millicastView.connect(options);
} catch (e) {
  console.error('Connection failed, handle error', e);
  await millicastView.reconnect();
}

We recommend calling the reconnect() method in the catch clause so the View instance keeps trying to reconnect until the stream is live, even when connect() fails.

Reconnect

Both Publish and View instances emit a reconnect event. This event is emitted when the connection is lost and provides a timeout and an error message in the callback. It is emitted by the reconnect() method that both Publish and View instances have; this method is called automatically by default unless disabled with the autoReconnect flag of the constructor. See the SDK reconnect event for more details.

Example reconnect event in View instance:

import { View } from '@millicast/sdk';

const millicastView = new View(streamName, tokenGenerator);

millicastView.on('reconnect', ({timeout, error}) => {
  console.log(timeout);
  console.error(error);
});

  • The timeout is the time, in milliseconds, after which the next connection attempt will be made. It starts at 2000 ms and doubles on each retry (4000 ms, 8000 ms, ...) up to 32000 ms (32 s); after that it keeps retrying every 32 s.
  • The error message contains the cause of the failure.

More information here:

Logger

You can get logs of the connection, SDP, errors, and more. Logging is enabled through the Logger module and can be activated at any time (before or after a new connection).

import { Logger } from '@millicast/sdk';

Logger.setLevel(Logger.DEBUG); // Set level visibility to DEBUG and prior

The Logger is always recording, even if output is disabled with setLevel().
You can access the latest log history using the getHistory() method.

Logger.getHistory();

// Output
// [
//   "[Director] 2021-04-05T14:09:26.625Z - Getting publisher connection data for stream name:  1xxx2",
//   "[Director] 2021-04-05T14:09:27.064Z - Getting publisher response",
//   "[Publish]  2021-04-05T14:09:27.066Z - Broadcasting"
// ]

More information and examples here:

WebRTC stats

Stream stats can be accessed by both Publish and View instances. In order to capture them, you have to initialize the stats and then listen for the stats event through the webRTCPeer attribute.

🚧

Stats might differ between web browsers.

Example of usage:

import { View } from '@millicast/sdk';

// Initialize and connect your Viewer
const millicastView = new View(streamName, tokenGenerator);
await millicastView.connect();

// Initialize stats
millicastView.webRTCPeer.initStats();

// Capture new stats from event every second
millicastView.webRTCPeer.on('stats', (stats) => {
  console.log('Stats from event: ', stats);
});

import { Publish } from '@millicast/sdk';

// Initialize and connect your Publisher
const millicastPublish = new Publish(streamName, tokenGenerator);
await millicastPublish.connect();

// Initialize stats
millicastPublish.webRTCPeer.initStats();

// Capture new stats from event every second
millicastPublish.webRTCPeer.on('stats', (stats) => {
  console.log('Stats from event: ', stats);
});

We highly recommend stopping the stats when they're not being used.

Example of stopping stats:

millicastView.webRTCPeer.stopStats();
millicastView.webRTCPeer.removeAllListeners('stats'); // Removes listeners if it has any

More information here:

Broadcast events

In the connect() method, you can send the events that you want to subscribe to. Once the connection is established, the SDK starts to emit a broadcastEvent event carrying the data of each subscribed event:

await millicastView.connect({
  events: [
    'active',
    'inactive',
    'stopped',
    'vad',
    'layers',
    'migrate',
    'viewercount',
  ]
});

  • active: Fires when the live stream is starting or has started broadcasting.
  • inactive: Fires when the stream has stopped broadcasting but is still available.
  • stopped: Fires when the live stream has been disconnected and is no longer available.
  • vad: Fires when the live stream is using multiplexed tracks for audio.
  • layers: Fires when there is an update to the state of the layers in the live stream. The live stream has to be broadcasting with simulcast.
  • migrate: Fires when the server is having problems, is shutting down, or when viewers need to move for load-balancing purposes.
  • viewercount: Fires when the number of viewers of the published stream changes.

Managing online/offline status

The broadcast events can be used to detect when a stream is live or offline. There are two approaches, depending on whether or not you are using multisource.

  • If you are publishing without multisource, an active broadcast event means that the stream is live, and an inactive event means that the stream is offline.

  • If it is a multisource stream, you receive the same events, but with a source ID in the data payload. Receiving an active event means that a new source ID is available and the stream is active.
    To detect that the stream has become inactive, wait for an inactive event and remove the incoming source ID from your set of active ones.
    If you have no active source IDs left, the stream is offline.

More information here:

Example of usage (covering both cases):

const view = new View(streamName, tokenGenerator, video);

const activeSources = new Set();

view.on('broadcastEvent', (event) => {
  const {name, data} = event;
  switch (name) {
    // There is a new active source
    case 'active':
      activeSources.add(data.sourceId);
      console.log('Active Stream.');
      break;
    // A source became inactive
    case 'inactive':
      activeSources.delete(data.sourceId);
      if (activeSources.size === 0) {
        console.log('No active Stream.');
      }
      break;
    default:
      break;
  }
});

try {
  await view.connect();
} catch (e) {
  console.error('Connection failed, handle error', e);
}

User count

The broadcastEvent also lets you track the number of active viewers in the stream. When you receive an event named 'viewercount', you can read the updated viewer count from its data.

Example of usage:

view.on('broadcastEvent', (event) => {
  const {name, data} = event;
  switch (name) {
    // ...
    case 'viewercount':
      updateViewerCount(data.viewercount);
      break;
    default:
      break;
  }
});
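
The updateViewerCount() call above is a placeholder. As an illustration, it could simply write the count to the page (the element id 'viewer-count' is hypothetical):

// Hypothetical implementation of updateViewerCount(): show the viewer count in the page.
function updateViewerCount(count) {
  document.getElementById('viewer-count').textContent = `Viewers: ${count}`;
}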