- Supported Unreal Engine versions: 4.27, and 5.0.0 and later
- Supported platforms: Windows and Linux
This plugin enables publishing game audio and video content to Dolby.io Real-time Streaming. You can configure your credentials and your game logic using Unreal objects, and capture and publish from a virtual camera.
The publisher plugin supports VP8, VP9, and H.264 video encoding, where available on the platform.
You can install the plugin from the source code.
Follow these steps:
- Create a project with the UE editor.
- Close the editor.
- Go to the root of your project folder (e.g. C:\Users\User\Unreal Engine\MyProject).
- Create a new directory named "Plugins" and move into it.
- Clone the Millicast repository:
git clone https://github.com/millicast/millicast-publisher-unreal-engine-plugin.git MillicastPublisher
- Open your project with UE.
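The steps above can be sketched as shell commands; the project path is an example, adjust it to your own project root:

```shell
# Example project location; adjust to your own project root.
PROJECT_DIR="$HOME/Unreal Engine/MyProject"
# Create the Plugins directory and move into it.
mkdir -p "$PROJECT_DIR/Plugins"
cd "$PROJECT_DIR/Plugins"
# Clone the plugin into a folder named MillicastPublisher.
git clone https://github.com/millicast/millicast-publisher-unreal-engine-plugin.git MillicastPublisher
```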
The editor will then ask whether you want to rebuild the MillicastPublisher plugin; answer yes.
You are now in the editor and can build your game using MillicastPublisher.
Note: After you package your game, you may get an error when launching it:
"Plugin MillicastPublisher could not be load because module MillicastPublisher has not been found"
The game then fails to launch because Unreal excluded the plugin from the package.
If that is the case, create an empty C++ class in your project. This forces Unreal to include the plugin. Then re-package the game and launch it again; the error should be gone.
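The empty class itself can be minimal; a sketch is shown below (the file and class names are illustrative, not part of the plugin):

```cpp
// ForcePluginInclude.h -- an empty class whose only purpose is to give the
// packaged project a compiled C++ module, so that Unreal keeps the plugin.
// The name is illustrative; any empty class created through
// "Add C++ Class" in the editor works the same way.
#pragma once

class FForcePluginInclude
{
};
```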
To enable the plugin, open the plugin manager in Edit > Plugins.
Then search for MillicastPublisher; it is in the "Media" category. Tick the "Enabled" checkbox to enable the plugin. Unreal will ask you to confirm because the plugin is in beta; accept, and the editor will restart in order to load the plugin.
If it is already enabled, leave it as is.
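Enabling the plugin through the editor records it in your project's .uproject file. Equivalently, you can add the entry by hand; a minimal fragment, showing only the relevant field:

```json
{
  "Plugins": [
    {
      "Name": "MillicastPublisher",
      "Enabled": true
    }
  ]
}
```

A real .uproject also contains other fields (EngineAssociation, Modules, and so on); only the Plugins entry is shown here.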
Several Unreal objects must be configured in order to publish a stream from your game to Dolby.io Real-time Streaming.
We will first see how to create a Dolby.io Real-time Streaming source, set up your credentials, and add video/audio sources, and then how to use blueprints to implement the game logic.
The MillicastPublisherSource object allows you to configure your Dolby.io Real-time Streaming credentials and manage the video and audio sources you want to publish. To add a MillicastPublisherSource, create a new asset by clicking "Add/Import"; you will find the object in the "Media" category.
Then double-click the asset you just created to start configuring it.
First, you have the Dolby.io Real-time Streaming credentials:
- The stream name you want to publish to.
- Your publishing token.
- The source id; for now you can leave it blank, we will see later how to use it to configure multisource.
- The publish API URL, which is usually https://director.millicast.com/director/api/publish.
You can find all this information in your Dolby.io dashboard.
Below that, you can configure the video and audio sources. Each has a checkbox to enable or disable it.
The next sections explain how to configure the video and audio sources.
For video capture, you can either capture the game screen or capture from a Render Target 2D object. Capturing from a device (e.g. a webcam) is not yet supported.
In the video source section, there is a field named "RenderTarget" that lets you specify a particular render target. If you leave it set to None, the plugin creates a Slate window capturer, which is essentially a screen capture of the game.
For example, if you launch the game from the editor, you will get this kind of output:
If you launch the game itself, you can get this output. If you log some messages, you will see them in the stream.
Now, we will see how to use a specific render target. This allows you to set up virtual cameras in your game and capture the scene from different angles and positions.
To do so, start by adding a "Scene Capture 2D" actor to the level.
Select it and, in its parameters, create a "Render Target 2D".
Save the render target in your assets, then double-click it to open its settings.
You can set the size as you wish, for instance 1920x1080. Then, set the render target format to
Finally, go back to the Millicast Publisher source and set this render target.
You might encounter color issues when using a SceneCapture2D. When you publish the game content and view it outside an Unreal game (in the web viewer, for instance), the image can look a bit darker. This is because the frame is missing the gamma correction that is applied to the Unreal viewport.
To apply this gamma correction, you can use the MillicastCameraActor, which is simply a cine camera capturing through a specific viewport component. You can set the capture size and a RenderTarget2D to capture from this camera.
Capturing from the Millicast camera object or from a scene capture can consume CPU. So, if you have multiple cameras in your scene, it is a good idea to deactivate the cameras that are not used for publishing and activate only those actually capturing, to save CPU load.
You can capture either the audio from the game or audio coming from a device (e.g. microphone, audio driver).
To choose whether to capture from a submix object or from a device, use the "audio capture type" setting in the audio section, where you can choose "Submix" or "Device".
You can capture the game's audio content from a submix object. Any sounds or music in your assets will then be captured when playing the game.
You can either specify your own submix object or leave the field empty. If you leave it empty, the audio is captured from the master submix of the main audio engine.
In order to enable the audio capture, you must launch the game with the
To launch your game with this parameter, go to "Edit -> Editor Preferences". Under "Play as a standalone game", add
If you choose to capture from an audio device, you can select the device directly from the
PublisherSource menu, but this is not really recommended, as the selection would only be valid for your machine.
You can dynamically set an audio device from its id or index using the blueprints.
In your blueprint, you just have to create a public variable of type
MillicastPublisherSource. It will appear in the blueprint menu; just set it to your source asset.
In the source menu, you can see a volume multiplier field. It is intended to boost the volume of the recording, because the recorded sound might otherwise be faint.
Now that audio and video sources are all set up, let's implement the blueprint.
Add a blueprint class to your assets (choose Actor) and add it to your level. Then open it and go to the event graph.
First, click "Add Component" and add a
MillicastPublisherComponent. In its properties, set the Millicast Publisher source, then drag the component into the graph and call the
Publish function. This starts the capture and publishes to Dolby.io Real-time Streaming: it authenticates you through the director API, creates a websocket connection with Dolby.io Real-time Streaming, and finally establishes a WebRTC peer connection to publish the stream.
The publisher component emits several events:
OnAuthenticated: Called when the authentication through the director API has succeeded.
OnAuthenticatedFailure: Called when the authentication through the director API has failed.
OnPublishing: Called when you start sending media to Dolby.io Real-time Streaming.
OnPublishingError: Called when the publishing step fails, which could be an error during the websocket signaling or while setting up the WebRTC peer connection.
OnActive: This event is called when the first viewer starts subscribing to the feed.
OnInactive: This event is called when the last viewer stops subscribing to the feed.
This is an example of a blueprint that publishes when the game starts, mutes the video while the stream is inactive, unmutes it when the stream becomes active, and finally unpublishes when the game ends.
It is possible to use the multisource feature of Dolby.io Real-time Streaming to publish several audio/video sources with the same publisher. For example, you can use several virtual video cameras from the game as video sources.
To do so, add several
SceneCapture2D actors to your scene and add as many
MillicastPublisherSource assets. Configure the render target for each one, and in each
MillicastPublisherSource asset, set the
sourceId field to a unique value. Regarding the blueprint, you must add the same number of publisher components as
MillicastPublisherSource assets; the rest remains the same as publishing from a single source.