How-to Broadcast Using GStreamer with WHIP

GStreamer is a free, open-source multimedia framework for building media processing pipelines that support complex workflows. You may want to broadcast over WebRTC from a file on disk or from a Real Time Streaming Protocol (RTSP) source. You can originate the broadcast through GStreamer, which ingests the stream using WHIP.

See the official gstreamer.freedesktop.org documentation for installation instructions and additional support.

Get Your Dolby.io WHIP Publish URL

You will need a WHIP endpoint and Bearer token in order to broadcast. From the Dolby.io Dashboard, navigate to the Publishing tab of your token. Under the Live broadcast - Publish tokens section, retrieve the WHIP endpoint and Bearer token.
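The commands later in this guide read these values from environment variables. A minimal sketch, assuming the variable names used in the examples below (the stream name and token shown are placeholders, not real credentials):

```shell
# Placeholders only -- substitute the WHIP endpoint and Bearer token
# from your own Dolby.io dashboard publish token.
export DOLBYIO_WHIP_ENDPOINT="https://director.millicast.com/api/whip/myStreamName"
export DOLBYIO_BEARER_TOKEN="0123456789abcdef"

# Later gst-launch commands reference these variables directly.
echo "$DOLBYIO_WHIP_ENDPOINT"
```

Exporting the variables keeps credentials out of the pipeline commands themselves, so you can swap tokens without editing each example.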

Getting Started

If you haven't already, begin by following the Getting Started tutorial to create a Dolby.io application and start your first broadcast. You will need a publish token for the steps described below.

See the WHIP broadcast protocol guide for more specific details.

GStreamer

The 1.22.0 release of GStreamer (January 2023) includes WebRTC support that enables:

  • WebRTC HTTP ingest (WHIP) to a MediaServer (whipsink)
  • WebRTC HTTP egress (WHEP) from a MediaServer (whepsrc)

The whipsink element can be used to publish the end of a pipeline to a real-time stream. The following properties must be defined:

  • auth-token: should be set with your publishing bearer token
  • whip-endpoint: should be set with the Dolby.io WHIP endpoint

How-to Broadcast the Video Test Source

The videotestsrc element can be used to produce a simple test video.

To preview the test pattern locally, run:

gst-launch-1.0 videotestsrc ! autovideosink

To publish this to a real-time stream you can use a command with the following encoding settings:

gst-launch-1.0 videotestsrc ! videoconvert ! x264enc ! rtph264pay ! \
  'application/x-rtp,media=video,encoding-name=H264,payload=97,clock-rate=90000' ! \
  whip.sink_0 audiotestsrc wave=5 ! audioconvert ! opusenc ! rtpopuspay ! \
  'application/x-rtp,media=audio,encoding-name=OPUS,payload=96,clock-rate=48000,encoding-params=(string)2' ! \
  whip.sink_1 whipsink name=whip \
  auth-token=$DOLBYIO_BEARER_TOKEN \
  whip-endpoint=$DOLBYIO_WHIP_ENDPOINT

You can then watch this stream from a playback viewer.

How-to Broadcast a Media File with a Specific Codec

Instead of using the video test source, this example demonstrates reading and looping a media file from disk using the multifilesrc element. In addition, if you need a specific codec such as VP9, use the codec query parameter of the WHIP endpoint.

gst-launch-1.0 multifilesrc location=/home/Videos/test/YourFile.mp4 loop=true ! \
  qtdemux ! decodebin ! videorate ! videoconvert ! timeoverlay ! \
  video/x-raw,format='(string)I420' ! vp9enc target-bitrate=4000000 keyframe-max-dist=2 deadline=1 \
  end-usage=1 cpu-used=8 lag-in-frames=0 ! video/x-vp9,profile='(string)0' ! \
  rtpvp9pay pt=100 ssrc=2 ! whipsink name=whip \
  auth-token=$DOLBYIO_BEARER_TOKEN \
  whip-endpoint=$DOLBYIO_WHIP_ENDPOINT?codec=vp9
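Because the codec is selected with a query parameter, the full publish URL can be assembled from the base endpoint before launching the pipeline. A small sketch, assuming a placeholder endpoint value:

```shell
# Placeholder base endpoint; use the WHIP endpoint from your own dashboard.
DOLBYIO_WHIP_ENDPOINT="https://director.millicast.com/api/whip/myStreamName"

# Append the codec query parameter to request VP9 for this broadcast.
WHIP_URL="${DOLBYIO_WHIP_ENDPOINT}?codec=vp9"

echo "$WHIP_URL"   # prints https://director.millicast.com/api/whip/myStreamName?codec=vp9
```

You can then pass $WHIP_URL as the whip-endpoint property value instead of concatenating the query string inline.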

How-to Broadcast an RTSP Video Source

This example demonstrates using an RTSP source:

gst-launch-1.0 rtspsrc location="rtsp://192.168.1.168/0" latency=0 ! \
  application/x-rtp,media=video,encoding-name=H264 ! rtph264depay ! \
  rtph264pay config-interval=-1 ! whipsink name=whip \
  auth-token=$DOLBYIO_BEARER_TOKEN \
  whip-endpoint=$DOLBYIO_WHIP_ENDPOINT

Simple WHIP Client

The Simple WHIP client is an open-source implementation of a WHIP client that can be useful for testing. Building it requires the GStreamer C development libraries.

You can find more details on the GitHub project:

https://github.com/meetecho/simple-whip-client

Testing your setup

First, test your setup by sending a video test source through a WHIP GStreamer pipeline.

Note: If you have recording enabled, you must append the codec query parameter that matches your encoder to the WHIP endpoint (for example, ?codec=h264 when publishing H.264). The example below publishes VP8, so with recording enabled the endpoint would be https://director.millicast.com/api/whip/streamName?codec=vp8.

./whip-client -u https://director.millicast.com/api/whip/kxhqovek \
-t 09598571d36bc70dd59871be7a322d0fd688d05decbd619db88ced1f780987a7 \
-V "videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! 
    vp8enc deadline=1 ! rtpvp8pay pt=96 ssrc=2 ! queue !
    application/x-rtp,media=video,encoding-name=VP8,payload=96"

If everything is correct, the whip-client output indicates a successful connection, and if you connect with a playback viewer, a bouncing ball is shown.

Publishing an RTSP video-only source

Now, to connect to an RTSP source such as an Axis camera, you need to replace the GStreamer pipeline with one that connects to the camera and passes the video data to Dolby.io Real-time Streaming without transcoding the content.

./whip-client -u https://director.millicast.com/api/whip/kxhqovek \
  -t 09598571d36bc70dd59871be7a322d0fd688d05decbd619db88ced1f780987a7 \
  -V "rtspsrc location=rtsp://98.100.xxx.xxx:5545/axis-media/media.amp latency=0 name=rtsp !
      rtph264depay ! rtph264pay config-interval=-1 !
      application/x-rtp,media=video,encoding-name=H264"

Publishing an RTSP audio and video source

If your camera also supports audio, you just need to add the GStreamer audio pipeline:

./whip-client -u https://director.millicast.com/api/whip/kxhqovek \
  -t 09598571d36bc70dd59871be7a322d0fd688d05decbd619db88ced1f780987a7 \
  -A "rtsp. ! decodebin ! audioconvert ! audioresample !
      audiobuffersplit output-buffer-duration=2/50 ! queue ! opusenc !
      rtpopuspay pt=100 ssrc=1 ! queue !
      application/x-rtp,media=audio,encoding-name=OPUS,payload=100" \
  -V "rtspsrc location=rtsp://98.100.xxx.xxx:5545/axis-media/media.amp latency=0 name=rtsp !
      rtph264depay ! rtph264pay config-interval=-1 !
      application/x-rtp,media=video,encoding-name=H264"

Publishing an RTSP source with authentication

Finally, if your setup requires authentication, pass the username and password to the GStreamer RTSP plugin:

./whip-client -u https://director.millicast.com/api/whip/kxhqovek \
  -t 09598571d36bc70dd59871be7a322d0fd688d05decbd619db88ced1f780987a7 \
  -A "rtsp. ! decodebin ! audioconvert ! audioresample !
      audiobuffersplit output-buffer-duration=2/50 ! queue ! opusenc !
      rtpopuspay pt=100 ssrc=1 ! queue !
      application/x-rtp,media=audio,encoding-name=OPUS,payload=100" \
  -V "rtspsrc location=rtsp://98.100.xxx.xxx:855/live latency=0 name=rtsp
      user-id=admin user-pw=admin ! rtph264depay ! rtph264pay config-interval=-1 !
      application/x-rtp,media=video,encoding-name=H264"