In this guide, we demonstrate how to play livestreams in your application.
Using the UI Kit Player
The example below shows how to use the UI Kit Player to
play a livestream.
Play Video
This guide assumes you have configured a Livepeer JS SDK client with an API key. We use the Player with a playbackId, which we
created previously when creating a livestream.
DemoPlayer.tsx
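A minimal sketch of such a component, assuming the `Player` primitives and `getSrc` helper exposed by the Livepeer UI Kit (`@livepeer/react`) and the `livepeer` TypeScript SDK; check the import paths and component names against your installed versions:

```tsx
// DemoPlayer.tsx — a sketch, not a definitive implementation.
// Import paths and component names assume the Livepeer UI Kit.
import * as Player from "@livepeer/react/player";
import { getSrc } from "@livepeer/react/external";
import { Livepeer } from "livepeer";

const livepeer = new Livepeer({ apiKey: process.env.LIVEPEER_API_KEY });

// Fetch playback info (server-side) and convert it into player sources.
export async function getPlaybackSources(playbackId: string) {
  const playbackInfo = await livepeer.playback.get(playbackId);
  return getSrc(playbackInfo.playbackInfo);
}

export function DemoPlayer({ src }: { src: ReturnType<typeof getSrc> }) {
  return (
    <Player.Root src={src}>
      <Player.Container>
        <Player.Video title="Livestream" />
      </Player.Container>
    </Player.Root>
  );
}
```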
Using your own player
Fetch the playback URL
To play back a livestream in other players, you’ll need to fetch the playback URL(s). By default, all content has an HLS endpoint. HLS is a protocol that allows you to stream video and audio content over HTTP; much of the video you watch on the web is delivered this way, and Livepeer uses HLS to deliver video and audio content. We also support WebRTC WHEP low-latency playback; however, ecosystem player support is limited, as it is a new spec that is rapidly gaining traction. Below, we show how to fetch playback info in TypeScript using the playback info API endpoint, but we have a similar interface across all SDKs.
DemoPlayer.tsx
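A sketch of fetching playback info and picking out the playback URLs, assuming the REST endpoint `GET https://livepeer.studio/api/playback/{playbackId}` and a `meta.source` array of `{hrn, type, url}` entries in the response; the helper names here are ours, so verify the endpoint and shape against the current API reference:

```typescript
// Sketch: fetch playback info, then select HLS / WebRTC sources from it.
type PlaybackSource = { hrn: string; type: string; url: string };
type PlaybackInfo = { meta: { source: PlaybackSource[] } };

export async function fetchPlaybackInfo(
  playbackId: string,
  apiKey: string,
): Promise<PlaybackInfo> {
  const res = await fetch(`https://livepeer.studio/api/playback/${playbackId}`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`playback info request failed: ${res.status}`);
  return res.json();
}

// Pure helper: pick the first HLS and WebRTC URLs from the source list.
export function pickPlaybackUrls(info: PlaybackInfo): {
  hls?: string;
  webrtc?: string;
} {
  const find = (prefix: string) =>
    info.meta.source.find((s) => s.hrn.startsWith(prefix))?.url;
  return { hls: find("HLS"), webrtc: find("WebRTC") };
}
```

The picker is kept pure so it can be reused regardless of which SDK or HTTP client fetched the playback info.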
Please note that to play back livestreams inside your application you’ll need
to use a video player component that supports HLS or WebRTC WHEP.
Handling various playback sources
The playback info endpoint can return multiple sources in the response, as outlined above. WebRTC URLs for low-latency livestream playback must be played back with our ICE servers, which are used to route traffic in restricted networking environments. The WebRTC WHEP negotiation will send back these STUN/TURN servers in the SDP response headers, which can be used in a player. If WebRTC playback is available, the API response will include a WebRTC source alongside the HLS endpoint.
Use the playback URL in a player
You can use the playback URL with any video player that supports HLS. Here is a list of popular players that support HLS:
- Video.js
- Plyr.io
- JW Player
- Bitmovin Player
- HLS.js (requires custom logic to wire to a video element)
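For HLS.js specifically, the custom wiring is small. A browser-side sketch, assuming the `hls.js` npm package (note that Safari plays HLS natively, so check `canPlayType` first):

```typescript
// Browser-only sketch: attach an HLS playback URL to a <video> element.
import Hls from "hls.js";

export function attachHls(video: HTMLVideoElement, playbackUrl: string) {
  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari supports HLS natively; no MediaSource wiring needed.
    video.src = playbackUrl;
  } else if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(playbackUrl);
    hls.attachMedia(video);
  } else {
    throw new Error("HLS playback is not supported in this browser");
  }
}
```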
Embeddable Player
Livepeer Studio maintains an embeddable version of the Livepeer Player that is suitable for iframing. This is one of the easiest ways to play back a livestream on your website. You can embed the player with a simple iframe, replacing PLAYBACK_ID with your video’s playback ID.
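A minimal embed sketch, assuming the `lvpr.tv` embed URL used by the Livepeer Player (confirm the current embed URL in the Studio dashboard):

```html
<!-- Replace PLAYBACK_ID with your video's playback ID. -->
<iframe
  src="https://lvpr.tv?v=PLAYBACK_ID"
  allowfullscreen
  allow="autoplay; encrypted-media; picture-in-picture"
></iframe>
```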
Low Latency
In the embeddable player, livestreams will, by default, play back with low-latency WebRTC. If this does not succeed in playing back (rarely, usually due to a slow network or connectivity issues), the embeddable player will automatically fall back to HLS playback. Also, if the stream contains B-frames (or bidirectional frames, which are common for users streaming with OBS or other streaming apps), the Player will automatically fall back to HLS, so that out-of-order frames are not displayed. This only applies to users who are playing livestreams. If you do not want to use WebRTC, you can pass &lowLatency=false in the query
string, or if you want only low latency, you can pass &lowLatency=force.
Clipping
To enable clipping, &clipLength={seconds} can be passed, which will allow
viewers to clip livestreams. The clip length must be less than 120
seconds.
Constant Playback
The embed supports “constant” playback with constant=true, which means that
audio will not be distorted if the playhead falls behind the livestream. This is
usually used for music applications, where audio quality/consistency is more
important than latency.
Other Configs
You can also override the default muted and autoplay behavior with
&muted=false and/or &autoplay=false. These are set to true by default.
Looping can also be set with &loop=true.
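The query parameters above can also be combined programmatically. A small sketch, where the builder and option names are ours and the `lvpr.tv` base URL is an assumption, while the parameters themselves (lowLatency, clipLength, constant, muted, autoplay, loop) are the ones documented in this guide:

```typescript
// Sketch: build an embeddable player URL from the options in this guide.
type EmbedOptions = {
  lowLatency?: boolean | "force";
  clipLength?: number; // seconds; must be less than 120
  constant?: boolean;
  muted?: boolean; // player default is true
  autoplay?: boolean; // player default is true
  loop?: boolean;
};

export function buildEmbedUrl(playbackId: string, opts: EmbedOptions = {}): string {
  const params = new URLSearchParams({ v: playbackId });
  if (opts.lowLatency !== undefined) params.set("lowLatency", String(opts.lowLatency));
  if (opts.clipLength !== undefined) {
    if (opts.clipLength >= 120) throw new Error("clipLength must be less than 120 seconds");
    params.set("clipLength", String(opts.clipLength));
  }
  if (opts.constant !== undefined) params.set("constant", String(opts.constant));
  if (opts.muted !== undefined) params.set("muted", String(opts.muted));
  if (opts.autoplay !== undefined) params.set("autoplay", String(opts.autoplay));
  if (opts.loop !== undefined) params.set("loop", String(opts.loop));
  return `https://lvpr.tv?${params.toString()}`;
}
```

For example, `buildEmbedUrl("PLAYBACK_ID", { lowLatency: false, muted: false })` yields a URL with `&lowLatency=false&muted=false` appended.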