Premium and Enterprise members have the option of reducing the delay of their streams from the source to their viewers by enabling low-latency streaming in their event settings.
This feature is currently in a beta test. If you would like to be opted into the beta test, please contact support.
When you create a new event (or open an existing recurring event), you will be brought to the live settings page, with the Event tab open by default. Under this tab, you can find the Low-latency streaming toggle switch.
If you toggle it on, the delay of your stream from the source to your viewers will be reduced to less than 10 seconds (without it, latency typically ranges from 15 to 25 seconds).
- If you enable low-latency streaming, you will not be able to enable auto closed captions and vice versa.
- If your organization also uses the corporate eCDN feature on your account, it will override the low-latency setting on your event. Currently, these features cannot be used together. If you wish to disable eCDN in order to use low latency, please contact us.
If you enable low-latency streaming for a recurring event, the setting will remain on for all future streams to that event (unless you manually disable it). Low-latency streaming cannot be turned on or off mid-event; you would need to toggle the switch before you start streaming.
What contributes to latency?
Latency is the time from when the camera captures the content to when your viewers see it. Many factors contribute to it; it generally comes down to the relationship between network congestion, CPU resources, and the stream’s quality (bitrate and resolution) throughout the stream’s journey from the source to the player.
A typical stream works like this:
- Production workflow: This is your camera(s), video mixer, encoder, everything that you set up locally to produce your stream. It could be as simple as a webcam using our browser encoder or as complex as an external encoder with multiple camera inputs. The main factors here for latency are your encoder’s CPU capacity and network speed/bandwidth.
- Vimeo processes the stream: We ingest and decode the incoming stream, then re-encode it into multiple qualities and a more consumable format for viewing. The low-latency toggle does not impact this process.
- CDN sends content to the player: We use content delivery networks (CDNs) to distribute your content all over the world for playback in the Vimeo player. As part of this process, the player keeps a certain amount of content pre-loaded (or “buffered”) to ensure smooth playback should there be a network-related slowdown prior to this point. Enabling low-latency streaming reduces the amount of content the player prepares before playing it back to the viewer.
At any point in this process, there is potential for stream back-up and network slowdown, which could be outside of your or Vimeo’s control. Building up a buffer in the player allows time for the stream to recover from any network-related slowdowns while still providing smooth, high-quality playback for the viewer, even if they are a little further behind. It also allows the player to download the highest possible quality when all systems are functioning without delay.
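As a rough illustration of the buffering step above (not Vimeo’s actual player logic, and the segment durations and buffer depths here are hypothetical), player-side latency in segmented streaming scales with how much content the player keeps buffered:

```python
# Rough sketch of how player buffering contributes to latency.
# All numbers are illustrative, not Vimeo's actual values.

def player_side_latency(segment_seconds: float, segments_buffered: int) -> float:
    """Approximate latency added on the player side: the viewer watches
    the oldest buffered segment while newer ones queue behind it."""
    return segment_seconds * segments_buffered

# Default mode: longer segments, deeper buffer -> more resilience, more delay.
default_latency = player_side_latency(segment_seconds=6.0, segments_buffered=3)  # 18.0 s

# Low-latency mode: shorter segments, shallower buffer -> less delay, less margin.
low_latency = player_side_latency(segment_seconds=2.0, segments_buffered=2)      # 4.0 s

print(f"buffered: default ~{default_latency:.0f}s, low latency ~{low_latency:.0f}s")
```

This is only the player-side portion; production, ingest, and CDN delivery add their own delay on top.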
Why wouldn't I always use low latency mode?
We know our users are eager to reduce the delay of their live events to their viewers. However, enabling low latency involves a trade-off: as your stream moves from your encoder to our ingest servers and through the CDN to your viewers, it skips some of the steps we’d otherwise take to ensure a reliable, high-quality viewer experience.
Regardless of the latency setting, the player will create different qualities and pre-load video before displaying it to the viewer. The difference between default latency and low latency is how much of the video is pre-loaded and prepared (also referred to as buffered) for playback. With low latency enabled, the player prepares less content more frequently, leaving less margin for error should there be a network issue in the process.
Your viewers may also see a reduction in video quality: if a network slowdown occurs and the highest quality isn’t yet buffered, the player cannot quickly adapt to the highest possible bitrate. Keeping the default latency setting allows more time for higher-quality video to load.
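To make the quality trade-off concrete, here is a hypothetical sketch of adaptive-quality selection under a buffer constraint. The bitrates, bandwidth figures, and thresholds are invented for illustration and are not Vimeo’s player logic:

```python
# Hypothetical sketch: with a shallow buffer, a player may choose a lower
# quality that is ready now rather than risk a stall waiting for a higher one.
# Bitrates (kbps) and thresholds are illustrative only.

def pick_bitrate(bitrates_kbps, bandwidth_kbps, buffered_seconds, min_buffer=4.0):
    """Pick the highest bitrate the network can sustain; if the buffer is
    running low, step down one quality to reduce the risk of a stall."""
    sustainable = [b for b in sorted(bitrates_kbps) if b <= bandwidth_kbps * 0.8]
    if not sustainable:
        return min(bitrates_kbps)
    choice = sustainable[-1]
    if buffered_seconds < min_buffer and len(sustainable) > 1:
        choice = sustainable[-2]  # play it safe with a shallow buffer
    return choice

ladder = [800, 1500, 3000, 6000]
# Deep buffer (default latency): can afford the top sustainable quality.
print(pick_bitrate(ladder, bandwidth_kbps=5000, buffered_seconds=12.0))  # 3000
# Shallow buffer (low latency): steps down to avoid a stall.
print(pick_bitrate(ladder, bandwidth_kbps=5000, buffered_seconds=2.0))   # 1500
```

The same network conditions yield a lower quality when less content is buffered, which matches the behavior described above.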
If your content is more time-sensitive and/or requires closer to real-time interaction with your audience, that’s the ideal use case for enabling low latency. However, be mindful that if there is network congestion or CPU limitations along the way it’s more likely that your viewers will experience buffering or low-quality streaming in the player.
Events that do not need much interaction with viewers will not see any benefit from a low-latency stream, so we recommend keeping it turned off.
As always, we strongly recommend testing and rehearsing prior to your event.