Understanding Stream Latency: HLS, LL-HLS and StreamShark’s Optimized Latency

In the streaming world, latency (the delay between capturing a live event and playback on a viewer’s screen) is a critical factor for both content creators and audiences. The content format and interactivity requirements further influence operational decisions to strike a balance between reliability and interactivity, as discussed by James here not too long ago.

At StreamShark, we are constantly evolving our technology to deliver the best balance between latency, reliability and scalability. In this blog post, we’ll review how latency behaves on StreamShark’s Event platform when using RTMP input with HLS output, the benefits of Low-Latency HLS (LL-HLS), and our latest Optimized Latency solution, now available for improved real-time experiences.

Standard HLS: A Reliable but Delayed Experience

When streaming with RTMP as the input and HLS as the output, the standard HLS configuration typically introduces a 20-30 second delay. Standard HLS has been a trusted solution for many years due to its proven:

  • Reliability: Capable of delivering high-bitrate streams seamlessly across all devices and platforms.
  • Scalability: Efficiently manages massive viewership without performance degradation.
  • Security: Easily integrates with encryption and DRM solutions.

However, for use cases requiring low latency or near real-time interaction, such as live sports, auctions, or interactive broadcasts with features like polling and Q&A, this delay may hinder the overall experience and responsiveness.
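
To make these numbers concrete, here is a rough back-of-envelope model of where standard HLS latency comes from. The component values are illustrative assumptions, not measurements of StreamShark’s pipeline; the point is that the segment duration, multiplied by the few segments a player buffers before starting, dominates the total.

```python
# Rough, illustrative model of glass-to-glass latency for standard HLS.
# All component values are assumptions for the sake of the example.

def hls_latency(segment_seconds, segments_buffered,
                encode_and_ingest=2.0, cdn_and_network=2.0):
    """Estimate end-to-end latency in seconds.

    A segment can only be published once it has been fully produced, and
    players typically buffer a few segments before starting playback, so
    the segment duration dominates the total.
    """
    return encode_and_ingest + segment_seconds * segments_buffered + cdn_and_network

# Standard HLS: 6-second segments, player buffers roughly 3-4 segments.
print(hls_latency(segment_seconds=6, segments_buffered=3))  # ~22 s
print(hls_latency(segment_seconds=6, segments_buffered=4))  # ~28 s
```

With 6-second segments, three to four buffered segments alone account for 18-24 seconds, which is why standard HLS lands in the 20-30 second range regardless of how fast the rest of the pipeline is.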

Low-Latency HLS: Reducing the Delay to ~5 Seconds

LL-HLS has been available as a StreamShark Event option since 2023. It significantly reduces playback latency to around 5 seconds by using smaller segment sizes (typically 1-2 seconds compared to the standard 6 seconds) and delivering partial segments progressively for improved responsiveness.
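
As a sketch of why partial segments help, the same kind of arithmetic can be applied to LL-HLS, where the player can start from recently published parts instead of waiting for whole segments. The part duration and hold-back below are assumed values chosen to illustrate how the ~5 second figure is reached.

```python
# Illustrative LL-HLS arithmetic (assumed values, not platform measurements).
# Each segment is published progressively as short "parts", so the player's
# hold-back is measured in parts rather than whole segments.

part_seconds = 0.5        # assumed partial-segment duration
parts_held_back = 4       # assumed player hold-back behind the live edge
encode_and_ingest = 1.5   # assumed
cdn_and_network = 1.0     # assumed

latency = encode_and_ingest + part_seconds * parts_held_back + cdn_and_network
print(f"~{latency:.1f} s")  # ~4.5 s, in line with the ~5 second figure above
```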

This makes LL-HLS an excellent option for interactive use cases where real-time viewer engagement is critical. However, this approach comes with some limitations:

  • Reliability & Scalability: While LL-HLS performs well, ensuring reliability and maintaining scalability at high viewership and bitrate quality levels can be resource-intensive compared to standard HLS
  • Security: LL-HLS supports encryption and DRM, but its implementation is often complex and less universally supported.
  • Compatibility: Adoption of LL-HLS is growing, but it is yet to be universally compatible across all devices and players.

While LL-HLS is a game-changer for delivering low-latency streams, these trade-offs may require additional infrastructure and careful planning to ensure consistent performance.

Introducing StreamShark’s Optimized Latency

While LL-HLS is a powerful option, we understand that not every use case can fully adopt its model due to scalability, security, or compatibility concerns. Moreover, not every use case needs to reach ~5 seconds of latency. To provide a middle ground, we are excited to announce the Optimized Latency event type. This approach balances latency, reliability, and performance to deliver streams with a latency of 10-12 seconds.

Key Features of Optimized Latency (RTMP):

  • Reduced Segment Size: Lowering the segment size from 6 seconds to 2 seconds speeds up stream delivery.
  • Optimized Player Buffering: The player buffer time is optimized, cutting down on playback delays.
  • Support for All Existing Features: Unlike LL-HLS, which comes with certain limits and restrictions, every StreamShark platform feature supported with standard HLS (including DRM, DVR, Multistream, CMCD, captioning and more) is also supported with the Optimized Latency event type, ensuring no compromise while improving event delivery.
  • Compatibility: The Optimized Latency event type works seamlessly with existing players and browsers across all platforms.

Here’s a quick comparison:

| Type | Standard HLS | LL-HLS | Optimized Latency |
| --- | --- | --- | --- |
| Latency | 20-30 seconds | ~5 seconds | 10-12 seconds |
| Reliability | High | Moderate to high | High |
| Scalability | Excellent | Good, but resource intensive | Excellent |
| Security | Strong encryption & DRM | Supported, but complex and not universal | Strong encryption & DRM |
| Segment Size | 6 seconds | 1-2 seconds | 2 seconds |
| Player Buffering | Standard | Reduced | Optimized |
| Device Compatibility | Universal | Still growing adoption | Universal |

What’s Ahead

At StreamShark, we recognize that every streaming use case is unique. Whether you prioritize low latency with LL-HLS, maximum reliability with standard HLS, or a balanced approach with our Optimized Latency, we offer solutions to meet your needs. Our goal is to ensure your audience gets the best experience possible, without sacrificing stability, scalability, or security.

The Optimized Latency event type has been rolling out to selected customers. If you would like to try it to optimize your streaming workflows, contact our team today to learn how Optimized Latency can transform your live-streaming experience and elevate audience engagement.

What is Transcoding? Why is it Important?

There are a plethora of technical terms thrown around in the video industry. Two of the most common terms are ‘transcoding’ and ‘encoding’.

Whilst you can probably get away with not knowing most of the technical details behind these terms, it’s good to have a basic understanding of where they fit in the streaming process.

What is Encoding?

It’s worth understanding what encoding is, as it’s essentially the very first step applied to a raw video source. It’s intrinsically linked to transcoding, and people frequently confuse the two terms.

Encoding is when an uncompressed and raw video source (like SDI, HDMI etc) is compressed using codecs such as H.264 or MPEG-2. It’s the step that must occur before transcoding takes place.

What this means is that large uncompressed video sources are reduced in size to enable playback on standard viewing devices, such as computers and mobile phones.
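
To get a feel for how much compression is involved, here is a quick, approximate calculation comparing the raw bitrate of an uncompressed 1080p source with a typical H.264 encode. The H.264 figure is a ballpark assumption, not a recommendation.

```python
# Approximate bitrate of uncompressed 1080p video at 30 fps, 8 bits per channel.
width, height, fps, bits_per_pixel = 1920, 1080, 30, 24

raw_bits_per_second = width * height * fps * bits_per_pixel
print(f"Uncompressed: ~{raw_bits_per_second / 1e6:.0f} Mbps")  # ~1493 Mbps

# A typical H.264 live encode of the same picture might use ~5 Mbps (assumption).
h264_bits_per_second = 5_000_000
print(f"Compression ratio: ~{raw_bits_per_second / h264_bits_per_second:.0f}x")  # ~300x
```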

What is Transcoding?

Transcoding is when you take a video source that has already been encoded, decode it into an intermediate uncompressed format, then re-encode it into the target format.

This process usually involves transrating and image scaling (trans-sizing). Transrating relates specifically to changing the bitrate of the video e.g. 8Mbps to 3Mbps. Image scaling is the process of changing the resolution of the video e.g. 4K to 720p.

The end result is usually multiple video versions, with different bitrates, resolutions and/or formats (like HLS and DASH).
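
As an illustration of that decode/re-encode step, the sketch below uses the ffmpeg command-line tool (driven from Python) to turn a single encoded input into two renditions with different resolutions and bitrates. The file names and bitrate choices are hypothetical examples, and it assumes ffmpeg is installed; a platform like StreamShark performs this step for you in the cloud.

```python
import subprocess

# Hypothetical already-encoded source file.
SOURCE = "input_1080p.mp4"

# Example renditions: the resolutions and bitrates are illustrative choices.
renditions = [
    {"name": "720p", "scale": "1280:720", "v_bitrate": "3M",   "a_bitrate": "128k"},
    {"name": "360p", "scale": "640:360",  "v_bitrate": "800k", "a_bitrate": "96k"},
]

for r in renditions:
    # ffmpeg decodes the source, scales it (trans-sizing), then re-encodes
    # video (H.264) and audio (AAC) at the target bitrate (transrating).
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-vf", f"scale={r['scale']}",
        "-c:v", "libx264", "-b:v", r["v_bitrate"],
        "-c:a", "aac", "-b:a", r["a_bitrate"],
        f"output_{r['name']}.mp4",
    ], check=True)
```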

Why is Transcoding Important?

The importance of transcoding comes down to your requirements. In general, the purpose of transcoding is to reach the widest possible audience.

Some of the key reasons to use transcoding are listed below.

Adaptive Bitrate Streaming (ABR)

When using a format like HLS, adaptive bitrate streaming allows the video player to dynamically switch between video sources depending on the viewer’s internet connection and device, e.g. switching between the 1080p and 720p versions of the video stream.
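
Under the hood, the player’s decision is roughly “pick the highest-bandwidth variant that fits the measured throughput”. A simplified sketch of that selection logic, with made-up variant numbers, looks like this (real players also weigh buffer health, screen size and switching history):

```python
# Simplified ABR variant selection (illustrative only).
variants = [          # (label, required bandwidth in kbps), highest first
    ("1080p", 5000),
    ("720p", 3000),
    ("360p", 800),
]

def pick_variant(measured_kbps, safety_factor=0.8):
    """Choose the best rendition that fits within the measured throughput."""
    usable = measured_kbps * safety_factor
    for label, required in variants:
        if required <= usable:
            return label
    return variants[-1][0]  # fall back to the lowest rung

print(pick_variant(6500))  # 1080p
print(pick_variant(2500))  # 360p -- 720p needs 3000 kbps, usable is only 2000
```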

Reduced Bandwidth Requirements

Transcoding allows you to generate various video sources required for ABR from a single video input. This is great for people live streaming from environments with limited upload speeds.

Custom Transcodes for Different Destinations

When live streaming to multiple platforms, some have different video input requirements. For example, Periscope recommends an input resolution of 540p.

Instead of reducing the resolution of all your streams to 540p, you can transcode the original video source down to a separate lower-resolution stream used just for Periscope.

Flexibility to Support Multiple Formats

Transcoding allows you to re-encode your stream into multiple formats like HLS or MPEG-DASH. You might want to do this if you’re streaming to a range of devices which only support certain formats.

Clean Up Encodes of Streams

In some cases you may have no control over the original encoding of the video you’re live streaming.

We often see this with hardware devices which encode RTMP directly on the device, e.g. a camera. As live streaming is usually not these hardware manufacturers’ area of expertise, they generally don’t configure their RTMP encoders correctly, which results in a messy RTMP output.

With transcoding you can fix these incorrect configurations and vastly improve the playback experience.

What is Transmuxing?

Transmuxing allows you to change the container of the video (e.g. MP4) to something else, like HLS. The advantage of this over transcoding is that it requires much less computing power, as it doesn’t modify the encode at all.

The disadvantage of this approach is that it doesn’t give you the breadth of compatibility that transcoding does, e.g. multiple video resolutions.
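
A minimal sketch of transmuxing with ffmpeg: the `-c copy` flag keeps the existing encode untouched and only rewrites the container into HLS segments, which is why it is so much cheaper than a full transcode. The file names and segment settings are hypothetical.

```python
import subprocess

# Transmux an already-encoded MP4 into HLS without re-encoding.
# "-c copy" copies the audio and video streams as-is; only the container changes.
subprocess.run([
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c", "copy",
    "-f", "hls",
    "-hls_time", "6",                # target segment length in seconds
    "-hls_playlist_type", "vod",
    "playlist.m3u8",
], check=True)
```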

How To Transcode?

The easiest way to transcode a live stream is to use a service, like us, StreamShark! It’s all built into our end-to-end live streaming platform, along with a huge range of other essential features. You can sign up for a free trial here.

Viewing HLS Adaptive Stream Behaviour

In this post, we will take you through the process of viewing the adaptive switching behaviour of a StreamShark stream using the Google Chrome browser. This will give you an insight into how the browser displays our html5 HLS live streams under various network conditions.

You can test the playback of one of your own streams by following this tutorial and using the html5 playback URLs of your preferred stream.

Enable the developer console

First, open up a new Chrome browser tab. Once the tab is opened, click on the Chrome settings icon (top right) and navigate to More Tools->Developer tools.

[Screenshot: Select the developer tool]

Clicking on the Developer tools option opens a panel at the bottom of the browser window. This panel contains a useful collection of tools for software developers, or anyone else interested in what is going on in the browser.

Load the URL for your live stream preview

Stream Setup: I’ve created a stream with two video qualities, 600kbps and 2000kbps. 64kbps AAC was selected for the sound format. Therefore, the combined audio/video stream bitrates are 664kbps and 2064kbps.

Stream Status: I’ve started my broadcast software and I’m streaming to my StreamShark stream using a pre-recorded video.

Load your html5 stream playback URL. If we click on the ‘network’ menu within the bottom panel, then click on ‘XHR’, we can see the network requests being made by the browser to fetch the data for stream playback. Note the HLS manifest files being loaded (*.m3u8). These manifest files tell the player which video segments (*.ts files) should be loaded.

Click on the play icon to begin playback. The player will then commence playback, switching quality based on network conditions. My computer has enough bandwidth to play the higher 2064kbit stream, hence you can see the 2064* manifests being loaded below.
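
If you want to inspect a master playlist outside the browser, a few lines of Python can fetch it and list the variants it advertises. The URL below is a placeholder; substitute your own html5 playback URL.

```python
import urllib.request

# Placeholder URL -- substitute your own stream's html5 playback URL.
MASTER_URL = "https://example.com/live/stream/playlist.m3u8"

with urllib.request.urlopen(MASTER_URL) as resp:
    lines = resp.read().decode("utf-8").splitlines()

# Each variant is announced by an #EXT-X-STREAM-INF line, followed on the
# next line by the URI of that variant's media playlist.
for info, uri in zip(lines, lines[1:]):
    if info.startswith("#EXT-X-STREAM-INF"):
        print(info, "->", uri)
```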

[Screenshot: Select the manifest file]

Simulate different bandwidth conditions

Simulate insufficient playback bandwidth

We can simulate a low bandwidth network by throttling the network connection (e.g. a mobile phone connection). Click on the ‘No throttling’ dropdown to select some canned connection speeds.

[Screenshot: View the throttle options]

Let’s throttle the bandwidth right down to a terrible 2G connection (250kbps). As this is well below the 664kbps stream bandwidth, we will start to see some buffering as indicated by the loading icon (spinning circle). Note: the restricted bandwidth increases the time taken to load the segments beyond the 10s playback length of the segment. This causes buffering as the browser can’t load the segments fast enough to be able to display them in sequence. Hence, it is very important to have lower quality streams if you need to support clients with limited bandwidth.
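
The buffering can be predicted with simple arithmetic: if a segment takes longer to download than it takes to play, the player inevitably falls behind. Using the example numbers above:

```python
# Download time vs playback time for one 10-second segment (illustrative).
segment_seconds = 10
stream_kbps = 664      # combined audio/video bitrate of the lower quality
throttle_kbps = 250    # the "2G" throttle applied in DevTools

segment_kilobits = segment_seconds * stream_kbps      # 6640 kb of data
download_seconds = segment_kilobits / throttle_kbps   # ~26.6 s to fetch it

print(f"{download_seconds:.1f} s to download {segment_seconds} s of video")
# It takes ~26.6 s to fetch 10 s of playback, so the player cannot keep up.
```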

[Screenshot: Throttle the connection to 2G]

Simulate low playback bandwidth

We can simulate situations where a client has a fair connection, but not enough bandwidth to play the highest quality stream. For this, select the ‘regular 3G’ option (750kbps) from the throttle dropdown. The ‘regular 3G’ option should be high enough to play the lower quality 664kbps stream but not enough to play the 2064kbps stream. Note that the manifest and segment files loaded by the player should only be the 664kbit stream files. You’ll observe that in this case, playback no longer buffers.

[Screenshot: Throttle the connection to 3G]

Verify normal playback bandwidth

Flick the throttle option back to ‘No throttling’. You should see the player start loading the 2064kbit manifests and segments as it adaptively jumps to the higher bitrate stream. If we click on one of the segment files we can view the HTTP headers. The X-Cache: HIT header entry here indicates the CDN is caching our segments and we have been served a segment from the cache.

[Screenshot: Stream is being served from the CDN]

Summary

You have now gained an insight into the playback process involved when viewing HLS streams. Picking appropriate bitrates to cover the expected bandwidth available to your clients is vital for uninterrupted viewing, especially for mobile devices.

A general rule of thumb when using multiple bitrates is to ensure each additional stream uses a bitrate roughly double that of the lower quality stream. Other video parameters such as resolution, frame rate and the devices used by your audience may also need to be taken into account when choosing your bitrates. Some experimentation may also be required.

We do not recommend having very similar bitrates (such as 664 and 764) as this may result in the player constantly chopping between the two bitrates if the available network bandwidth is unstable.
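
Applying that rule of thumb, a simple ladder can be generated by roughly doubling each rung. The starting bitrate and rung count below are arbitrary examples; you would still adjust them for resolution, frame rate and your audience’s devices.

```python
# Build an example bitrate ladder by roughly doubling each rung (illustrative).
def bitrate_ladder(lowest_kbps=400, rungs=4):
    return [lowest_kbps * (2 ** i) for i in range(rungs)]

print(bitrate_ladder())  # [400, 800, 1600, 3200]
# Rungs this far apart avoid the constant flip-flopping between similar
# bitrates described above when network bandwidth fluctuates.
```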

If you have any questions or would like us to cover more topics, please leave your question/comments in the comments section below.
