{{Short description|Streaming media technique}} |
{{Use dmy dates|date=September 2022}}

[[File:Adaptive streaming overview daseddon 2011 07 28.png|thumb|upright=2|Adaptive streaming overview]] [[File:Adaptive streaming overview bit rates 2011 07 28.png|thumb|upright=2|Adaptive streaming in action]]

'''Adaptive bitrate streaming''' is a technique used in [[streaming multimedia]] over [[computer network]]s. |

While in the past most video or audio streaming technologies utilized streaming protocols such as [[Real-time Transport Protocol|RTP]] with [[RTSP]], today's adaptive streaming technologies are based almost exclusively on [[HTTP]],<ref>{{cite conference |title=An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP |author1=Saamer Akhshabi |author2=Ali C. Begen |author3=Constantine Dovrolis |year=2011 |conference=In Proceedings of the second annual ACM conference on Multimedia systems (MMSys '11) |publisher=ACM |location=New York, NY, USA }}</ref> and are designed to work efficiently over large distributed HTTP networks.

Adaptive bitrate streaming works by detecting a user's [[Bandwidth (computing)|bandwidth]] and [[CPU]] capacity in real time and adjusting the quality of the media stream accordingly.<ref>[https://ieeexplore.ieee.org/document/8424813/ A. Bentaleb, B. Taani, A. Begen, C. Timmerer, and R. Zimmermann, "A Survey on Bitrate Adaptation Schemes for Streaming Media over HTTP", IEEE Communications Surveys & Tutorials (IEEE COMST), Volume 1 Issue 1, pp. 1-1, 2018.]</ref> It requires the use of an [[Encode/Decode|encoder]] which encodes a single source media (video or audio) at multiple [[bit rate]]s. The player client<ref name="itec-dash">[http://www-itec.uni-klu.ac.at/dash/ DASH at ITEC, VLC Plugin, DASHEncoder and Dataset] by C. Mueller, S. Lederer, C. Timmerer</ref> switches between streaming the different encodings depending on available resources.<ref name="mobileval">{{cite web|url=http://www-itec.uni-klu.ac.at/bib/files/p37-mueller.pdf |title=Proceedings Template – WORD |accessdate=2017-12-16}}</ref> This results in very little [[Data buffer|buffering]], fast start times and a good experience for both high-end and low-end connections.<ref>{{cite web |author=Gannes, Liz |title=The Next Big Thing in Video: Adaptive Bitrate Streaming |url=http://pro.gigaom.com/2009/06/how-to-deliver-as-much-video-as-users-can-take/ |date=10 June 2009 |accessdate=1 June 2010 |archive-url=https://web.archive.org/web/20100619225207/http://pro.gigaom.com/2009/06/how-to-deliver-as-much-video-as-users-can-take/ |archive-date=19 June 2010 |url-status=dead }}</ref>

More specifically, adaptive bitrate streaming is a method of video streaming over HTTP in which the source content is encoded at multiple bit rates. Each of the different bit rate streams is segmented into small multi-second parts.<ref name="dataset">{{cite web|url=http://www-itec.uni-klu.ac.at/bib/files/p89-lederer.pdf |title=mmsys2012-final36.pdf |accessdate=2017-12-16}}</ref> The segment size can vary depending on the particular implementation, but segments are typically between two and ten seconds long.<ref name="mobileval" /><ref name="dataset" /> First, the client downloads a [[manifest file]] that describes the available stream segments and their respective bit rates. During stream start-up, the client usually requests the segments from the lowest bit rate stream. If the client finds that the network throughput is greater than the bit rate of the downloaded segment, it will request a higher bit rate segment. Later, if the client finds that the network throughput has deteriorated, it will request a lower bit rate segment. An adaptive bitrate (ABR) algorithm in the client performs the key function of deciding which bit rate segments to download, based on the current state of the network. Several types of ABR algorithms are in commercial use: [[throughput]]-based algorithms use the throughput achieved in recent prior downloads for decision-making (e.g., the throughput rule in [https://reference.dashif.org/dash.js dash.js]), buffer-based algorithms use only the client's current buffer level (e.g., BOLA<ref>{{cite journal |title=BOLA: Near-optimal bitrate adaptation for online videos |last1=Spiteri |first1=Kevin |last2=Urgaonkar |first2=Rahul |last3=Sitaraman |first3=Ramesh K. |journal=IEEE INFOCOM |year=2016 |arxiv=1601.06748 |doi=10.1109/TNET.2020.2996964 |s2cid=219792107}}</ref> in [https://reference.dashif.org/dash.js dash.js]), and hybrid algorithms combine both types of information (e.g., DYNAMIC<ref>{{cite web|url=https://people.cs.umass.edu/~ramesh/Site/PUBLICATIONS_files/abr-dashjs.pdf|title=From Theory to Practice: Improving Bitrate Adaptation in the DASH Reference Player, by Spiteri, Sitaraman and Sparacio, ACM Multimedia Systems Conference, June 2018.}}</ref> in [https://reference.dashif.org/dash.js dash.js]).
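
The throughput-based family of ABR rules can be sketched in a few lines. The following Python snippet is an illustrative sketch, not the dash.js implementation; the function name, the harmonic-mean estimator and the safety factor are assumptions chosen for clarity:

```python
def select_bitrate(available_bitrates, recent_throughputs_bps, safety=0.8):
    """Pick the bit rate (bps) to request for the next segment.

    available_bitrates     -- bit rates advertised in the manifest, in bps
    recent_throughputs_bps -- measured throughput of recent segment
                              downloads, in bps (most recent last)
    safety                 -- fraction of the estimated throughput to spend,
                              leaving headroom for network fluctuation
    """
    if not recent_throughputs_bps:
        # No measurements yet: start conservatively, as most clients do.
        return min(available_bitrates)
    # The harmonic mean under-weights short throughput spikes, making the
    # estimate robust against transient bursts.
    n = len(recent_throughputs_bps)
    estimate = n / sum(1.0 / t for t in recent_throughputs_bps)
    budget = estimate * safety
    feasible = [b for b in available_bitrates if b <= budget]
    # If even the lowest rung exceeds the budget, take the lowest anyway.
    return max(feasible) if feasible else min(available_bitrates)
```

A buffer-based rule such as BOLA would instead map the client's current buffer occupancy to a bit rate, and a hybrid rule consults both signals.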

==Current uses==

[[Post-production]] houses, [[content delivery network]]s and studios use adaptive bit rate technology to provide consumers with higher quality video using less manpower and fewer resources. The creation of multiple video outputs, particularly for adaptive bit rate streaming, adds great value to consumers.<ref>{{cite web |author=Marshall, Daniel |title=Show Report: Video Processing Critical to Digital Asset Management |publisher=Elemental Technologies |url=http://www.elementaltechnologies.com/blog/show-report-video-processing-critical-digital-asset-management |date=18 February 2010 |accessdate=15 October 2011 |archive-url=https://web.archive.org/web/20111004213015/http://www.elementaltechnologies.com/blog/show-report-video-processing-critical-digital-asset-management |archive-date=4 October 2011 |url-status=dead }}</ref> When the technology works properly, content plays back without interruption and the adaptation itself goes unnoticed by the viewer. Media companies have used adaptive bit rate technology for many years, and it has essentially become standard practice for high-end streaming providers: playback begins with a low-resolution stream and climbs in quality, with little buffering even for high-resolution feeds.

==Benefits of adaptive bitrate streaming==

{{unreferenced section|date=March 2013}}

Traditional server-driven adaptive bitrate streaming provides consumers of streaming media with the best possible experience, since the media server automatically adapts to any changes in each user's network and playback conditions.<ref>{{cite journal |last1=Seufert |first1=Michael |last2=Egger |first2=Sebastian |last3=Slanina |first3=Martin |last4=Zinner |first4=Thomas |last5=Hoßfeld |first5=Tobias |last6=Tran-Gia |first6=Phuoc |title=A Survey on Quality of Experience of HTTP Adaptive Streaming |journal=IEEE Communications Surveys & Tutorials |date=2015 |volume=17 |issue=1 |pages=469–492 |doi=10.1109/COMST.2014.2360940 |s2cid=18220375 |url=https://opus.bibliothek.uni-augsburg.de/opus4/frontdoor/index/index/docId/107128 }}</ref> The media and entertainment industry also benefits from adaptive bitrate streaming. As the video space grows, content delivery networks and video providers can provide customers with a superior viewing experience. Adaptive bitrate technology requires additional [[encoding]], but simplifies the overall workflow and creates better results.

HTTP-based adaptive bitrate streaming technologies yield additional benefits over traditional server-driven adaptive bitrate streaming. First, since the streaming technology is built on top of [[HTTP]], contrary to [[Real-time Transport Protocol|RTP]]-based adaptive streaming, the packets have no difficulty traversing firewall and [[Network Address Translation|NAT]] devices. Second, since HTTP streaming is purely client-driven, all adaptation logic resides at the client. This reduces the need for persistent connections between the server and client application. Furthermore, the server is not required to maintain session state information for each client, increasing scalability. Finally, existing HTTP delivery infrastructure, such as HTTP caches and servers, can be seamlessly adopted.<ref name="cc.gatech.edu">{{cite journal |author1=Saamer Akhshabi |author2=Ali C. Begen |author3=Constantine Dovrolis |title=An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP |url=http://www.cc.gatech.edu/~sakhshab/Saamer_MMSys11.pdf |accessdate=15 October 2011 |archive-url=https://web.archive.org/web/20111017140109/http://www.cc.gatech.edu/~sakhshab/Saamer_MMSys11.pdf |archive-date=17 October 2011 |url-status=dead }}</ref><ref>{{cite journal |author1=Anthony Vetro |title=The MPEG-DASH Standard for Multimedia Streaming Over the Internet |url=http://nsl.cs.sfu.ca/teaching/13/880/DASH-mm-2011.pdf |accessdate=10 July 2015}}</ref><ref>{{cite journal |author1=Jan Ozer |title=What Is Adaptive Streaming? |url=https://www.streamingmedia.com/Articles/Editorial/What-Is-.../What-Is-Adaptive-Streaming-75195.aspx |accessdate=10 July 2015 |date=2011-04-28}}</ref><ref>{{cite journal |author1=Jeroen Famaey |author2=Steven Latré |author3=Niels Bouten |author4=Wim Van de Meerssche |author5=Bart de Vleeschauwer |author6=Werner Van Leekwijck |author7=Filip De Turck |title=On the merits of SVC-based HTTP Adaptive Streaming |pages=419–426 |url=https://ieeexplore.ieee.org/document/6573013 |accessdate=10 July 2015 |date=May 2013}}</ref>

A scalable [[Content delivery network|CDN]] is used to deliver media streaming to an Internet audience. The CDN receives the stream from the source at its origin server, then replicates it to many or all of its [[edge server|edge cache server]]s. The end-user requests the stream and is redirected to the "closest" edge server. This can be tested using [[libdash]]<ref name="libdash">[http://www.bitmovin.net/libdash libdash: Open-source DASH client library] by bitmovin</ref> and the Distributed DASH (D-DASH) dataset,<ref name=ddash>{{cite web|url=http://www-itec.uni-klu.ac.at/dash/?page_id=958 |title=Distributed DASH Dataset | ITEC – Dynamic Adaptive Streaming over HTTP |publisher=Itec.uni-klu.ac.at |accessdate=2017-12-16}}</ref> which has several mirrors across Europe, Asia and the US. The use of HTTP-based adaptive streaming allows the edge server to run simple HTTP server software, which is cheap or free to license, reducing software licensing costs compared to costly media server licences (e.g. Adobe Flash Media Streaming Server). The CDN cost for HTTP streaming media is then similar to HTTP web caching CDN cost.

==History==

Adaptive bit rate over HTTP was created by the DVD Forum at the WG1 Special Streaming group in October 2002. The group was co-chaired by [[Toshiba]] and [[Phoenix Technologies]], and the expert group counted on the collaboration of [[Microsoft]], [[Apple Computer]], [[DTS Inc.]], [[Warner Brothers]], [[20th Century Fox]], [[Digital Deluxe]], [[Disney]], [[Macromedia]] and [[Akamai]].{{Dubious|date=February 2016}}{{citation needed|date=December 2012}} The technology was originally called DVDoverIP and was an integral effort of the DVD ENAV book.<ref>{{citation |url=http://www.dvdforum.org/tech-dvdbook.htm |title=DVD Book Construction |date=May 2005 |publisher=DVD Forum}}</ref> The concept came from storing MPEG-1 and MPEG-2 DVD TS sectors in small 2 KB files, which would be served to the player by an HTTP server. The MPEG-1 segments provided the lower bit rate stream, while the MPEG-2 segments provided a higher bit rate stream. The original XML schema provided a simple playlist of bit rates, languages and URL servers. The first working prototype was presented to the DVD Forum by Phoenix Technologies at the [[Harman Kardon]] Lab in Villingen, Germany.{{citation needed|date=December 2012}}

==Implementations==

Adaptive bit rate streaming was introduced by Move Networks in 2006<ref>{{Cite journal |last=Yang |first=Hongyun |date=2014 |title=Opportunities and Challenges of HTTP Adaptive Streaming |url=https://article.nadiapub.com/IJFGCN/vol7_no6/16.pdf |journal=International Journal of Future Generation Communication and Networking |volume=7 |issue=6 |pages=165–180}}</ref> and is now being developed and utilized by [[Adobe Systems]], [[Apple Computer|Apple]], [[Microsoft]] and [[Octoshape]].<ref>{{cite web |author=Gannes, Liz |title=The Lowdown on Apple's HTTP Adaptive Bitrate Streaming |url=http://newteevee.com/2009/06/10/the-lowdown-on-apples-http-adaptive-bitrate-streaming/ |date=10 June 2009 |accessdate=24 June 2010 |archive-url=https://web.archive.org/web/20100619175521/http://newteevee.com/2009/06/10/the-lowdown-on-apples-http-adaptive-bitrate-streaming/ |archive-date=19 June 2010 |url-status=live}}</ref> In October 2010, Move Networks was awarded a patent for their adaptive bit rate streaming (US patent number 7818444).<ref>{{cite web |url=http://gigaom.com/video/move-gets-streaming-patent-are-adobe-apple-hosed-2/ |title=Move Gets Streaming Patent; Are Adobe & Apple Hosed? – Online Video News |publisher=Gigaom.com |date=15 September 2010 |accessdate=15 October 2011 |archive-url=https://web.archive.org/web/20111022155719/http://gigaom.com/video/move-gets-streaming-patent-are-adobe-apple-hosed-2/ |archive-date=22 October 2011 |url-status=dead }}</ref>

===Dynamic Adaptive Streaming over HTTP (DASH)===

{{main article|Dynamic Adaptive Streaming over HTTP}}

Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, is the only adaptive bit-rate HTTP-based streaming solution that is an international standard.<ref name=MPEGPressRelease>{{cite news|title=MPEG ratifies its draft standard for DASH |publisher=MPEG |url=http://mpeg.chiariglione.org/meetings/geneva11-1/geneva_press.htm |date=2 December 2011 |accessdate=26 August 2012 |url-status=dead |archive-url=https://web.archive.org/web/20120820233136/http://mpeg.chiariglione.org/meetings/geneva11-1/geneva_press.htm |archive-date=20 August 2012 }}</ref>

MPEG-DASH technology was developed under [[MPEG]]. Work on DASH started in 2010; it became a Draft International Standard in January 2011 and an International Standard in November 2011.<ref name=MPEGPressRelease/><ref name="timmerer-1">{{cite web|last=Timmerer |first=Christian |url=http://multimediacommunication.blogspot.com/2010/05/http-streaming-of-mpeg-media.html |title=HTTP streaming of MPEG media – blog entry |publisher=Multimediacommunication.blogspot.com |date=2012-04-26 |accessdate=2017-12-16}}</ref><ref>{{cite web|url=http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=57623 |title=ISO/IEC DIS 23009-1.2 Dynamic adaptive streaming over HTTP (DASH) |publisher=Iso.org |accessdate=2017-12-16}}</ref> The MPEG-DASH standard was published as [http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=57623 ISO/IEC 23009-1:2012] in April 2012.

MPEG-DASH is a technology related to [[Adobe Systems]] [[#Adobe Dynamic Streaming for Flash|HTTP Dynamic Streaming]], [[Apple Inc.]] [[HTTP Live Streaming]] (HLS) and [[Microsoft]] [[#Microsoft Smooth Streaming|Smooth Streaming]].<ref name="timmerer-2">[http://multimediacommunication.blogspot.com/2011/02/dynamic-adaptive-streaming-over-http.html Updates on DASH] – blog entry</ref> DASH is based on Adaptive HTTP streaming (AHS) in [[3GPP]] Release 9 and on HTTP Adaptive Streaming (HAS) in [[Open IPTV Forum]] Release 2.<ref name="3GPP">ETSI 3GPP [http://www.3gpp.org/ftp/Specs/html-info/26247.htm 3GPP TS 26.247; Transparent end-to-end packet-switched streaming service (PSS); Progressive Download and Dynamic Adaptive Streaming over HTTP (3GP-DASH)]</ref>

As part of their collaboration with MPEG, 3GPP Release 10 has adopted DASH (with specific codecs and operating modes) for use over wireless networks.<ref name="3GPP" />

The goal of standardizing an adaptive streaming solution is to assure the market that the solution can work universally, unlike other solutions that are more specific to certain vendors, such as Apple’s HLS, Microsoft’s Smooth Streaming, or Adobe’s HDS. |

Available implementations are the HTML5-based ''bitdash'' MPEG-DASH player<ref>{{cite web |url=http://www.dash-player.com |title=bitdash HTML5 MPEG-DASH player |publisher=Dash-player.com |date=2016-01-22 |accessdate=2017-12-16 |archive-date=10 July 2016 |archive-url=https://web.archive.org/web/20160710145025/http://www.dash-player.com/ |url-status=dead }}</ref> as well as the open source C++-based DASH client access library [[libdash]] of bitmovin GmbH,<ref name="libdash"/> the DASH tools of the Institute of Information Technology (ITEC) at Alpen-Adria University Klagenfurt,<ref name="itec-dash"/><ref name="vlc-dash-paper">{{cite web|url=http://www-itec.uni-klu.ac.at/bib/files/p723-muller.pdf |title=A VLC media player plugin enabling dynamic adaptive streaming over HTTP |accessdate=2017-12-16}}</ref> the multimedia framework of the GPAC group at Telecom ParisTech,<ref name="GPAC Telekom ParisTech">{{Cite web |url=http://gpac.wp.institut-telecom.fr/2011/02/02/mp4box-fragmentation-segmentation-splitting-and-interleaving/ |title=GPAC Telecom ParisTech |access-date=28 March 2013 |archive-url=https://web.archive.org/web/20120224084539/http://gpac.wp.institut-telecom.fr/2011/02/02/mp4box-fragmentation-segmentation-splitting-and-interleaving/ |archive-date=24 February 2012 |url-status=dead }}</ref> and the dash.js<ref>{{cite web|url=https://github.com/Dash-Industry-Forum/dash.js |title=dash.js |publisher=Github.com |accessdate=2017-12-16}}</ref> player of the [[DASH-IF]].
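
In DASH, the manifest is an XML Media Presentation Description (MPD) that advertises the available representations. The following minimal static MPD is an illustrative sketch only; all identifiers, file names, durations and codec strings are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative MPD: one video AdaptationSet with three Representations
     at different bit rates; a client picks among them per segment. -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="static"
     mediaPresentationDuration="PT120S"
     minBufferTime="PT2S"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <SegmentTemplate media="video_$RepresentationID$_$Number$.m4s"
                       initialization="video_$RepresentationID$_init.mp4"
                       duration="4" startNumber="1"/>
      <Representation id="low"  bandwidth="500000"  width="640"  height="360"  codecs="avc1.4d401e"/>
      <Representation id="mid"  bandwidth="1500000" width="1280" height="720"  codecs="avc1.4d401f"/>
      <Representation id="high" bandwidth="3000000" width="1920" height="1080" codecs="avc1.640028"/>
    </AdaptationSet>
  </Period>
</MPD>
```

Because every representation is segment-aligned, a player can switch bit rates at any segment boundary without re-buffering.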

===Apple HTTP Live Streaming (HLS)===

{{Main article|HTTP Live Streaming}}

HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by [[Apple Inc.]] as part of [[QuickTime X]] and [[iOS]]. HLS supports both live and [[video on demand]] content. It works by breaking down media streams or files into short pieces (media segments) which are stored as [[MPEG-TS]] or [[MP4_file_format|fragmented MP4]] files. This is typically done at multiple bitrates using a stream or file segmenter application, also known as a packager. One such segmenter implementation is provided by Apple.<ref>{{citation |title=Mac Developer Library |publisher=Apple |url=https://developer.apple.com/library/mac/documentation/networkinginternet/conceptual/streamingmediaguide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html |access-date=2 June 2014}}</ref> Additional packagers are available, including free and open-source offerings such as Google's Shaka Packager<ref>{{citation |title=Shaka Packager Github Repository |publisher=Google |url=https://github.com/shaka-project/shaka-packager |access-date=3 January 2023}}</ref> as well as commercial tools such as Unified Streaming.<ref>{{citation |title=Unified Streaming |publisher=Unified Streaming |url=https://www.unified-streaming.com/ |access-date=3 January 2023}}</ref> The segmenter is also responsible for producing a set of playlist files in the M3U8 format which describe the media chunks. Each playlist is specific to a given bitrate and contains the relative or absolute URLs to the chunks for that bitrate. The client is then responsible for requesting the appropriate playlist depending on available bandwidth.
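
For example, the top-level "master" playlist in the M3U8 format might advertise three variant playlists at different bit rates; the URLs, bandwidths and resolutions below are illustrative, not from any real deployment:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1920x1080
high/index.m3u8
```

The client measures its download throughput and requests segments from whichever variant playlist its bandwidth can sustain.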

HTTP Live Streaming is a standard feature in the iPhone 3.0 and newer versions.<ref>{{cite web |url=http://www.appleinsider.com/articles/09/07/08/apple_launches_http_live_streaming_standard_in_iphone_3_0.html |publisher=AppleInsider |title=Apple launches HTTP Live Streaming standard in iPhone 3.0 |author=Prince McLean |date=9 July 2009 |accessdate=15 October 2011}}</ref>

Apple has submitted its solution to the [[IETF]] for consideration as an Informational [[Request for Comments]].<ref>{{citation |author=R. Pantos |title=HTTP Live Streaming |url=http://tools.ietf.org/html/draft-pantos-http-live-streaming |accessdate=11 October 2011 |publisher=IETF}}</ref> This was officially accepted as {{IETF RFC|8216}}. A number of [[HTTP Live Streaming#Supported players and servers|proprietary and open source solutions]] exist for both the server implementation (segmenter) and the client player.

HLS streams can be identified by the playlist URL format extension of {{mono|m3u8}} or the MIME type of application/vnd.apple.mpegurl.<ref>{{cite IETF |url=https://www.rfc-editor.org/rfc/rfc8216.html#section-4 |rfc=8216 |section=4}}</ref> These adaptive streams can be made available in many different bitrates, and the client device interacts with the server to obtain the best available bitrate which can reliably be delivered.
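
Each variant playlist in turn lists the media segments for its bit rate. An illustrative video-on-demand media playlist, with hypothetical segment names and durations, might look like:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.0,
segment0.ts
#EXTINF:4.0,
segment1.ts
#EXTINF:4.0,
segment2.ts
#EXT-X-ENDLIST
```

The `#EXT-X-ENDLIST` tag marks the playlist as complete; for live streams it is omitted and the server keeps appending new segments.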

Playback of HLS is supported on many platforms, including Safari and native apps on macOS/iOS, Microsoft Edge on Windows 10, ExoPlayer on Android, and the Roku platform. Many smart TVs also have native support for HLS. Playing HLS on other platforms, such as Chrome and Firefox, is typically achieved via a JavaScript player implementation. Many open source and commercial players are available, including hls.js, video.js http-streaming, Bitmovin, JWPlayer and THEOplayer.

===Adobe Dynamic Streaming for Flash===

"HTTP Dynamic streaming is the process of efficiently delivering streaming video to users by dynamically switching among different streams of varying quality and size during playback. This provides users with the best possible viewing experience their bandwidth and local computer hardware ([[CPU]]) can support. Another major goal of dynamic streaming is to make this process smooth and seamless to users, so that if [[Video scaler|up-scaling or down-scaling]] the quality of the stream is necessary, it is a smooth and nearly unnoticeable switch without disrupting the continuous playback."<ref>{{cite web |last=Hassoun |first=David |work=Adobe Developer Connection |publisher=Adobe Systems |title=Dynamic streaming in Flash Media Server 3.5 – Part 1: Overview of the new capabilities |url=http://www.adobe.com/devnet/flashmediaserver/articles/dynstream_advanced_pt1.html |archive-url=https://web.archive.org/web/20140330211949/http://www.adobe.com/devnet/adobe-media-server/articles/dynstream_advanced_pt1.html |archive-date=2014-03-30}}</ref>

The latest versions of Flash Player and Flash Media Server support adaptive bit-rate streaming over the traditional [[Real Time Messaging Protocol|RTMP]] protocol, as well as [[HTTP]], similar to the HTTP-based solutions from Apple and Microsoft.<ref>{{cite web |title=HTTP Dynamic Streaming |publisher=Adobe Systems |url=http://www.adobe.com/products/httpdynamicstreaming/ |accessdate=13 October 2010}}</ref> HTTP dynamic streaming is supported in Flash Player 10.1 and later.<ref>{{cite web |title=FAQ HTTP Dynamic Streaming |publisher=Adobe Systems |url=http://www.adobe.com/products/hds-dynamic-streaming/faq.html |accessdate=12 January 2015}}</ref> HTTP-based streaming has the advantage of not requiring any firewall ports to be opened outside of the normal ports used by web browsers. HTTP-based streaming also allows video fragments to be [[Web cache|cached]] by browsers, proxies, and [[Content delivery network|CDN]]s, drastically reducing the load on the source server.

===Microsoft Smooth Streaming===

Smooth Streaming is an [[Internet Information Services#Extensions|IIS Media Services extension]] that enables adaptive streaming of media to clients over HTTP.<ref>{{cite web |title=Smooth Streaming |publisher=IIS.net |url=http://www.iis.net/download/smoothstreaming |accessdate=24 June 2010| archive-url= https://web.archive.org/web/20100615150921/http://www.iis.net/download/SmoothStreaming| archive-date= 15 June 2010 | url-status= live}}</ref> The format specification is based on the [[ISO base media file format]] and standardized by Microsoft as the Protected Interoperable File Format.<ref>{{citation |author=Chris Knowlton |title=Protected Interoperable File Format |url=http://learn.iis.net/page.aspx/685/protected-interoperable-file-format |publisher=Microsoft |date=8 September 2009 |accessdate=15 October 2011}}</ref> Microsoft is actively involved with [[3GPP]], [[MPEG]] and [[DECE]] organizations' efforts to standardize adaptive bit-rate HTTP streaming. Microsoft provides Smooth Streaming Client software development kits for [[Silverlight]] and [[Windows Phone 7]], as well as a Smooth Streaming Porting Kit that can be used for other client operating systems, such as Apple iOS, Android, and Linux.<ref>{{cite web |publisher=Microsoft |title=Microsoft End-to-End Platform Powers Next-Generation Silverlight and IIS Media Experiences Across Multiple Screens |url=http://www.microsoft.com/presspass/press/2010/apr10/04-08nab2010pr.mspx |date=8 April 2010 |accessdate=30 July 2011}}</ref> IIS Media Services 4.0, released in November 2010, introduced a feature which enables Live Smooth Streaming H.264/AAC videos to be dynamically repackaged into the Apple HTTP Adaptive Streaming format and delivered to iOS devices without the need for re-encoding.
||
⚫ | Microsoft has successfully demonstrated delivery of both live and on-demand 1080p HD video with Smooth Streaming to Silverlight clients. In 2010, Microsoft also partnered with NVIDIA to demonstrate live streaming of 1080p stereoscopic 3D video to PCs equipped with [[Nvidia 3D Vision|NVIDIA 3D Vision]] technology.<ref>{{cite web |publisher=Microsoft |title=First Day of IBC |url=http://team.silverlight.net/announcement/first-day-of-ibc/ |accessdate=22 January 2011 |archive-url=https://web.archive.org/web/20110202053648/http://team.silverlight.net/announcement/first-day-of-ibc/ |archive-date=2 February 2011 |url-status=dead }}</ref> |
||
===Common Media Application Format (CMAF)=== |
CMAF is a presentation container format used for the delivery of both HLS and MPEG-DASH. Hence it is intended to simplify delivery of HTTP-based streaming media. It was proposed in 2016 by Apple and Microsoft and officially published in 2018.<ref>{{cite web|author1=Traci Ruether|title=What Is CMAF?|url=https://www.wowza.com/blog/what-is-cmaf|date=2019-01-23|accessdate=2022-01-13}}</ref> |
===QuavStreams Adaptive Streaming over HTTP===
QuavStreams Adaptive Streaming is a multimedia streaming technology developed by Quavlive. The streaming server is an HTTP server that has multiple versions of each video, encoded at different bitrates and resolutions. The server delivers the encoded video/audio frames switching from one level to another, according to the current available bandwidth. The control is entirely server-based, so the client does not need special additional features. The streaming control employs feedback control theory.<ref>{{cite web|author1=Luca De Cicco | author2=Saverio Mascolo | author3=Vittorio Palmisano |title=Feedback Control for Adaptive Live Video Streaming|url=http://c3lab.poliba.it/images/6/6d/MMSYS2011.pdf|publisher=MMSYS2011|accessdate=9 September 2012}}</ref> Currently, QuavStreams supports H.264/MP3 codecs muxed into the FLV container and VP8/Vorbis codecs [[muxed]] into the WEBM container. |
===Uplynk===
[[Uplynk]] delivers HD adaptive bitrate streaming to multiple platforms, including iOS, Android, Windows, Mac, Linux, and Roku, across various browser combinations, by encoding video in the cloud using a single non-proprietary adaptive streaming format. Rather than streaming and storing multiple formats for different platforms and devices, Uplynk stores and streams only one. The first studio to use this technology for delivery was [[Disney–ABC Television Group]], using it for video encoding for web, mobile and tablet streaming apps on the ABC Player, ABC Family and Watch Disney apps, as well as the live Watch Disney Channel, Watch Disney Junior, and Watch Disney XD.<ref>{{cite news|author=Dean Takahashi |url=https://venturebeat.com/2013/01/16/uplynk-creates-a-cheap-and-efficient-way-for-disney-to-stream-videos/ |title=Uplynk creates a cheap and efficient way for Disney to stream videos |work=VentureBeat |date=2013-01-16 |accessdate=2017-12-16}}</ref><ref>{{cite web|last=Dreier |first=Troy |url=http://www.streamingmedia.com/Articles/News/Online-Video-News/UpLynk-Emerges-from-Stealth-Mode%3B-Disney-ABC-Is-First-Customer-87154.aspx |title=UpLynk Emerges from Stealth Mode; DisneyABC Is First Customer – Streaming Media Magazine |publisher=Streamingmedia.com |date=2013-01-16 |accessdate=2017-12-16}}</ref>
===Self-learning clients===
In recent years, the benefits of self-learning algorithms in adaptive bitrate streaming have been investigated in academia. While most of the initial self-learning approaches are implemented at the server-side<ref>{{cite journal |author1=Y. Fei |author2=V. W. S. Wong |author3=V. C. M. Leung |title=Efficient QoS provisioning for adaptive multimedia in mobile communication networks by reinforcement learning | date=2006 | journal=Mobile Networks and Applications | volume=11 | issue=1 | pages=101–110 |url=https://ieeexplore.ieee.org/document/1363846 | doi=10.1007/s11036-005-4464-2|citeseerx=10.1.1.70.1430 |s2cid=13022779 }}</ref><ref>{{cite journal |author1=V. Charvillat |author2=R. Grigoras |title=Reinforcement learning for dynamic multimedia adaptation | date=2007 | journal=Journal of Network and Computer Applications | volume=30 | issue=3 | pages=1034–1058 | doi=10.1016/j.jnca.2005.12.010}}</ref><ref>{{cite journal |author1=D. W. McClary |author2=V. R. Syrotiuk |author3=V. Lecuire |title=Adaptive audio streaming in mobile ad hoc networks using neural networks | date=2008 | journal=Ad Hoc Networks | volume=6 | issue=4 | pages=524–538 | doi=10.1016/j.adhoc.2007.04.005}}</ref> (e.g. performing admission control using [[reinforcement learning]] or [[artificial neural network]]s), more recent research is focusing on the development of self-learning HTTP Adaptive Streaming clients. Multiple approaches have been presented in literature using the [[State-Action-Reward-State-Action|SARSA]]<ref>{{cite conference|author1=V. Menkovski |author2=A. Liotta |title=Intelligent control for adaptive video streaming | date=2013 | book-title=IEEE International Conference on Consumer Electronics (ICCE) | location=Washington, DC | pages=127–128 |doi=10.1109/ICCE.2013.6486825 }}</ref> or [[Q-learning]]<ref>{{cite journal |author1=M. Claeys |author2=S. Latré |author3=J. Famaey |author4=F. 
De Turck|title=Design and evaluation of a self-learning HTTP adaptive video streaming client | date=2014 | journal=IEEE Communications Letters | volume=18 | issue=4 | pages=716–719 | doi=10.1109/lcomm.2014.020414.132649|hdl=1854/LU-5733061 |s2cid=26955239 |url=https://biblio.ugent.be/publication/5733061 |hdl-access=free }}</ref> algorithm. In all of these approaches, the client state is modeled using, among others, information about the current perceived network throughput and buffer filling level. Based on this information, the self-learning client autonomously decides which quality level to select for the next video segment. The learning process is steered using feedback information, representing the [[Quality of experience|Quality of Experience (QoE)]] (e.g. based on the quality level, the number of switches and the number of video freezes). Furthermore, it was shown that multi-agent [[Q-learning]] can be applied to improve [[Fairness measure#QoE fairness|QoE fairness]] among multiple adaptive streaming clients.<ref>{{cite conference|author1=S. Petrangeli |author2=M. Claeys |author3=S. Latré |author4=J. Famaey |author5=F. De Turck |title=A multi-agent Q-Learning-based framework for achieving fairness in HTTP Adaptive Streaming | date=2014 | book-title=IEEE Network Operations and Management Symposium (NOMS) | location=Krakow | pages=1–9 |doi=10.1109/NOMS.2014.6838245 }}</ref> |
==Criticisms==
HTTP-based adaptive bit rate technologies are significantly more operationally complex than traditional streaming technologies. Some of the documented considerations are things such as additional storage and encoding costs, and challenges with maintaining quality globally. There have also been some interesting dynamics found around the interactions between complex adaptive bit rate logic competing with complex TCP flow control logic.<ref name="cc.gatech.edu"/><ref>{{cite web |author=Pete Mastin |title=Is adaptive bit rate the yellow brick road, or fool's gold for HD streaming? |url=http://www.fierceonlinevideo.com/story/adaptive-bit-rate-yellow-brick-road-or-fools-gold-hd-streaming/2011-01-28 |date=28 January 2011 |accessdate=15 October 2011 |archive-url=https://web.archive.org/web/20110907004353/http://www.fierceonlinevideo.com/story/adaptive-bit-rate-yellow-brick-road-or-fools-gold-hd-streaming/2011-01-28 |archive-date=7 September 2011 |url-status=dead }}</ref> |
<ref>{{cite journal | author1=Luca De Cicco | author2=Saverio Mascolo | title=An Experimental Investigation of the Akamai Adaptive Video Streaming | url=http://c3lab.poliba.it/images/d/d8/Akamai_wima2010.pdf | accessdate=29 November 2011}}</ref>
<ref>{{cite journal|title=Adaptive streaming: a comparison|url=http://www.quavlive.com/en/adaptive-streaming/|access-date=17 April 2014|archive-url=https://web.archive.org/web/20140419012434/http://www.quavlive.com/en/adaptive-streaming/|archive-date=19 April 2014|url-status=dead}}</ref><ref>{{cite journal | author1=Chris Knowlton | title=Adaptive Streaming Comparison | url=http://www.iis.net/learn/media/smooth-streaming/adaptive-streaming-comparison | date=28 January 2010}}</ref>
However, these criticisms have been outweighed in practice by the economics and scalability of HTTP delivery: whereas non-HTTP streaming solutions require massive deployment of specialized streaming server infrastructure, HTTP-based adaptive bit-rate streaming can leverage the same HTTP web servers used to deliver all other content over the Internet.{{cn|date=May 2023}}
With no single clearly defined or open standard for the [[digital rights management]] used in the above methods, there is no 100% compatible way of delivering restricted or time-sensitive content to any device or player. This also proves to be a problem with digital rights management being employed by any streaming protocol.
The method of segmenting files into smaller files used by some implementations (as used by [[HTTP Live Streaming]]) could be deemed unnecessary due to the ability of HTTP clients to request byte ranges from a single video asset file that could have multiple video tracks at differing bit rates with the manifest file only indicating track number and bit rate. However, this approach allows for serving of chunks by any simple HTTP server and so therefore guarantees [[Content delivery network|CDN]] compatibility. Implementations using byte ranges such as [[Microsoft Smooth Streaming Protocol|Microsoft Smooth Streaming]] require a dedicated HTTP server such as [[Internet Information Services|IIS]] to respond to the requests for video asset chunks.
== See also ==
*[[Multiple description coding]]
*[[Hierarchical modulation]] – alternative with reduced storage and authoring demands |
==References==
{{reflist}}
==Further reading==
* [http://pro.gigaom.com/2009/06/how-to-deliver-as-much-video-as-users-can-take/ The Next Big Thing in Video: Adaptive Bitrate Streaming] {{Webarchive|url=https://web.archive.org/web/20100619225207/http://pro.gigaom.com/2009/06/how-to-deliver-as-much-video-as-users-can-take/ |date=19 June 2010 }}
{{DEFAULTSORT:Adaptive Bit Rate}}
Latest revision as of 22:07, 3 December 2024
Adaptive bitrate streaming is a technique used in streaming multimedia over computer networks.
While in the past most video or audio streaming technologies utilized streaming protocols such as RTP with RTSP, today's adaptive streaming technologies are based almost exclusively on HTTP,[1] and are designed to work efficiently over large distributed HTTP networks.
Adaptive bitrate streaming works by detecting a user's bandwidth and CPU capacity in real time, adjusting the quality of the media stream accordingly.[2] It requires the use of an encoder which encodes a single source media (video or audio) at multiple bit rates. The player client[3] switches between streaming the different encodings depending on available resources.[4] This results in providing very little buffering, faster start times and a good experience for both high-end and low-end connections.[5]
More specifically, adaptive bitrate streaming is a method of video streaming over HTTP where the source content is encoded at multiple bit rates. Each of the different bit rate streams are segmented into small multi-second parts.[6] The segment size can vary depending on the particular implementation, but they are typically between two and ten seconds.[4][6] First, the client downloads a manifest file that describes the available stream segments and their respective bit rates. During stream start-up, the client usually requests the segments from the lowest bit rate stream. If the client finds that the network throughput is greater than the bit rate of the downloaded segment, then it will request a higher bit rate segment. Later, if the client finds that the network throughput has deteriorated, it will request a lower bit rate segment. An adaptive bitrate (ABR) algorithm in the client performs the key function of deciding which bit rate segments to download, based on the current state of the network. Several types of ABR algorithms are in commercial use: throughput-based algorithms use the throughput achieved in recent prior downloads for decision-making (e.g., throughput rule in dash.js), buffer-based algorithms use only the client's current buffer level (e.g., BOLA[7] in dash.js), and hybrid algorithms combine both types of information (e.g., DYNAMIC[8] in dash.js).
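As a rough illustration of the throughput-based family of rules (the bitrate ladder, safety factor, and all names below are invented for this sketch, not taken from dash.js), a client might estimate throughput from recent downloads and pick the highest rendition that fits under that estimate:

```python
# Illustrative sketch of a throughput-based ABR rule; ladder values,
# the safety factor, and function names are hypothetical.
BITRATE_LADDER_KBPS = [400, 800, 1600, 2500, 5000]  # assumed renditions
SAFETY_FACTOR = 0.8  # headroom so a throughput dip does not stall playback

def estimate_throughput_kbps(recent_kbps):
    """Harmonic mean of recently measured segment throughputs,
    which is less skewed by one unusually fast download."""
    return len(recent_kbps) / sum(1.0 / t for t in recent_kbps)

def select_bitrate(recent_kbps):
    """Pick the highest rendition whose bit rate fits the throughput budget."""
    budget = SAFETY_FACTOR * estimate_throughput_kbps(recent_kbps)
    fitting = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return fitting[-1] if fitting else BITRATE_LADDER_KBPS[0]
```

A buffer-based rule such as BOLA would instead map the current buffer occupancy to a rung of the same ladder, and a hybrid rule consults both signals.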
Current uses
Post-production houses, content delivery networks and studios use adaptive bit rate technology in order to provide consumers with higher quality video using less manpower and fewer resources. The creation of multiple video outputs, particularly for adaptive bit rate streaming, adds great value to consumers.[9] When the technology works properly, content plays back without interruption and the adaptation itself goes unnoticed by the end user. Media companies have been actively using adaptive bit rate technology for many years, and it has essentially become standard practice for high-end streaming providers: playback begins with a low-resolution rendition, climbs in quality as conditions allow, and requires little buffering even for high-resolution feeds.
Benefits of adaptive bitrate streaming
Traditional server-driven adaptive bitrate streaming provides consumers of streaming media with the best-possible experience, since the media server automatically adapts to any changes in each user's network and playback conditions.[10] The media and entertainment industry also benefits from adaptive bitrate streaming. As the video space grows, content delivery networks and video providers can provide customers with a superior viewing experience. Adaptive bitrate technology requires additional encoding, but simplifies the overall workflow and creates better results.
HTTP-based adaptive bitrate streaming technologies yield additional benefits over traditional server-driven adaptive bitrate streaming. First, since the streaming technology is built on top of HTTP, unlike RTP-based adaptive streaming, the packets have no difficulty traversing firewall and NAT devices. Second, since HTTP streaming is purely client-driven, all adaptation logic resides at the client. This removes the need for persistent connections between server and client application. Furthermore, the server is not required to maintain session state information on each client, increasing scalability. Finally, existing HTTP delivery infrastructure, such as HTTP caches and servers, can be seamlessly adopted.[11][12][13][14]
A scalable CDN is used to deliver media streaming to an Internet audience. The CDN receives the stream from the source at its Origin server, then replicates it to many or all of its Edge cache servers. The end-user requests the stream and is redirected to the "closest" Edge server. This can be tested using libdash[15] and the Distributed DASH (D-DASH) dataset,[16] which has several mirrors across Europe, Asia and the US. Because HTTP-based adaptive streaming lets the Edge server run simple HTTP server software that is inexpensive or free to license, it reduces software licensing costs compared to costly media server licences (e.g. Adobe Flash Media Streaming Server). The CDN cost for HTTP streaming media is then similar to HTTP web caching CDN cost.
History
Adaptive bit rate over HTTP was created by the DVD Forum at the WG1 Special Streaming group in October 2002. The group was co-chaired by Toshiba and Phoenix Technologies. The expert group counted on the collaboration of Microsoft, Apple Computer, DTS Inc., Warner Brothers, 20th Century Fox, Digital Deluxe, Disney, Macromedia and Akamai.[dubious – discuss][citation needed] The technology was originally called DVDoverIP and was an integral effort of the DVD ENAV book.[17] The concept came from storing MPEG-1 and MPEG-2 DVD TS sectors in small 2KB files, which would be served using an HTTP server to the player. The MPEG-1 segments provided the lower bandwidth stream, while the MPEG-2 segments provided a higher bit rate stream. The original XML schema provided a simple playlist of bit rates, languages and url servers. The first working prototype was presented to the DVD Forum by Phoenix Technologies at the Harman Kardon Lab in Villingen, Germany.[citation needed]
Implementations
Adaptive bit rate streaming was introduced by Move Networks in 2006[18] and is now being developed and utilized by Adobe Systems, Apple, Microsoft and Octoshape.[19] In October 2010, Move Networks was awarded a patent for their adaptive bit rate streaming (US patent number 7818444).[20]
Dynamic Adaptive Streaming over HTTP (DASH)
Dynamic Adaptive Streaming over HTTP (DASH), also known as MPEG-DASH, is the only adaptive bit-rate HTTP-based streaming solution that is an international standard.[21] MPEG-DASH technology was developed under MPEG. Work on DASH started in 2010; it became a Draft International Standard in January 2011 and an International Standard in November 2011.[21][22][23] The MPEG-DASH standard was published as ISO/IEC 23009-1:2012 in April 2012.
MPEG-DASH is a technology related to Adobe Systems HTTP Dynamic Streaming, Apple Inc. HTTP Live Streaming (HLS) and Microsoft Smooth Streaming.[24] DASH is based on Adaptive HTTP streaming (AHS) in 3GPP Release 9 and on HTTP Adaptive Streaming (HAS) in Open IPTV Forum Release 2.[25] As part of their collaboration with MPEG, 3GPP Release 10 has adopted DASH (with specific codecs and operating modes) for use over wireless networks.[25]
The goal of standardizing an adaptive streaming solution is to assure the market that the solution can work universally, unlike other solutions that are more specific to certain vendors, such as Apple’s HLS, Microsoft’s Smooth Streaming, or Adobe’s HDS.
Available implementations are the HTML5-based bitdash MPEG-DASH player[26] as well as the open source C++-based DASH client access library libdash of bitmovin GmbH,[15] the DASH tools of the Institute of Information Technology (ITEC) at Alpen-Adria University Klagenfurt,[3][27] the multimedia framework of the GPAC group at Telecom ParisTech,[28] and the dash.js[29] player of the DASH-IF.
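For illustration only (the URLs, codec strings, durations and representation IDs below are invented, and many optional elements are omitted), a minimal static MPEG-DASH manifest (MPD) describing one video adaptation set with two representations might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     mediaPresentationDuration="PT60S"
     profiles="urn:mpeg:dash:profile:isoff-live:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <!-- One segment template shared by all renditions -->
      <SegmentTemplate timescale="1000" duration="4000" startNumber="1"
                       initialization="$RepresentationID$/init.mp4"
                       media="$RepresentationID$/seg-$Number$.m4s"/>
      <Representation id="360p" bandwidth="800000"
                      width="640" height="360" codecs="avc1.64001e"/>
      <Representation id="720p" bandwidth="2500000"
                      width="1280" height="720" codecs="avc1.64001f"/>
    </AdaptationSet>
  </Period>
</MPD>
```

The client reads the per-representation `bandwidth` attributes from the MPD and then requests four-second segments from whichever representation its ABR algorithm selects.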
Apple HTTP Live Streaming (HLS)
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking down media streams or files into short pieces (media segments) which are stored as MPEG-TS or fragmented MP4 files. This is typically done at multiple bitrates using a stream or file segmenter application, also known as a packager. One such segmenter implementation is provided by Apple.[30] Additional packagers are available, including free and open-source offerings such as Google's Shaka Packager[31] as well as various commercial tools such as Unified Streaming.[32] The segmenter is also responsible for producing a set of playlist files in the M3U8 format which describe the media chunks. Each playlist is specific to a given bitrate, and contains the relative or absolute URLs to the chunks for that bitrate. The client is then responsible for requesting the appropriate playlist depending on available bandwidth.
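As a hedged illustration (the URLs, bitrates and segment names are invented), a master playlist lists the available variant playlists with their bandwidths, and each variant playlist lists that rendition's segments:

```m3u8
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
hi/prog_index.m3u8

# A variant playlist (e.g. low/prog_index.m3u8) then enumerates segments:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXT-X-ENDLIST
```

The client downloads the master playlist once, then switches between variants by fetching segments listed in a different variant playlist.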
HTTP Live Streaming is a standard feature in the iPhone 3.0 and newer versions.[33]
Apple has submitted its solution to the IETF for consideration as an Informational Request for Comments.[34] This was officially accepted as RFC 8216. A number of proprietary and open source solutions exist for both the server implementation (segmenter) and the client player.
HLS streams can be identified by the playlist URL format extension of m3u8 or MIME type of application/vnd.apple.mpegurl.[35] These adaptive streams can be made available in many different bitrates and the client device interacts with the server to obtain the best available bitrate which can reliably be delivered.
Playback of HLS is supported on many platforms including Safari and native apps on macOS / iOS, Microsoft Edge on Windows 10, ExoPlayer on Android, and the Roku platform. Many Smart TVs also have native support for HLS. Playing HLS on other platforms like Chrome / Firefox is typically achieved via a browser / JavaScript player implementation. Many open source and commercial players are available including hls.js, video.js http-streaming, BitMovin, JWPlayer, THEOplayer, etc.
Adobe HTTP Dynamic Streaming (HDS)
"HTTP Dynamic streaming is the process of efficiently delivering streaming video to users by dynamically switching among different streams of varying quality and size during playback. This provides users with the best possible viewing experience their bandwidth and local computer hardware (CPU) can support. Another major goal of dynamic streaming is to make this process smooth and seamless to users, so that if up-scaling or down-scaling the quality of the stream is necessary, it is a smooth and nearly unnoticeable switch without disrupting the continuous playback."[36]
The latest versions of Flash Player and Flash Media Server support adaptive bit-rate streaming over the traditional RTMP protocol, as well as HTTP, similar to the HTTP-based solutions from Apple and Microsoft;[37] HTTP dynamic streaming is supported in Flash Player 10.1 and later.[38] HTTP-based streaming has the advantage of not requiring any firewall ports to be opened outside of the normal ports used by web browsers. HTTP-based streaming also allows video fragments to be cached by browsers, proxies, and CDNs, drastically reducing the load on the source server.
Microsoft Smooth Streaming (MSS)
Smooth Streaming is an IIS Media Services extension that enables adaptive streaming of media to clients over HTTP.[39] The format specification is based on the ISO base media file format and standardized by Microsoft as the Protected Interoperable File Format.[40] Microsoft is actively involved with 3GPP, MPEG and DECE organizations' efforts to standardize adaptive bit-rate HTTP streaming. Microsoft provides Smooth Streaming Client software development kits for Silverlight and Windows Phone 7, as well as a Smooth Streaming Porting Kit that can be used for other client operating systems, such as Apple iOS, Android, and Linux.[41] IIS Media Services 4.0, released in November 2010, introduced a feature which enables Live Smooth Streaming H.264/AAC videos to be dynamically repackaged into the Apple HTTP Adaptive Streaming format and delivered to iOS devices without the need for re-encoding. Microsoft has successfully demonstrated delivery of both live and on-demand 1080p HD video with Smooth Streaming to Silverlight clients. In 2010, Microsoft also partnered with NVIDIA to demonstrate live streaming of 1080p stereoscopic 3D video to PCs equipped with NVIDIA 3D Vision technology.[42]
Common Media Application Format (CMAF)
CMAF is a presentation container format used for the delivery of both HLS and MPEG-DASH. Hence it is intended to simplify delivery of HTTP-based streaming media. It was proposed in 2016 by Apple and Microsoft and officially published in 2018.[43]
QuavStreams Adaptive Streaming over HTTP
QuavStreams Adaptive Streaming is a multimedia streaming technology developed by Quavlive. The streaming server is an HTTP server that has multiple versions of each video, encoded at different bitrates and resolutions. The server delivers the encoded video/audio frames switching from one level to another, according to the currently available bandwidth. The control is entirely server-based, so the client does not need special additional features. The streaming control employs feedback control theory.[44] Currently, QuavStreams supports H.264/MP3 codecs muxed into the FLV container and VP8/Vorbis codecs muxed into the WEBM container.
Uplynk
Uplynk delivers HD adaptive bitrate streaming to multiple platforms, including iOS, Android, Windows, Mac, Linux, and Roku, across various browser combinations, by encoding video in the cloud using a single non-proprietary adaptive streaming format. Rather than streaming and storing multiple formats for different platforms and devices, Uplynk stores and streams only one. The first studio to use this technology for delivery was Disney–ABC Television Group, using it for video encoding for web, mobile and tablet streaming apps on the ABC Player, ABC Family and Watch Disney apps, as well as the live Watch Disney Channel, Watch Disney Junior, and Watch Disney XD.[45][46]
Self-learning clients
In recent years, the benefits of self-learning algorithms in adaptive bitrate streaming have been investigated in academia. While most of the initial self-learning approaches are implemented at the server-side[47][48][49] (e.g. performing admission control using reinforcement learning or artificial neural networks), more recent research focuses on the development of self-learning HTTP Adaptive Streaming clients. Multiple approaches have been presented in the literature using the SARSA[50] or Q-learning[51] algorithm. In all of these approaches, the client state is modeled using, among others, information about the current perceived network throughput and buffer filling level. Based on this information, the self-learning client autonomously decides which quality level to select for the next video segment. The learning process is steered using feedback information, representing the Quality of Experience (QoE) (e.g. based on the quality level, the number of switches and the number of video freezes). Furthermore, it was shown that multi-agent Q-learning can be applied to improve QoE fairness among multiple adaptive streaming clients.[52]
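As a toy sketch of the Q-learning idea described above (the state discretization, reward shaping and constants here are invented for illustration and are not taken from the cited papers), a client could maintain a Q-table over (buffer level, quality) pairs and update it after each segment:

```python
import random

# Hypothetical quality ladder and a coarsely discretized buffer state.
BITRATES_KBPS = [800, 2500, 6000]
BUFFER_BINS = 4
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration

# Q-table over (buffer bin, quality level) pairs, initialized to zero.
Q = {(s, a): 0.0 for s in range(BUFFER_BINS) for a in range(len(BITRATES_KBPS))}

def choose_action(state):
    """Epsilon-greedy selection of the quality level for the next segment."""
    if random.random() < EPSILON:
        return random.randrange(len(BITRATES_KBPS))
    return max(range(len(BITRATES_KBPS)), key=lambda a: Q[(state, a)])

def reward_for(action, rebuffered):
    """QoE-style feedback: reward higher quality, heavily penalize a freeze."""
    return BITRATES_KBPS[action] / 1000.0 - (10.0 if rebuffered else 0.0)

def update(state, action, reward, next_state):
    """One-step Q-learning update after the segment finishes downloading."""
    best_next = max(Q[(next_state, a)] for a in range(len(BITRATES_KBPS)))
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

A real client would also fold throughput observations and quality-switch counts into the state and reward, as the cited work does.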
Criticisms
HTTP-based adaptive bit rate technologies are significantly more operationally complex than traditional streaming technologies. Documented considerations include additional storage and encoding costs, and challenges with maintaining quality globally. Problematic dynamics have also been observed when complex adaptive bit rate logic competes with complex TCP flow control logic.[11][53][54][55][56]
However, these criticisms have been outweighed in practice by the economics and scalability of HTTP delivery: whereas non-HTTP streaming solutions require massive deployment of specialized streaming server infrastructure, HTTP-based adaptive bit-rate streaming can leverage the same HTTP web servers used to deliver all other content over the Internet.[citation needed]
With no single clearly defined or open standard for the digital rights management used in the above methods, there is no 100% compatible way of delivering restricted or time-sensitive content to any device or player. This also proves to be a problem with digital rights management being employed by any streaming protocol.
The method of segmenting files into smaller files used by some implementations (such as HTTP Live Streaming) could be deemed unnecessary given the ability of HTTP clients to request byte ranges from a single video asset file that could have multiple video tracks at differing bit rates, with the manifest file only indicating track number and bit rate. However, the segmented approach allows chunks to be served by any simple HTTP server and therefore guarantees CDN compatibility. Implementations using byte ranges, such as Microsoft Smooth Streaming, require a dedicated HTTP server such as IIS to respond to requests for video asset chunks.
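The two addressing schemes can be contrasted in a short sketch. The manifest layout, file names and byte offsets below are hypothetical, invented for illustration; they do not come from any particular implementation.

```python
# Hypothetical manifest for one asset in the byte-range approach: each track
# (bit rate) is a single file, with per-segment byte offsets recorded at
# packaging time.
MANIFEST = {
    "video_400k": {"file": "movie_400k.mp4",
                   "offsets": [0, 81920, 170000, 261000]},
    "video_2500k": {"file": "movie_2500k.mp4",
                    "offsets": [0, 524288, 1048576, 1572864]},
}

def segment_url(track, index):
    # Segmented approach (HLS-style): every segment is its own file, so any
    # plain HTTP server or CDN can serve it with an ordinary GET.
    return f"/{track}/segment_{index}.ts"

def range_request(track, index):
    # Byte-range approach: one file per track; the client asks for a slice
    # of it using a standard HTTP Range header (byte positions inclusive).
    info = MANIFEST[track]
    start = info["offsets"][index]
    end = info["offsets"][index + 1] - 1
    return info["file"], {"Range": f"bytes={start}-{end}"}
```

The segmented variant only ever issues whole-file GETs, whereas the byte-range variant depends on the server honouring Range requests against the packaged asset, which is the CDN-compatibility trade-off described above.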
See also
- Multiple description coding
- Hierarchical modulation – alternative with reduced storage and authoring demands
References
- ^ Saamer Akhshabi; Ali C. Begen; Constantine Dovrolis (2011). An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP. In Proceedings of the second annual ACM conference on Multimedia systems (MMSys '11). New York, NY, USA: ACM.
- ^ A. Bentaleb, B. Taani, A. Begen, C. Timmerer, and R. Zimmermann, "A Survey on Bitrate Adaptation Schemes for Streaming Media over HTTP", IEEE Communications Surveys & Tutorials (IEEE COMST), Volume 1 Issue 1, pp. 1-1, 2018.
- ^ a b DASH at ITEC, VLC Plugin, DASHEncoder and Dataset by C. Mueller, S. Lederer, C. Timmerer
- ^ a b "Proceedings Template – WORD" (PDF). Retrieved 16 December 2017.
- ^ Gannes, Liz (10 June 2009). "The Next Big Thing in Video: Adaptive Bitrate Streaming". Archived from the original on 19 June 2010. Retrieved 1 June 2010.
- ^ a b "mmsys2012-final36.pdf" (PDF). Retrieved 16 December 2017.
- ^ Spiteri, Kevin; Urgaonkar, Rahul; Sitaraman, Ramesh K. (2016). "BOLA: Near-optimal bitrate adaptation for online videos". IEEE INFOCOM, April 2016. arXiv:1601.06748. doi:10.1109/TNET.2020.2996964. S2CID 219792107.
- ^ "From Theory to Practice: Improving Bitrate Adaptation in the DASH Reference Player, by Spiteri, Sitaraman and Sparacio, ACM Multimedia Systems Conference, June 2018" (PDF).
- ^ Marshall, Daniel (18 February 2010). "Show Report: Video Processing Critical to Digital Asset Management". Elemental Technologies. Archived from the original on 4 October 2011. Retrieved 15 October 2011.
- ^ Seufert, Michael; Egger, Sebastian; Slanina, Martin; Zinner, Thomas; Hoßfeld, Tobias; Tran-Gia, Phuoc (2015). "A Survey on Quality of Experience of HTTP Adaptive Streaming". IEEE Communications Surveys & Tutorials. 17 (1): 469–492. doi:10.1109/COMST.2014.2360940. S2CID 18220375.
- ^ a b Saamer Akhshabi; Ali C. Begen; Constantine Dovrolis. "An Experimental Evaluation of Rate-Adaptation Algorithms in Adaptive Streaming over HTTP" (PDF). Archived from the original (PDF) on 17 October 2011. Retrieved 15 October 2011.
- ^ Anthony Vetro. "The MPEG-DASH Standard for Multimedia Streaming Over the Internet" (PDF). Retrieved 10 July 2015.
- ^ Jan Ozer (28 April 2011). "What Is Adaptive Streaming?". Retrieved 10 July 2015.
- ^ Jeroen Famaey; Steven Latré; Niels Bouten; Wim Van de Meerssche; Bart de Vleeschauwer; Werner Van Leekwijck; Filip De Turck (May 2013). "On the merits of SVC-based HTTP Adaptive Streaming": 419–426. Retrieved 10 July 2015.
- ^ a b libdash: Open-source DASH client library by bitmovin
- ^ "Distributed DASH Datset | ITEC – Dynamic Adaptive Streaming over HTTP". Itec.uni-klu.ac.at. Retrieved 16 December 2017.
- ^ DVD Book Construction, DVD Forum, May 2005
- ^ Yang, Hongyun (2014). "Opportunities and Challenges of HTTP Adaptive Streaming" (PDF). International Journal of Future Generation Communication and Networking. 7 (6): 165–180.
- ^ Gannes, Liz (10 June 2009). "The Lowdown on Apple's HTTP Adaptive Bitrate Streaming". Archived from the original on 19 June 2010. Retrieved 24 June 2010.
- ^ "Move Gets Streaming Patent; Are Adobe & Apple Hosed? – Online Video News". Gigaom.com. 15 September 2010. Archived from the original on 22 October 2011. Retrieved 15 October 2011.
- ^ a b "MPEG ratifies its draft standard for DASH". MPEG. 2 December 2011. Archived from the original on 20 August 2012. Retrieved 26 August 2012.
- ^ Timmerer, Christian (26 April 2012). "HTTP streaming of MPEG media – blog entry". Multimediacommunication.blogspot.com. Retrieved 16 December 2017.
- ^ "ISO/IEC DIS 23009-1.2 Dynamic adaptive streaming over HTTP (DASH)". Iso.org. Retrieved 16 December 2017.
- ^ Updates on DASH – blog entry
- ^ a b ETSI 3GPP 3GPP TS 26.247; Transparent end-to-end packet-switched streaming service (PSS); Progressive Download and Dynamic Adaptive Streaming over HTTP (3GP-DASH)
- ^ "bitdash HTML5 MPEG-DASH player". Dash-player.com. 22 January 2016. Archived from the original on 10 July 2016. Retrieved 16 December 2017.
- ^ "A VLC media player plugin enabling dynamic adaptive streaming over HTTP" (PDF). Retrieved 16 December 2017.
- ^ "GPAC Telecom ParisTech". Archived from the original on 24 February 2012. Retrieved 28 March 2013.
- ^ "dash.js". Github.com. Retrieved 16 December 2017.
- ^ Mac Developer Library, Apple, retrieved 2 June 2014
- ^ Shaka Packager Github Repository, Google, retrieved 3 January 2023
- ^ Unified Streaming, Unified Streaming, retrieved 3 January 2023
- ^ Prince McLean (9 July 2009). "Apple launches HTTP Live Streaming standard in iPhone 3.0". AppleInsider. Retrieved 15 October 2011.
- ^ R. Pantos, HTTP Live Streaming, IETF, retrieved 11 October 2011
- ^ RFC 8216. sec. 4. doi:10.17487/RFC8216.
- ^ Hassoun, David. "Dynamic streaming in Flash Media Server 3.5 – Part 1: Overview of the new capabilities". Adobe Developer Connection. Adobe Systems. Archived from the original on 30 March 2014.
- ^ "HTTP Dynamic Streaming". Adobe Systems. Retrieved 13 October 2010.
- ^ "FAQ HTTP Dynamic Streaming". Adobe Systems. Retrieved 12 January 2015.
- ^ "Smooth Streaming". IIS.net. Archived from the original on 15 June 2010. Retrieved 24 June 2010.
- ^ Chris Knowlton (8 September 2009), Protected Interoperable File Format, Microsoft, retrieved 15 October 2011
- ^ "Microsoft End-to-End Platform Powers Next-Generation Silverlight and IIS Media Experiences Across Multiple Screens". Microsoft. 8 April 2010. Retrieved 30 July 2011.
- ^ "First Day of IBC". Microsoft. Archived from the original on 2 February 2011. Retrieved 22 January 2011.
- ^ Traci Ruether (23 January 2019). "What Is CMAF?". Retrieved 13 January 2022.
- ^ Luca De Cicco; Saverio Mascolo; Vittorio Palmisano. "Feedback Control for Adaptive Live Video Streaming" (PDF). MMSYS2011. Retrieved 9 September 2012.
- ^ Dean Takahashi (16 January 2013). "Uplynk creates a cheap and efficient way for Disney to stream videos". VentureBeat. Retrieved 16 December 2017.
- ^ Dreier, Troy (16 January 2013). "UpLynk Emerges from Stealth Mode; DisneyABC Is First Customer – Streaming Media Magazine". Streamingmedia.com. Retrieved 16 December 2017.
- ^ Y. Fei; V. W. S. Wong; V. C. M. Leung (2006). "Efficient QoS provisioning for adaptive multimedia in mobile communication networks by reinforcement learning". Mobile Networks and Applications. 11 (1): 101–110. CiteSeerX 10.1.1.70.1430. doi:10.1007/s11036-005-4464-2. S2CID 13022779.
- ^ V. Charvillat; R. Grigoras (2007). "Reinforcement learning for dynamic multimedia adaptation". Journal of Network and Computer Applications. 30 (3): 1034–1058. doi:10.1016/j.jnca.2005.12.010.
- ^ D. W. McClary; V. R. Syrotiuk; V. Lecuire (2008). "Adaptive audio streaming in mobile ad hoc networks using neural networks". Ad Hoc Networks. 6 (4): 524–538. doi:10.1016/j.adhoc.2007.04.005.
- ^ V. Menkovski; A. Liotta (2013). "Intelligent control for adaptive video streaming". IEEE International Conference on Consumer Electronics (ICCE). Washington, DC. pp. 127–128. doi:10.1109/ICCE.2013.6486825.
- ^ M. Claeys; S. Latré; J. Famaey; F. De Turck (2014). "Design and evaluation of a self-learning HTTP adaptive video streaming client". IEEE Communications Letters. 18 (4): 716–719. doi:10.1109/lcomm.2014.020414.132649. hdl:1854/LU-5733061. S2CID 26955239.
- ^ S. Petrangeli; M. Claeys; S. Latré; J. Famaey; F. De Turck (2014). "A multi-agent Q-Learning-based framework for achieving fairness in HTTP Adaptive Streaming". IEEE Network Operations and Management Symposium (NOMS). Krakow. pp. 1–9. doi:10.1109/NOMS.2014.6838245.
- ^ Pete Mastin (28 January 2011). "Is adaptive bit rate the yellow brick road, or fool's gold for HD streaming?". Archived from the original on 7 September 2011. Retrieved 15 October 2011.
- ^ Luca De Cicco; Saverio Mascolo. "An Experimental Investigation of the Akamai Adaptive Video Streaming" (PDF). Retrieved 29 November 2011.
- ^ "Adaptive streaming: a comparison". Archived from the original on 19 April 2014. Retrieved 17 April 2014.
- ^ Chris Knowlton (28 January 2010). "Adaptive Streaming Comparison".
Further reading
- The Next Big Thing in Video: Adaptive Bitrate Streaming Archived 19 June 2010 at the Wayback Machine