CNProj Final Report

The document discusses the history and evolution of native video playback capabilities in web browsers. It covers:
- Early web video relied on plugins like Flash due to the lack of native support in browsers. HTML5 added the <video> tag to allow native playback.
- The <video> tag links directly to video files, similar to the <img> tag for images. Attributes like src, width, and height are used.
- The Media Source Extensions specification allows dynamically updating video playback from JavaScript using MediaSource objects and SourceBuffers.
- Video can be streamed in segments to enable features like adaptive bitrate streaming based on network conditions. Segments in multiple qualities are stored on servers.


Need for a native video API

From the early to late 2000s, video playback on the web mostly relied on the Flash plugin.
This was because, at the time, there was no other means to stream video in a browser. Users had
the choice between either installing third-party plugins like Flash or Silverlight, or not being able
to play any video at all.

To fill that hole, the WHATWG began to work on a new version of the HTML standard
including, among other things, native video and audio playback.
This standard became what is now known as HTML5. Thus HTML5 brought, among other things,
the <video> tag to the web. This new tag allows you to link to a video directly from the HTML,
much like an <img> tag would do for an image.

The video tag

As described above, linking to a video in a page is pretty straightforward in HTML5: you just
add a video tag to the page, with a few attributes.

For example:

<html>
  <head>
    <meta charset="UTF-8">
    <title>My Video</title>
  </head>
  <body>
    <video src="some_video.mp4" width="1280" height="720"></video>
  </body>
</html>
This HTML will allow the page to stream some_video.mp4 directly on any browser that supports
the corresponding codecs (and HTML5, of course).
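
Whether a given browser can actually decode a particular kind of file can be checked from JavaScript with the canPlayType method of the video element. The following is only a minimal sketch; the MIME type and codec string are illustrative examples:

const videoTag = document.createElement("video");

// canPlayType returns "", "maybe" or "probably", depending on how
// confident the browser is that it can decode this kind of file
const support = videoTag.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');

if (support === "") {
  console.log("This browser cannot play that type of MP4 file natively.");
}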

Media Source Extensions

The “Media Source Extensions” (more often shortened to just “MSE”) is a specification from the
W3C that most browsers implement today. It was created to allow more complex media use cases
directly with HTML and JavaScript.

Those “extensions” add the MediaSource object to JavaScript. As its name suggests, this will be
the source of the video, or put more simply, this is the object representing our video’s data.

As described before, we still use the HTML5 video tag and its src attribute. Only this time, we're
not adding a link to the video, we're adding a link to the MediaSource object.

To allow this kind of use case, the W3C defined the URL.createObjectURL static method. This
API allows creating a URL which will actually refer not to a resource available online, but
directly to a JavaScript object created on the client.

This is thus how a MediaSource is attached to a video tag:


const videoTag = document.getElementById("my-video");

// creating the MediaSource, just with the "new" keyword, and the URL for it
const myMediaSource = new MediaSource();
const url = URL.createObjectURL(myMediaSource);

// attaching the MediaSource to the video tag
videoTag.src = url;

Source Buffers

The video is not actually “pushed” directly into the MediaSource for playback; SourceBuffers are
used for that.

A MediaSource contains one or multiple instances of those, each being associated with a type of
content.

To simplify, we have only three possible types:

- audio
- video
- both audio and video

SourceBuffers are all linked to a single MediaSource, and each will be used to add our video’s
data to the HTML5 video tag directly from JavaScript.

As an example, a frequent use case is to have two source buffers on our MediaSource: one for the
video data, and the other for the audio.

Separating video and audio also allows managing them separately on the server-side, which
leads to several advantages. This is how it works:

const videoTag = document.getElementById("my-video");

const myMediaSource = new MediaSource();
const url = URL.createObjectURL(myMediaSource);
videoTag.src = url;

// 1. add source buffers
const audioSourceBuffer = myMediaSource
  .addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
const videoSourceBuffer = myMediaSource
  .addSourceBuffer('video/mp4; codecs="avc1.64001e"');

// 2. download and add our audio/video to the SourceBuffers

// for the audio SourceBuffer
fetch("http://server.com/audio.mp4").then(function(response) {
  // The data has to be a JavaScript ArrayBuffer
  return response.arrayBuffer();
}).then(function(audioData) {
  audioSourceBuffer.appendBuffer(audioData);
});

// the same for the video SourceBuffer
fetch("http://server.com/video.mp4").then(function(response) {
  // The data has to be a JavaScript ArrayBuffer
  return response.arrayBuffer();
}).then(function(videoData) {
  videoSourceBuffer.appendBuffer(videoData);
});
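
One caveat worth noting: a MediaSource only accepts SourceBuffers once it is “open”, which happens asynchronously after it has been attached to the video tag. A minimal sketch of waiting for the corresponding sourceopen event, reusing the variable names from the snippet above, could look like:

videoTag.src = URL.createObjectURL(myMediaSource);

// addSourceBuffer throws if the MediaSource is not open yet,
// so we wait for the "sourceopen" event before adding buffers
myMediaSource.addEventListener("sourceopen", function() {
  const audioSourceBuffer = myMediaSource
    .addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
  const videoSourceBuffer = myMediaSource
    .addSourceBuffer('video/mp4; codecs="avc1.64001e"');
  // ... then fetch and append the data as shown above
});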

Media Segments

In the previous code snippets, we assumed that the audio and video files are two
whole files, pushed completely at once.
What actually happens in advanced video players is that video and audio data are split into
multiple “segments”. These segments can come in various sizes, but they often represent between
2 and 10 seconds of content.
All those video/audio segments then form the complete video/audio content. Those “chunks” of
data add a whole new level of flexibility: instead of pushing the whole content at once, we can
push multiple segments progressively.

The audio or video files might not truly be segmented on the server-side; the Range HTTP header
might be used instead by the client to obtain those files in a segmented way.
This means that we thankfully do not have to wait for the whole audio or video content to be
downloaded to begin playback. We often just need the first segment of each.
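
As an illustration of the Range-based approach mentioned above, requesting only a byte range of a file from JavaScript could look like the following sketch. The URL and byte range here are arbitrary examples, the server has to support range requests, and the requested range would have to correspond to a valid media segment for the append to work:

// ask the server for only the first 500 000 bytes of the file
fetch("http://server.com/audio.mp4", {
  headers: { Range: "bytes=0-499999" }
}).then(function(response) {
  // a server honoring the Range header answers with a 206 (Partial Content) status
  return response.arrayBuffer();
}).then(function(segmentData) {
  // the player then appends this data to the right SourceBuffer,
  // exactly as with the pre-segmented files shown before
  audioSourceBuffer.appendBuffer(segmentData);
});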

Adaptive Streaming

Many video players have an “auto quality” feature, where the quality is automatically chosen
depending on the user’s network and processing capabilities. This behavior, called adaptive
streaming, is a central concern of a web player. It is also enabled thanks to the concept of media
segments.

On the server-side, the segments are actually encoded in multiple qualities. For example, our
server could have the following files stored:
./audio/
├── ./128kbps/
|   ├── segment0.mp4
|   ├── segment1.mp4
|   └── segment2.mp4
└── ./320kbps/
    ├── segment0.mp4
    ├── segment1.mp4
    └── segment2.mp4

./video/
├── ./240p/
|   ├── segment0.mp4
|   ├── segment1.mp4
|   └── segment2.mp4
└── ./720p/
    ├── segment0.mp4
    ├── segment1.mp4
    └── segment2.mp4

A web player will then automatically choose the right segments to download as the network or
CPU conditions change.

This is entirely done in JavaScript. For audio segments, it could, for example, look like the
following:

/**
 * Push an audio segment into the source buffer based on its number
 * and quality
 * @param {number} nb
 * @param {string} wantedQuality
 * @returns {Promise}
 */
function pushAudioSegment(nb, wantedQuality) {
  // The url begins to be a little more complex here:
  const url = "http://my-server/audio/" +
    wantedQuality + "/segment" + nb + ".mp4";
  return fetch(url)
    .then((response) => response.arrayBuffer())
    .then(function(arrayBuffer) {
      audioSourceBuffer.appendBuffer(arrayBuffer);
    });
}

/**
 * Translate an estimated bandwidth to the right audio
 * quality as defined on server-side.
 * @param {number} bandwidth
 * @returns {string}
 */
function fromBandwidthToQuality(bandwidth) {
  return bandwidth > 320e3 ? "320kbps" : "128kbps";
}

// first estimate the bandwidth. Most often, this is based on
// the time it took to download the last segments
const bandwidth = estimateBandwidth();
const quality = fromBandwidthToQuality(bandwidth);

pushAudioSegment(0, quality)
  .then(() => pushAudioSegment(1, quality))
  .then(() => pushAudioSegment(2, quality));
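
The estimateBandwidth function above is not a browser API; it stands for whatever bandwidth estimation the player implements. A very naive sketch, which simply derives a bits-per-second figure from how long the last segment took to download, could look like this (the measurement logic is only an illustrative assumption, not how production players actually do it):

let lastEstimatedBandwidth = 0;

/**
 * Download a segment while measuring how long it takes, and update
 * the bandwidth estimate accordingly (in bits per second).
 * @param {string} url
 * @returns {Promise.<ArrayBuffer>}
 */
function fetchAndMeasure(url) {
  const requestTime = performance.now();
  return fetch(url)
    .then((response) => response.arrayBuffer())
    .then(function(data) {
      const elapsedSeconds = (performance.now() - requestTime) / 1000;
      lastEstimatedBandwidth = (data.byteLength * 8) / elapsedSeconds;
      return data;
    });
}

function estimateBandwidth() {
  return lastEstimatedBandwidth;
}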
