videojs / mux.js
Lightweight utilities for inspecting and manipulating video container formats.
License: Apache License 2.0
We currently generate mp4 fragments around flush() calls, which correspond to segment boundaries when the input is MPEG2-TS. This causes us to split GOPs, which means the later fragment is missing the reference frame and we have to drop some video data. If we fragmented on keyframes instead, we wouldn't have this problem.
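A keyframe-based fragmenter could group frames along these lines. This is a minimal sketch, not mux.js's actual code; the frame objects with a `keyFrame` flag are an assumed shape:

```javascript
// Sketch: split a flat list of frames into GOPs so a fragment boundary
// can be placed at a keyframe instead of at an arbitrary flush() point.
// Frames arriving before the first keyframe have no reference frame
// and are dropped, mirroring the problem described above.
function groupIntoGops(frames) {
  var gops = [];
  frames.forEach(function (frame) {
    if (frame.keyFrame) {
      gops.push([]); // each keyframe opens a new GOP
    }
    if (gops.length) {
      gops[gops.length - 1].push(frame);
    }
  });
  return gops;
}
```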
I have a stream of NAL packets coming in through a websocket. I would like to know how to set things up so I can send the NAL units into the Transmuxer and have it output an MP4. I was trying it like this, but I did not see any events getting triggered:
let muxer = new muxjs.Transmuxer();
muxer.init();
muxer.on('data', segment => {
  console.log(segment);
});
function onNalUnitReceived(nalUnit) {
  muxer.push(nalUnit);
}
How do I convert the nalUnit I receive in the onNalUnitReceived callback to an MP4?
Hello,
I apologize in advance if this is not a mux.js issue, but my tests indicate that it might be.
In my current project I get MP2T packets via a websocket, convert them to fMP4 using mux.js, and then send the result to the browser (Windows 10, Chrome 60.0.3112.90) via the MSE pipeline. This works great and I am able to see the video for a while, until I get a PIPELINE_ERROR_DECODE error and the stream shuts down. This usually occurs within 30-40 minutes after the video starts playing.
In an effort to narrow down the problem I took the debug index.html (which works great when loading the whole file at once) and modified it to take a file, read it, and cut it into 40K slices, which I then feed to mux.js at a fast rate. I take the output from the 'done' event and append it to the MSE buffer. Doing this I also seem to get the PIPELINE_ERROR_DECODE from the browser after the video displays for a few minutes.
I cannot include the file I am using for testing (it's too big), but any .ts file that is over 20 minutes long at 24 frames per second can be used to reproduce this.
Here is the html page I have modified to slice the .ts file:
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title></title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width">
<link rel="stylesheet" href="css/nomalize.min.css">
<link rel="stylesheet" href="css/main.css">
<script src="js/vendor/modernizr-2.6.2.min.js"></script>
<style>
fieldset {
padding: 10px;
border: 1px solid #ccc;
}
</style>
</head>
<body>
<!--[if lt IE 7]>
<p class="chromeframe">You are using an <strong>outdated</strong> browser. Please <a href="http://browsehappy.com/">upgrade your browser</a> or <a href="http://www.google.com/chromeframe/?redirect=true">activate Google Chrome Frame</a> to improve your experience.</p>
<![endif]-->
<div class="header-container">
<header class="wrapper clearfix">
<h1 class="title">Transmux Analyzer</h1>
</header>
</div>
<div class="main-container">
<div class="main wrapper clearfix">
<article>
<header>
<p>
This page can help you inspect the results of the transmuxing to mp4 files performed by videojs-contrib-hls. It's still a
bit tricky to create a MSE-compatible fragmented MP4. We've had luck with
<a
href="http://www.bento4.com/developers/dash/">Bento4</a>
and ffmpeg. If you have both of those utilities installed, you can create a working MP4 like this:
<pre>
ffmpeg -i movie.ts -vn -codec copy -absf aac_adtstoasc movie-audio.mp4
mp4fragment --track audio --fragment-duration 11000 movie-audio.mp4 movie-audio.m4s</pre>
<small>Looking for the <a href="legacy.html">FLV tool</a>?</small>
</header>
<section>
<h2>Inputs</h2>
<form id="inputs">
<legend>
The input with the checked radio box will be loaded into the player on this page.
</legend>
<fieldset>
<legend>M2TS Input</legend>
<div>
<input id="original-active" type=radio name=active checked value="original">
<label>
Your original .TS or .AAC segment:
<input type="file" id="original">
</label>
<button id="save-muxed" type="button">Save Transmuxer Output</button>
</div>
<div>
<label><input id="combined-output" type=checkbox name=combined checked value="combined"> Remux output into a single output?
</label>
</div>
<div>
Otherwise, output only:
<label><input id="video-output" type=radio name=output disabled checked value="video"> Video
</label>
<label><input id="audio-output" type=radio name=output disabled checked value="audio"> Audio
</label>
</div>
<div>
<label><input id="reset-tranmsuxer" type=checkbox name=reset checked value="reset"> Recreate the Transmuxer & MediaSource for each file open?
</label>
</div>
</fieldset>
<fieldset>
<legend>MP4 Input</legend>
<input id="working-active" type=radio name=active value="working">
<label>
A working, MP4 version of the underlying stream
produced by another tool:
<input type="file" id="working">
</label>
</fieldset>
<div>
<label>
Codecs:
<input id="codecs" type="text" value="avc1.64001f,mp4a.40.5">
</label>
</div>
</form>
</section>
<section id="video-place">
</section>
<section>
<h2>Comparison</h2>
<div id="comparison">
A diff of the structure of the two MP4s will appear here once you've specified an input TS file and a known working MP4.
</div>
</section>
<section>
<h2>Structure</h2>
<div class="result-wrapper">
<h3>videojs-contrib-hls</h3>
<pre class="vjs-boxes">
</pre>
</div>
<div class="result-wrapper">
<h3>Working</h3>
<pre class="working-boxes"></pre>
</div>
</section>
</article>
</div>
<!-- #main -->
</div>
<!-- #main-container -->
<div class="footer-container">
<footer class="wrapper">
<h3>footer</h3>
</footer>
</div>
<script src="../bower_modules/mux.js/dist/mux.js"></script>
<!-- Include QUnit for object diffs -->
<script src="../bower_modules/mux.js/node_modules/qunitjs/qunit/qunit.js"></script>
<script>
/*! @source http://purl.eligrey.com/github/FileSaver.js/blob/master/FileSaver.js */
window.saveAs = function (view) { "use strict"; if (typeof navigator !== "undefined" && /MSIE [1-9]\./.test(navigator.userAgent)) { return } var doc = view.document, get_URL = function () { return view.URL || view.webkitURL || view }, save_link = doc.createElementNS("http://www.w3.org/1999/xhtml", "a"), can_use_save_link = "download" in save_link, click = function (node) { var event = new MouseEvent("click"); node.dispatchEvent(event) }, is_safari = /Version\/[\d\.]+.*Safari/.test(navigator.userAgent), webkit_req_fs = view.webkitRequestFileSystem, req_fs = view.requestFileSystem || webkit_req_fs || view.mozRequestFileSystem, throw_outside = function (ex) { (view.setImmediate || view.setTimeout)(function () { throw ex }, 0) }, force_saveable_type = "application/octet-stream", fs_min_size = 0, arbitrary_revoke_timeout = 500, revoke = function (file) { var revoker = function () { if (typeof file === "string") { get_URL().revokeObjectURL(file) } else { file.remove() } }; if (view.chrome) { revoker() } else { setTimeout(revoker, arbitrary_revoke_timeout) } }, dispatch = function (filesaver, event_types, event) { event_types = [].concat(event_types); var i = event_types.length; while (i--) { var listener = filesaver["on" + event_types[i]]; if (typeof listener === "function") { try { listener.call(filesaver, event || filesaver) } catch (ex) { throw_outside(ex) } } } }, auto_bom = function (blob) { if (/^\s*(?:text\/\S*|application\/xml|\S*\/\S*\+xml)\s*;.*charset\s*=\s*utf-8/i.test(blob.type)) { return new Blob(["\ufeff", blob], { type: blob.type }) } return blob }, FileSaver = function (blob, name, no_auto_bom) { if (!no_auto_bom) { blob = auto_bom(blob) } var filesaver = this, type = blob.type, blob_changed = false, object_url, target_view, dispatch_all = function () { dispatch(filesaver, "writestart progress write writeend".split(" ")) }, fs_error = function () { if (target_view && is_safari && typeof FileReader !== "undefined") { var reader = new FileReader; 
reader.onloadend = function () { var base64Data = reader.result; target_view.location.href = "data:attachment/file" + base64Data.slice(base64Data.search(/[,;]/)); filesaver.readyState = filesaver.DONE; dispatch_all() }; reader.readAsDataURL(blob); filesaver.readyState = filesaver.INIT; return } if (blob_changed || !object_url) { object_url = get_URL().createObjectURL(blob) } if (target_view) { target_view.location.href = object_url } else { var new_tab = view.open(object_url, "_blank"); if (new_tab == undefined && is_safari) { view.location.href = object_url } } filesaver.readyState = filesaver.DONE; dispatch_all(); revoke(object_url) }, abortable = function (func) { return function () { if (filesaver.readyState !== filesaver.DONE) { return func.apply(this, arguments) } } }, create_if_not_found = { create: true, exclusive: false }, slice; filesaver.readyState = filesaver.INIT; if (!name) { name = "download" } if (can_use_save_link) { object_url = get_URL().createObjectURL(blob); setTimeout(function () { save_link.href = object_url; save_link.download = name; click(save_link); dispatch_all(); revoke(object_url); filesaver.readyState = filesaver.DONE }); return } if (view.chrome && type && type !== force_saveable_type) { slice = blob.slice || blob.webkitSlice; blob = slice.call(blob, 0, blob.size, force_saveable_type); blob_changed = true } if (webkit_req_fs && name !== "download") { name += ".download" } if (type === force_saveable_type || webkit_req_fs) { target_view = view } if (!req_fs) { fs_error(); return } fs_min_size += blob.size; req_fs(view.TEMPORARY, fs_min_size, abortable(function (fs) { fs.root.getDirectory("saved", create_if_not_found, abortable(function (dir) { var save = function () { dir.getFile(name, create_if_not_found, abortable(function (file) { file.createWriter(abortable(function (writer) { writer.onwriteend = function (event) { target_view.location.href = file.toURL(); filesaver.readyState = filesaver.DONE; dispatch(filesaver, "writeend", 
event); revoke(file) }; writer.onerror = function () { var error = writer.error; if (error.code !== error.ABORT_ERR) { fs_error() } }; "writestart progress write abort".split(" ").forEach(function (event) { writer["on" + event] = filesaver["on" + event] }); writer.write(blob); filesaver.abort = function () { writer.abort(); filesaver.readyState = filesaver.DONE }; filesaver.readyState = filesaver.WRITING }), fs_error) }), fs_error) }; dir.getFile(name, { create: false }, abortable(function (file) { file.remove(); save() }), abortable(function (ex) { if (ex.code === ex.NOT_FOUND_ERR) { save() } else { fs_error() } })) }), fs_error) }), fs_error) }, FS_proto = FileSaver.prototype, saveAs = function (blob, name, no_auto_bom) { return new FileSaver(blob, name, no_auto_bom) }; if (typeof navigator !== "undefined" && navigator.msSaveOrOpenBlob) { return function (blob, name, no_auto_bom) { if (!no_auto_bom) { blob = auto_bom(blob) } return navigator.msSaveOrOpenBlob(blob, name || "download") } } FS_proto.abort = function () { var filesaver = this; filesaver.readyState = filesaver.DONE; dispatch(filesaver, "abort") }; FS_proto.readyState = FS_proto.INIT = 0; FS_proto.WRITING = 1; FS_proto.DONE = 2; FS_proto.error = FS_proto.onwritestart = FS_proto.onprogress = FS_proto.onwrite = FS_proto.onabort = FS_proto.onerror = FS_proto.onwriteend = null; return saveAs }(typeof self !== "undefined" && self || typeof window !== "undefined" && window || this.content); if (typeof module !== "undefined" && module.exports) { module.exports.saveAs = saveAs } else if (typeof define !== "undefined" && define !== null && define.amd != null) { define([], function () { return saveAs }) }
</script>
<script>
'use strict';
var
$ = document.querySelector.bind(document),
inputs = $('#inputs'),
original = $('#original'),
working = $('#working'),
saveButton = $('#save-muxed'),
vjsParsed,
workingParsed,
diffParsed,
vjsBytes,
workingBytes,
saveConfig,
restoreConfig,
saveTA,
vjsBoxes = $('.vjs-boxes'),
workingBoxes = $('.working-boxes'),
muxedData,
muxedName,
transmuxer,
video,
mediaSource,
logevent,
prepareSourceBuffer;
logevent = function (event) {
console.log(event.type);
};
saveTA = function (tarr, n) { var b = new Blob([tarr], { type: 'application/octet-binary' }); return window.saveAs(b, n); }
saveConfig = function () {
var inputs = [].slice.call(document.querySelectorAll('input:not([type=file])'));
inputs.forEach(function (element) {
localStorage.setItem(element.id,
JSON.stringify({
value: element.value,
checked: element.checked,
disabled: element.disabled
}));
});
};
restoreConfig = function () {
var inputs = [].slice.call(document.querySelectorAll('input:not([type=file])'));
inputs.forEach(function (element) {
var state;
state = JSON.parse(localStorage.getItem(element.id));
if (state) {
element.checked = state.checked;
element.value = state.value;
element.disabled = state.disabled;
}
});
};
document.addEventListener('DOMContentLoaded', restoreConfig);
// output a diff of the two parsed MP4s
diffParsed = function () {
var comparison, diff, transmuxed;
if (!vjsParsed || !workingParsed) {
// wait until both inputs have been provided
return;
}
comparison = $('#comparison');
transmuxed = vjsParsed;
diff = '<p>A <del>red background</del> indicates ' +
'properties present in the transmuxed file but missing from the ' +
'working version. A <ins>green background</ins> indicates ' +
'properties present in the working version but missing in the ' +
'transmuxed output.</p>';
diff += '<pre class="mp4-diff">' +
QUnit.diff(muxjs.mp4.tools.textify(transmuxed, null, ' '),
muxjs.mp4.tools.textify(workingParsed, null, ' ')) +
'</pre>';
comparison.innerHTML = diff;
};
prepareSourceBuffer = function (combined, outputType, callback) {
var
buffer,
codecs,
codecsArray,
resetTransmuxer = $('#reset-tranmsuxer').checked;
if (typeof combined === 'function') {
callback = combined;
combined = true;
}
// Our work here is done if the sourcebuffer has already been created
if (!resetTransmuxer && window.vjsBuffer) {
return callback();
}
video = document.createElement('video');
video.controls = true;
mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
window.vjsVideo = video;
window.vjsMediaSource = mediaSource;
$('#video-place').innerHTML = '';
$('#video-place').appendChild(video);
mediaSource.addEventListener('error', logevent);
mediaSource.addEventListener('opened', logevent);
mediaSource.addEventListener('closed', logevent);
mediaSource.addEventListener('sourceended', logevent);
codecs = $('#codecs');
codecsArray = codecs.value.split(',');
mediaSource.addEventListener('sourceopen', function () {
mediaSource.duration = 0;
if (combined) {
buffer = mediaSource.addSourceBuffer('video/mp4;codecs="' + codecs.value + '"');
} else if (outputType === 'video') {
buffer = mediaSource.addSourceBuffer('video/mp4;codecs="' + codecsArray[0] + '"');
} else if (outputType === 'audio') {
buffer = mediaSource.addSourceBuffer('audio/mp4;codecs="' + (codecsArray[1] || codecsArray[0]) + '"');
}
buffer.addEventListener('updatestart', logevent);
buffer.addEventListener('updateend', logevent);
buffer.addEventListener('error', logevent);
window.vjsBuffer = buffer;
video.addEventListener('error', logevent);
video.addEventListener('error', function () {
document.getElementById('video-place').classList.add('error');
});
return callback();
});
};
var fileSlices = [],
sliceLength = 40000,
sliceIndex = 0,
outputType = 'video',
remuxedSegments = [],
remuxedBytesLength = 0,
remuxedInitSegment = null,
bytes,
createInitSegment = true, i, j,
sliceDataBuffer = null,
myTimerFunction = null,
sliceCounterPackets = 0,
sliceFlushLimit = 24,
sliceMAXCounterPackets = 60;
transmuxer = new muxjs.mp4.Transmuxer({ remux: false });
function sliceFile(segment)
{
if(segment && segment != null) {
var isSlicing = true;
while(isSlicing)
{
fileSlices.push(segment.slice(sliceIndex, sliceIndex+ sliceLength));
sliceIndex = sliceIndex + sliceLength;
if(sliceLength > segment.byteLength - sliceIndex)
{
fileSlices.push(segment.slice(sliceIndex, segment.byteLength));
isSlicing = false;
}
}
}
}
transmuxer.on('data', function (event) {
if (event.type === outputType) {
remuxedSegments.push(event);
remuxedBytesLength += event.data.byteLength;
remuxedInitSegment = event.initSegment;
}
});
transmuxer.on('done', function () {
var offset = 0;
if (createInitSegment) {
bytes = new Uint8Array(remuxedInitSegment.byteLength + remuxedBytesLength)
bytes.set(remuxedInitSegment, offset);
offset += remuxedInitSegment.byteLength;
createInitSegment = false;
} else {
bytes = new Uint8Array(remuxedBytesLength);
}
for (j = 0, i = offset; j < remuxedSegments.length; j++) {
bytes.set(remuxedSegments[j].data, i);
i += remuxedSegments[j].data.byteLength;
}
muxedData = bytes;
remuxedSegments = [];
remuxedBytesLength = 0;
if ($('#original-active').checked) {
prepareSourceBuffer(false, outputType, function () {
console.log('appending...');
try {
if(!window.vjsBuffer.updating) {
window.vjsBuffer.appendBuffer(bytes);
video.play();
} else {
console.log('dropped Packets...');
}
} catch (exp)
{
console.error("Append Error: "+ exp.stack);
}
});
}
});
function concatTypedArrays(a, b) { // a, b TypedArray of same type
var c = new (a.constructor)(a.length + b.length);
c.set(a, 0);
c.set(b, a.length);
return c;
}
original.addEventListener('change', function () {
var reader = new FileReader();
// do nothing if no file was chosen
if (!this.files[0]) {
return;
}
reader.addEventListener('loadend', function () {
var segment = new Uint8Array(reader.result),
combined = document.querySelector('#combined-output').checked,
outputType = document.querySelector('input[name="output"]:checked').value,
resetTransmuxer = $('#reset-tranmsuxer').checked,
remuxedSegments = [],
remuxedInitSegment = null,
remuxedBytesLength = 0,
createInitSegment = false,
bytes,
i, j;
sliceFile(segment);
myTimerFunction = setInterval( addSlicesToQueue, 100);
function addSlicesToQueue() {
if (fileSlices.length > 0) {
if (sliceDataBuffer == null) {
sliceDataBuffer = new Uint8Array(0);
}
sliceDataBuffer = concatTypedArrays(sliceDataBuffer, fileSlices.shift());
if (sliceCounterPackets > sliceFlushLimit) {
transmuxer.push(sliceDataBuffer);
transmuxer.flush();
sliceDataBuffer = null;
sliceCounterPackets = 0;
if (sliceFlushLimit < sliceMAXCounterPackets) {
sliceFlushLimit++;
}
} else {
sliceCounterPackets++;
}
} else {
clearInterval(myTimerFunction);
}
}
});
muxedName = this.files[0].name.replace('.ts', '.f4m');
reader.readAsArrayBuffer(this.files[0]);
}, false);
working.addEventListener('change', function () {
var reader = new FileReader();
reader.addEventListener('loadend', function () {
var bytes = new Uint8Array(reader.result);
if ($('#working-active').checked) {
prepareSourceBuffer(function () {
window.vjsBuffer.appendBuffer(bytes);
video.play();
});
}
workingBytes = bytes;
workingParsed = muxjs.mp4.tools.inspect(bytes);
console.log('working', workingParsed);
diffParsed();
// clear old box info
workingBoxes.innerHTML = muxjs.mp4.tools.textify(workingParsed, null, ' ');
});
reader.readAsArrayBuffer(this.files[0]);
}, false);
$('#save-muxed').addEventListener('click', function () {
if (muxedData && muxedName) {
return saveTA(muxedData, muxedName);
}
});
$('#combined-output').addEventListener('change', function () {
Array.prototype.slice.call(document.querySelectorAll('[name="output"]'))
.forEach(function (el) {
el.disabled = this.checked;
}, this);
});
[].slice.call(document.querySelectorAll('input')).forEach(function (el) {
el.addEventListener('change', saveConfig);
});
</script>
</body>
</html>
Here is the error log I get from chrome://media-internals when I run that page.
Timestamp | Property | Value |
---|---|---|
00:00:00 00 | pipeline_state | kCreated |
00:00:00 00 | origin_url | http://127.0.0.1:8080/ |
00:00:00 00 | url | blob:http://127.0.0.1:8080/9f477dc7-3a9e-4a62-8712-921cf2a59d61 |
00:00:00 00 | pipeline_state | kStarting |
00:00:00 05 | found_video_stream | true |
00:00:00 05 | video_codec_name | h264 |
00:00:00 06 | debug | Video rendering in low delay mode. |
00:00:00 06 | video_dds | false |
00:00:00 06 | video_decoder | FFmpegVideoDecoder |
00:00:00 06 | pipeline_state | kPlaying |
00:00:00 09 | video_buffering_state | BUFFERING_HAVE_ENOUGH |
00:00:00 48 | height | 480 |
00:00:00 48 | width | 640 |
00:00:00 48 | pipeline_buffering_state | BUFFERING_HAVE_ENOUGH |
00:00:00 48 | event | PLAY |
00:00:00 08 | duration | 14.124988 |
00:00:02 698 | duration | 15.958321 |
00:00:02 700 | duration | 29.624977 |
00:00:05 480 | duration | 30.958321 |
00:00:05 483 | duration | 45.666654 |
00:00:08 375 | duration | 46.916665 |
00:00:08 378 | duration | 62.166654 |
00:00:11 371 | duration | 63.916665 |
00:00:11 374 | duration | 79.374977 |
00:00:14 483 | duration | 80.916665 |
00:00:14 489 | duration | 97.083343 |
00:00:17 676 | duration | 98.958355 |
00:00:17 680 | duration | 115.416688 |
00:00:20 981 | duration | 116.916665 |
00:00:20 982 | duration | 134.333343 |
00:00:24 374 | duration | 135.916699 |
00:00:24 376 | duration | 153.916688 |
00:00:27 875 | duration | 154.916665 |
00:00:27 878 | duration | 173.958343 |
00:00:31 476 | duration | 174.916699 |
00:00:31 479 | duration | 194.458377 |
00:00:35 456 | duration | 195.916699 |
00:00:35 459 | duration | 215.708377 |
00:00:39 187 | duration | 216.916699 |
00:00:39 189 | duration | 237.333377 |
00:00:43 88 | duration | 238.916699 |
00:00:43 90 | duration | 259.708377 |
00:00:47 86 | duration | 260.875021 |
00:00:47 88 | duration | 282.541688 |
00:00:51 180 | duration | 283.874988 |
00:00:51 184 | duration | 306.041654 |
00:00:55 382 | duration | 307.874999 |
00:00:55 390 | duration | 330.041666 |
00:00:59 685 | duration | 331.916677 |
00:00:59 687 | duration | 354.624988 |
00:01:04 81 | duration | 355.874988 |
00:01:04 84 | duration | 379.833343 |
00:01:08 589 | duration | 380.875021 |
00:01:08 597 | duration | 405.541688 |
00:01:13 188 | duration | 406.916665 |
00:01:13 191 | duration | 431.916654 |
00:01:17 884 | duration | 432.874988 |
00:01:17 886 | duration | 458.791654 |
00:01:22 687 | duration | 459.874988 |
00:01:22 695 | duration | 486.083343 |
00:01:27 590 | duration | 487.875021 |
00:01:27 592 | duration | 514.12501 |
00:01:32 597 | duration | 515.874988 |
00:01:32 606 | duration | 542.874977 |
00:01:37 692 | duration | 543.833321 |
00:01:37 695 | duration | 572.041654 |
00:01:42 888 | duration | 573.874999 |
00:01:42 891 | duration | 601.833354 |
00:01:48 204 | duration | 602.916699 |
00:01:48 207 | duration | 632.083377 |
00:01:53 618 | duration | 634.000021 |
00:01:53 622 | duration | 663.041688 |
00:01:59 109 | duration | 664.833332 |
00:01:59 113 | duration | 694.416666 |
00:02:04 701 | duration | 695.874988 |
00:02:04 706 | duration | 726.499977 |
00:02:10 399 | duration | 727.833321 |
00:02:10 404 | duration | 759.083343 |
00:02:16 213 | duration | 760.958355 |
00:02:16 216 | duration | 792.208377 |
00:02:22 111 | duration | 793.833355 |
00:02:22 114 | duration | 826.083377 |
00:02:28 101 | duration | 827.958355 |
00:02:28 104 | duration | 860.333377 |
00:02:34 212 | duration | 861.833355 |
00:02:34 214 | duration | 895.166688 |
00:02:40 399 | duration | 896.833321 |
00:02:40 402 | duration | 930.708343 |
00:02:46 602 | duration | 931.833355 |
00:02:46 604 | duration | 966.083377 |
00:02:52 812 | duration | 967.875021 |
00:02:52 825 | duration | 1001.583377 |
00:02:59 08 | duration | 1002.833355 |
00:02:59 12 | duration | 1037.083377 |
00:03:05 197 | duration | 1038.916699 |
00:03:05 201 | duration | 1072.583377 |
00:03:11 402 | duration | 1073.833355 |
00:03:11 405 | duration | 1108.083377 |
00:03:17 626 | duration | 1109.958355 |
00:03:17 631 | duration | 1143.50001 |
00:03:23 808 | duration | 1144.83331 |
00:03:23 812 | duration | 1178.999977 |
00:03:30 15 | duration | 1180.833332 |
00:03:30 20 | duration | 1214.499988 |
00:03:36 217 | duration | 1215.833321 |
00:03:36 233 | duration | 1249.999977 |
00:03:42 401 | duration | 1250.874988 |
00:03:42 404 | duration | 1285.333343 |
00:03:48 607 | duration | 1286.833355 |
00:03:48 610 | duration | 1320.916688 |
00:03:54 801 | duration | 1321.874988 |
00:03:54 805 | duration | 1356.208343 |
00:04:01 00 | duration | 1357.833355 |
00:04:01 14 | duration | 1391.791688 |
00:04:07 195 | duration | 1392.833321 |
00:04:07 200 | duration | 1427.083343 |
00:06:19 701 | debug | FFmpegVideoDecoder: avcodec_decode_video2(): Invalid data found when processing input, at timestamp: 379833355 duration: 0 size: 1726 side_data_size: 0 is_key_frame: 0 encrypted: 0 discard_padding (ms): (0, 0) |
00:06:19 701 | error | video decode error |
00:06:19 735 | pipeline_error | PIPELINE_ERROR_DECODE |
00:06:19 735 | pipeline_state | kStopping |
00:06:19 735 | pipeline_state | kStopped |
00:06:19 735 | event | PAUSE |
Thank you very much in advance for your help.
Hello,
is it possible to use the Transmuxer object in a Node.js app?
Without a user-provided codec parameter we can't know whether an AAC audio track is AAC-LC or HE-AAC without parsing the actual ADTS frames to check for a fill_element specifying an extension payload that contains SBR data (so-called "implicit signaling"). We need to know the codec in order to set up an audio sourceBuffer that will correctly decode the audio track.
Two options:
- Allow users to specify a codec= parameter after the video/mp2t string passed to the addSourceBuffer function in videojs-contrib-media-sources. The downside is that every consumer of videojs-contrib-media-sources (e.g. videojs-contrib-hls) must know about the codec used by the TS segments it is about to pass in. That is simple enough with later versions of HLS, which mandate a CODECS attribute in the manifest.
- Parse the AAC raw_data_block to try to determine whether we are getting HE-AAC or not, and signal that to the consumer of the transmuxer so it can set up the MSE sourceBuffer correctly. The downside is that this moves codec detection out of contrib-media-sources, which feels wrong given that the codecs= parameter is specified at sourceBuffer creation.
If you seek before buffering any content, how is BaseMediaDecodeTime calculated?
The Chrome bug was fixed back in February and shipped in release version 50.
https://bugs.chromium.org/p/chromium/issues/detail?id=229412#c77
This code can now be removed (as well as code for other containers)
https://github.com/videojs/mux.js/blob/master/lib/mp4/transmuxer.js#L319
cc: @wolenetz
videojs/videojs-contrib-hls#1279
If the PAT and PMT packets are not at the start of the segment, PES packets are cached to be processed once the PMT is found. Currently these packets are still lost, because they get processed before the transmuxer has had a chance to set up the rest of the pipeline.
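The caching behavior this describes could look roughly like the following. This is only a sketch with an invented `PesBuffer` name and a simplified packet shape, not the actual pipeline code:

```javascript
// Sketch: hold early PES packets and replay them only after the PMT
// has arrived and configured the rest of the pipeline.
function PesBuffer() {
  this.pmtSeen = false;
  this.pending = [];
}
PesBuffer.prototype.push = function (packet, deliver) {
  if (packet.type === 'pmt') {
    this.pmtSeen = true;
    var queued = this.pending;
    this.pending = [];
    queued.forEach(deliver); // replay packets that arrived before the PMT
    return;
  }
  if (!this.pmtSeen) {
    this.pending.push(packet); // cache until the pipeline is ready
  } else {
    deliver(packet);
  }
};
```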
add an npm script to build on prepublish
mux.js/lib/tools/mp4-inspector.js
Line 666 in 6d7173e
In the above, if the box is using version 1, then the compositionTimeOffset should be read as a signed integer, since it can be negative. However, mp4-generator is forcing the version to be 0:
mux.js/lib/mp4/mp4-generator.js
Line 671 in 7ea589b
We should probably change mp4-generator to write the correct version number when signed composition time offsets are encountered.
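The version-dependent read would be along these lines (a sketch; the field layout follows the ISO BMFF trun definition, and the function name is mine):

```javascript
// trun version 0 stores sample_composition_time_offset as an unsigned
// 32-bit integer; version 1 stores it as a signed 32-bit integer, so
// negative offsets (B-frames presented before decode order) survive.
function readCompositionTimeOffset(view, offset, version) {
  return version === 1 ? view.getInt32(offset) : view.getUint32(offset);
}
```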
Mux.js converts the following segments (the data event is triggered), but once the segments are used in the SourceBuffer, the SourceBuffer breaks ("SourceBuffer was removed" exception).
Does anyone know why? Is it because the TS packets contain a black screen which can't be converted correctly? Or do the codecs suddenly switch (because of the converter)? They are all from the same livestream and work fine in Safari.
Here are the TS packets:
https://www.dropbox.com/s/c6or56ssal5oqxn/TS%20Segments.zip?dl=0
Hello!
If you use an event from mux.js to (say) remove yourself from further events because you had a one-time job, the code currently in place will misbehave in the for-loop going through all the callbacks.
I found a patch for this in our project: hola@f02fea0 @bahaa-aidi
I'd like to get it off our hands and into upstream (you). An alternative fix, and I can do this if you'd accept it, is to never mutate the current array in on/off, but always hand trigger a copy of the listener array. That way using it is safe, and there's no overhead in trigger, only in on and off, which are hopefully called less often.
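The copy-on-trigger alternative is small. A sketch (an `Emitter` stand-in, not mux.js's actual stream code):

```javascript
// Emitter that snapshots the listener list before dispatching, so an
// off() call made inside a handler cannot shift the array under the
// loop and skip the next listener.
function Emitter() { this.listeners = {}; }
Emitter.prototype.on = function (type, fn) {
  (this.listeners[type] = this.listeners[type] || []).push(fn);
};
Emitter.prototype.off = function (type, fn) {
  var list = this.listeners[type] || [];
  var i = list.indexOf(fn);
  if (i > -1) list.splice(i, 1);
  return i > -1;
};
Emitter.prototype.trigger = function (type, arg) {
  var list = (this.listeners[type] || []).slice(); // copy before iterating
  list.forEach(function (fn) { fn(arg); });
};
```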
I'm using version 4.1.3 of mux.js. I ran into a problem today where the height and width in the tkhd boxes were being calculated incorrectly by the MP4 inspector. It turns out the fractional portion of the tkhd box's width and height was being divided by 16 instead of 2^16 (65536); perhaps it was intended to be a shift right? Anyhow, I have a PR with a fix that I'll submit in a second.
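For reference, tkhd width and height are 16.16 fixed-point values, so the conversion can divide the whole 32-bit value by 65536 in one step. A sketch of the idea, not the PR itself:

```javascript
// Read a 16.16 fixed-point value: the integer part sits in the high
// 16 bits and the fraction in the low 16, so dividing the full 32-bit
// unsigned value by 65536 recovers both at once.
function readFixed1616(view, offset) {
  return view.getUint32(offset) / 65536;
}
```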
Hi,
var fs = require('fs');
var mp2t = require('./lib/m2ts'),
codecs = require('./lib/codecs'),
aac = require('./lib/aac'),
flv = require('./lib/flv'),
id3Generator = require('./test/utils/id3-generator'),
mp4 = require('./lib/mp4'),
mp4AudioProperties = require('./lib/mp4/transmuxer').AUDIO_PROPERTIES,
mp4VideoProperties = require('./lib/mp4/transmuxer').VIDEO_PROPERTIES,
Transmuxer = mp4.Transmuxer,
FlvTransmuxer = flv.Transmuxer;
var segments = [];
var transmuxer = new Transmuxer();
transmuxer.on('data', function(segment) {
console.log("Data event");
segments.push(segment);
});
transmuxer.on('done', function(segment) {
console.log("Done event");
segments.push(segment);
});
fs.readFile('aac.bin', function(err, contents) {
console.log("Source file data");
console.log("Buffer size " + contents.byteLength);
let uarray = new Uint8Array(contents);
console.log(typeof uarray);
transmuxer.push(uarray);
});
I've tested with two MPEG-TS files (video and audio pids):
aac.bin - only AAC audio
aac_mp2.bin - AAC and MP2 audio
But no events are raised after the .push call.
That's the reason I've asked in #116
What can be the reason?
All generated MP4s are being created with a starting time of zero, so they overlap one another when fed to a SourceBuffer without manually updating the timestamp offset. I believe we should be setting the base media decode time or inserting an edit list to adjust the start time of media segments generated after the first.
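Until something like that lands, a consumer can work around the overlap by advancing the SourceBuffer's timestampOffset before each append. A sketch; the helper name is mine:

```javascript
// Pick the timestampOffset for the next append: the end of what is
// already buffered, so zero-based segments line up back to back.
function nextTimestampOffset(buffered) {
  return buffered.length ? buffered.end(buffered.length - 1) : 0;
}
// Against a real SourceBuffer this would be used roughly like:
// sourceBuffer.timestampOffset = nextTimestampOffset(sourceBuffer.buffered);
// sourceBuffer.appendBuffer(nextSegmentBytes);
```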
Hi all,
The following URL returns only one segment with initData; data remains an empty array.
The file is downloaded as part of a DASH video at yahoo.co.jp.
Hi,
this is a great library, but currently it cannot be used for plain MP4 -> MSE fragmentation. (Am I wrong?)
This is sad, since everything needed is already in it, but the interface does not allow or support it.
Currently there is only mp4box.js, which can do this, but it cannot remux A/V tracks into a single MP4, so you have to use multiple SourceBuffers. Bad for A/V sync reasons.
I'm not very experienced in this regard, so contributing is not so easy for me.
But I think the tasks are not very complicated.
So my dream interface would be:
let fragmented = fragment(mp4Buffer);
Using events is fine, too. My MP4 files are small (~5 seconds usually; P2P live video streaming).
I'm willing to contribute if you help me get started.
This functionality is very important!
Thanks in advance.
Kind regards,
Christopher
Our streams have audio PES packets split between segments: one starts at the end of one .ts file and continues into the next. Calling flush on audio in ElementaryStream (https://github.com/ntadej/mux.js/blob/master/lib/m2ts/m2ts.js#L434) causes the player to stall for a brief moment (a throbber is shown). Commenting this line out fixes all playback issues with our streams that I reported with the new videojs-contrib-hls (it breaks seeking, though).
I'm not familiar enough with mux.js to fix this on my own, but I'm happy to provide any assistance needed to do so. You should already have access to our streams.
Will this facilitate muxing of Vorbis and VP9 files via a browser?
Hi I'm having trouble playing a ts stream (streamed via UDP -> websocket) in Chrome.
Due to the custom setup, the html5 tech is extended to send network packets to the sourceBuffer.
The same video stream plays perfectly in videojs4, but in videojs5 it does not.
[videojs4 setup sends video to the Flash tech; same setup in videojs 5 results in a stuttering video]
The video starts to play but a few seconds into it, it fails with errors.
I've noticed several things going badly in mux.js/mp4/transmuxer.js and currently think the keyframe-pulling path is where it's failing, at:
gops[0][0].dts = currentGop.dts;
This causes an error since no frames were pushed and the sub-array is undefined.
I looked at
videojs/videojs-contrib-hls#477
#45
and
#57
In my case keyframe-pulling path seems to be where the transmuxer is going and when it does not find any keyframes bad things start to happen.
This brings me to my next question...
Since this is a video stream, all I do is append to the VirtualSourceBuffer. Each append causes a flush. Issue #45
talks about segmenting GOPs around each flush call; on the surface that seems to be exactly what is causing at least some of the issues.
Is there any way, outside of the VirtualSourceBuffer, to make it flush on keyframe boundaries?
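One way to approach this, independent of VirtualSourceBuffer, is to buffer incoming frames yourself and only hand a complete GOP to the transmuxer when the next keyframe arrives. A minimal sketch; the `transmuxer` here is any object with `push`/`flush` methods (mirroring the mux.js shape), and the `keyFrame`/`data` fields on frames are assumptions about your own frame representation:

```javascript
// Sketch: delay flushes until a keyframe boundary so that a GOP is
// never split across two fragments.
function KeyframeFlusher(transmuxer) {
  this.transmuxer = transmuxer;
  this.pending = [];
}

KeyframeFlusher.prototype.push = function(frame) {
  // A new keyframe means everything buffered so far is a complete
  // GOP: push it through and flush before starting the next one.
  if (frame.keyFrame && this.pending.length) {
    this.pending.forEach(f => this.transmuxer.push(f.data));
    this.transmuxer.flush();
    this.pending = [];
  }
  this.pending.push(frame);
};
```

With this wrapper, each `flush()` sees only whole GOPs, so no reference frames are dropped at fragment boundaries.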
I've noticed that in the flush() calls it does throw away some data until it finds a NAL w/ type 'access_unit_delimiter_rbsp'.
In fact, occasionally I see that it's throwing away 'slice_layer_without_partitioning_rbsp_idr'. Sometimes the NAL type is undefined - is that a problem?
Here are some types that are thrown out on the flush call on entry:
slice_layer_without_partitioning_rbsp_idr
pic_parameter_set_rbsp
undefined
There are also other problems. Some of the errors could be related.
chrome://media-internals/
duration 47721.858833
error media::MediaSourceState::Append: stream parsing failed. Data size=45224 append_window_start=0 append_window_end=inf
event PAUSE
found_video_stream true
info Video codec: avc1.E0EDBE
pipeline_error chunk demuxer: append failed
pipeline_state kStopped
player_id 14
render_id 7
url blob:http://127.0.0.1:8080/5b4f253a-19c6-4137-8097-93af54d327c3
video_codec_name h264
video_dds false
video_decoder GpuVideoDecoder
OR
render_id: 7
player_id: 2
pipeline_state: kStopped
event: PAUSE
url: blob:http://127.0.0.1:8080/d74af027-d491-4cc2-893c-2b3343a82da5
info: Video codec: avc1.EFE6CE
found_video_stream: true
video_codec_name: h264
duration: 47721.858833
video_dds: false
video_decoder: GpuVideoDecoder
debug: Media append that overlapped current playback position caused time gap in playing VIDEO stream because the next keyframe is 760ms beyond last overlapped frame. Media may appear temporarily frozen.
error: media::MediaSourceState::Append: stream parsing failed. Data size=68441 append_window_start=0 append_window_end=inf
pipeline_error: chunk demuxer: append failed
Thanks.
Hello,
I have tried the attached ADTS-encoded file captured from Wowza, but with no success. Could you check what's wrong?
Thanks,
Emin
media-uaj1ie2au_5120.aac.zip
I'm trying to get a ts file to play in the browser. So to learn how, I'm looking at the sample file that comes with mux.js. I load the file, and the video box is just black with a spinning circle. Time is shown as 0:00.
Adding some debug logging shows the codec is avc1.64001f, which should be fine.
If I save the transmuxer output, I get an f4m file that doesn't play. Renaming it to mp4 makes it play in vlc and when I drop it into Firefox. The important thing is that it didn't play from the start though.
I'm using the latest version of Firefox and an older one for testing. Both have the same problem.
Let me know if there's any extra information I can get that would help.
We use mux.js, if present, to convert TS to MP4 in Shaka Player on platforms that don't have native TS support. However, when we do that, we've already parsed the timestamps from TS segments to build an accurate segment index. What we find, though, is that the timestamps in the output are not the same as the input timestamps. See shaka-project/shaka-player#1102 for more.
To support both native TS implementations and mux.js-based transmuxing to MP4, we need everything to be consistent. For our architecture, the timestamps in the MP4 output should be the same as the TS input segments.
Can this be done today with mux.js? Or would this constitute an enhancement?
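The difference comes down to whether the transmuxer shifts timestamps so output starts at zero, or preserves the input clock (the `keepOriginalTimestamps` option seen elsewhere in these reports). A sketch of the underlying arithmetic, under the assumption that input timestamps are on the 90 kHz TS clock and output uses the MP4 track timescale:

```javascript
// Rescale a 90kHz MPEG-TS timestamp into an MP4 track timescale.
// Without something like keepOriginalTimestamps, a transmuxer may
// also subtract the stream's first timestamp, which is why output
// times can differ from the TS input.
function rescaleTimestamp(pts, firstPts, trackTimescale, keepOriginal) {
  var base = keepOriginal ? pts : pts - firstPts;
  return Math.round(base * trackTimescale / 90000);
}
```

For Shaka's use case, the `keepOriginal` branch keeps MP4 output consistent with the timestamps parsed from the TS segment index.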
I have two binary streams, an H.264 and an AAC file, that were extracted from an MPEG-TS segment.
Is it possible to use mux.js to create an MP4 from these, or even directly from the MPEG-TS?
My plan is to read the MPEG-TS as a byte array, use a TS demuxer to extract the AAC and H.264 streams, and finally use mux.js to box them together into a large MP4.
But is it possible? :)
I was having this issue, videojs/videojs-contrib-hls#761.
But when I remove the following code from line 190 of caption_stream.js:
this.captionPackets_.sort(function(a, b) {
return a.pts - b.pts;
});
The problem resolves itself. Tested in Chrome, Firefox, and Safari.
Can someone explain why this code is necessary? Is this code OK to take out without breaking something else?
If not, are there any workarounds I can implement?
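One plausible explanation: `Array.prototype.sort` was not guaranteed stable in all engines at the time, so sorting caption packets by PTS could reorder packets that share the same PTS (such as paired CEA-608 bytes), corrupting captions. Rather than removing the sort, a stable sort with an insertion-order tiebreaker keeps both the ordering and equal-PTS packets intact. A sketch, not the library's actual fix:

```javascript
// Sort caption packets by PTS while preserving the arrival order of
// packets with equal PTS values, using the original index as a
// tiebreaker to make the sort stable on any engine.
function stableSortByPts(packets) {
  return packets
    .map(function(p, i) { return { p: p, i: i }; })
    .sort(function(a, b) {
      return (a.p.pts - b.p.pts) || (a.i - b.i);
    })
    .map(function(d) { return d.p; });
}
```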
Aspect ratio is wrong on streams that don't have square pixels, but only on Chrome (tested on Win10 and OS X). It works as expected with Firefox and IE11 (on Win10).
I didn't want to duplicate bugs, but this is probably more related to mux.js than videojs-contrib-hls (opened as videojs/videojs-contrib-hls#485).
I am trying to mux an H.264 NAL stream to ISO BMFF and need to access CoalesceStream.
I'm using video-contrib-hls in a project that uses browserify to bundle via gulp. I'm getting a compile error:
Error: Cannot find module 'browserify-shim' from '/node_modules/mux.js'
Any idea why this module wouldn't be installed as a dependency when using video-contrib-hls?
I am trying to call the caption stream API from mux.js to parse closed captions for DASH, but I can only get one text stream as output. I tested closed caption parsing with mux.js's debug page, and the result was the same. However, when I tested with the DASH-IF player it showed two text streams with two languages, eng and swe.
Is there anything special I need to do to get multiple streams?
Test file:
https://vm2.dashif.org/dash/vod/testpic_2s/cea608.mpd
DASH-IF player: http://reference.dashif.org/dash.js/v2.6.7/samples/dash-if-reference-player/index.html
Thank you!
Hi all,
If I have H.264 NALs streaming over a WebSocket or a WebRTC data channel, could I use mux.js to mux them into fragmented MP4 and have the browser play them via MSE?
I think the muxing part should be ready and I only need to feed those NALs into the muxer.
Is it doable?
Thanks
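One caveat worth noting: as far as I can tell, mux.js's `mp4.Transmuxer` consumes MPEG-TS bytes rather than bare NAL units, so raw H.264 from a WebSocket needs to be packaged or fed to a lower-level pipeline first. Whichever route you take, the first step is usually splitting the Annex-B byte stream on start codes. A minimal sketch (it trims one trailing zero for 4-byte start codes and does not handle every padding pattern):

```javascript
// Split an H.264 Annex-B byte stream (Uint8Array) into individual
// NAL units by scanning for 0x000001 / 0x00000001 start codes.
function splitNalUnits(bytes) {
  var units = [];
  var start = -1;
  for (var i = 0; i + 2 < bytes.length; i++) {
    if (bytes[i] === 0 && bytes[i + 1] === 0 && bytes[i + 2] === 1) {
      if (start >= 0) {
        // Drop the extra zero that belongs to a 4-byte start code.
        var end = i > 0 && bytes[i - 1] === 0 ? i - 1 : i;
        units.push(bytes.subarray(start, end));
      }
      start = i + 3;
      i += 2;
    }
  }
  if (start >= 0) {
    units.push(bytes.subarray(start));
  }
  return units;
}
```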
HLS on Firefox/Win7+8 stutters significantly during video playback. The problem doesn't occur on any other browser or any other OS.
The same stream works on hls.js so I wonder if maybe that specific Firefox build is very strict or specific in terms of its interpretation of MP4 boxes.
After I remux TS to MP4 on Edge or IE 11 (Windows 10) and try to play the remuxed video, it plays well but shows an afterimage when seeking backward or forward.
A sample URL is 'https://video-dev.github.io/streams/x36xhzz/url_6/url_846/193039199_mp4_h264_aac_hq_7.ts'
Please check it.
Thanks
With 'mux.js' everything is OK. It looks like the minified version doesn't expose the 'window.muxjs' object.
We've diagnosed a timing problem when there are missing frames with a related library (https://github.com/dailymotion/hls.js/issues/409) and it seems that you would be vulnerable to the same problem.
Do you plan to support stts
and other MP4 atoms that are currently empty?
Thanks!
// chrome
Uncaught TypeError: Cannot read property '0' of undefined
// firefox
TypeError: gops[0] is undefined
gops[0][0].dts = currentGop.dts;
I got this while testing my video/HLS player built on video.js and videojs-contrib-hls. Downloading of *.ts files stops after this error; the m3u8 continues to be downloaded.
Tested with Twitch HLS sources. For some streams (even with quality change available) this does not happen, but for some it happens for the max-quality source and not for lower ones.
This is the playback start code; a setTimeout of 5s before play() does not change anything here:
video.src({
src: this.currentSource.hls,
type: 'application/x-mpegURL'
});
video.play();
chromium, Ubuntu 14.04 x64, same in firefox
same with "videojs-contrib-hls": "^2.1.0-0"
We currently squish together tracks that don't quite match. This shortens the duration of the video which isn't cool. We'll have to figure out another solution.
Repro:
Navigate to https://v2-4-0-dot-shaka-player-demo.appspot.com/demo/#asset=https://s3-ap-southeast-1.amazonaws.com/learnyst/testfolder/sample/expresshls/hls.m3u8;play and start playback.
For the first 2 segments (https://s3-ap-southeast-1.amazonaws.com/learnyst/testfolder/sample/expresshls/zh3il5009f-yxhbaADHqLeCAo3k.ts and https://s3-ap-southeast-1.amazonaws.com/learnyst/testfolder/sample/expresshls/ZtV37gCCgR5cnwUrQliZEPG9zIU.ts) the callbacks get invoked, but not for the third one (https://s3-ap-southeast-1.amazonaws.com/learnyst/testfolder/sample/expresshls/X3cf3WEJqIG6EfX0Yjq8dvGthAc.ts) although they are present in the transmuxer's internal listeners collection.
The code that registers the callbacks:
this.muxTransmuxer_ = new muxjs.mp4.Transmuxer({
'keepOriginalTimestamps': true
});
this.muxTransmuxer_.on('data', this.onTransmuxed_.bind(this));
this.muxTransmuxer_.on('done', this.onTransmuxDone_.bind(this));
Sometimes audio or video packets can be padded with filler bytes to maintain a specific bitrate. When the TS probe is inspecting packets to determine the audio/video end time of a segment, it starts with the last packet and grabs the timestamp of the first audio/video packet it sees. If the segment has been padded with filler bytes, that packet may not contain a timestamp, causing NaN
issues. The probe should continue searching backwards for the first packet it can correctly parse a timestamp from.
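The backwards search described above can be sketched in a few lines; the `pts` field on each packet object is an assumption about the probe's internal representation:

```javascript
// Walk backwards from the end of a segment and return the PTS of
// the last packet that actually carries one, skipping filler or
// padding packets whose timestamp would otherwise parse as NaN.
function lastValidPts(packets) {
  for (var i = packets.length - 1; i >= 0; i--) {
    var pts = packets[i].pts;
    if (typeof pts === 'number' && !isNaN(pts)) {
      return pts;
    }
  }
  return null; // no packet in the segment carried a timestamp
}
```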
Hi,
I am using mux.js as a part of video-contrib-hls.js. I was facing an issue with the following stream:
http://demo.unified-streaming.com/video/tears-of-steel/tears-of-steel-sample-aes.ism/.m3u8
I get the exception "Cannot read property segments of undefined". This happens in the function "processPendingSegments_()" in video-contrib-media-sources.
It seems to be due to mux.js not returning proper data to it; the cause of the specific exception is that the segment.type is returned as combined, whereas it expects audio or video.
I have observed this issue with one more stream, which is also non-multiplexed. I couldn't find more non-multiplexed HLS streams apart from these two to confirm it further. I also haven't observed this with any of the multiplexed streams.
Is this a known limitation or expected behavior of the module? Has anyone else observed this issue, or am I missing something here?
Versions used:
mux.js - 3.0.4
video-contrib-media-sources - 4.1.2
video-contrib-hls - 4.0.2
video.js - 5.11.9
Browser used: Google Chrome version 55.0.2883.87 m
Regards,
Archit
This will ensure that B-frames do not interfere with caption timing.
Hi, first of all, good job with mux.js. Second: I have a 10-second .ts chunk with 1 video track and 3 audio tracks, and I am using your debug site. I get one error: mux.js line 4090 Uncaught TypeError: Cannot read property '0' of undefined. The same kind of video with 1 video and 1 audio track works just fine. Can you help me, please?
When I try to generate the MP4 file using mux.js, I get the wrong duration in the generated file; the result is always 0xffffffff
=> 13:15:24
It's a default value:
mux.js/lib/mp4/mp4-generator.js
Line 613 in 55c1d56
How do I calculate the correct duration from the MPEG2-TS packets using mux.js? Or how do I set the correct duration before calling mp4.initSegment(this.pendingTracks)
in lib/mp4/transmuxer.js#L835-L842, or earlier?
I added an event listener with some code like this:
transmuxer.on('data', (event) => {
if (!remuxedInitSegment) {
remuxedInitSegment = event.initSegment;
// I also want to change the duration using remuxedInitSegment.set([xxxx], index)
// but I don't know how to find the right index... :'(
}
})
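Rather than guessing a byte index, the duration field can be found by walking the ISO BMFF box structure of the init segment: `moov` at the top level, `mvhd` inside it, and the 32-bit duration at a fixed offset for a version-0 `mvhd` (8-byte box header, then version/flags, creation_time, modification_time, and timescale, each 4 bytes). A sketch, assuming a version-0 `mvhd`:

```javascript
// Find moov > mvhd in an init segment (Uint8Array) and overwrite the
// 32-bit duration of a version-0 mvhd. Returns false if the boxes
// are missing or the mvhd is not version 0.
function setMvhdDuration(initSegment, duration) {
  var view = new DataView(
    initSegment.buffer, initSegment.byteOffset, initSegment.byteLength);
  function findBox(start, end, type) {
    var i = start;
    while (i + 8 <= end) {
      var size = view.getUint32(i);
      if (size < 8) { return -1; } // malformed; avoid looping forever
      var name = String.fromCharCode(
        initSegment[i + 4], initSegment[i + 5],
        initSegment[i + 6], initSegment[i + 7]);
      if (name === type) { return i; }
      i += size;
    }
    return -1;
  }
  var moov = findBox(0, initSegment.byteLength, 'moov');
  if (moov < 0) { return false; }
  var mvhd = findBox(moov + 8, moov + view.getUint32(moov), 'mvhd');
  if (mvhd < 0 || initSegment[mvhd + 8] !== 0) { return false; }
  // header(8) + version/flags(4) + creation(4) + modification(4) + timescale(4)
  view.setUint32(mvhd + 24, duration);
  return true;
}
```

The duration value is expressed in units of the `mvhd` timescale (the 4 bytes immediately before it), so compute it as seconds multiplied by that timescale.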
I'm including this via https://github.com/videojs/videojs-contrib-hls and I'm finding that dist/test on this module alone is adding ~58k lines to our repo! Is there anything we can do about this?
It would be really cool to have the ability to specify on the debug page if a segment you are loading in is also a discontinuity to observe the output of the muxer after setting baseMediaDecodeTime