
samirkumardas / jmuxer


jMuxer - a simple JavaScript MP4 muxer that works in both browser and Node environments.

License: Other

HTML 5.48% JavaScript 94.52%
aac aac-player cctv-player chunk drone h264 h264-live-player h264-player hls html5-mp4-player javascript-mp4-muxer javascript-mp4-player jmuxer media-source-extension mp4 mp4-muxer mse raw-h264-player rtsp video-streaming


jmuxer's People

Contributors

bp2008, caa07007, dependabot[bot], etenoch, fralonra, jasinyip, marcoxbresciani, samirkumardas, spencerflem


jmuxer's Issues

Calling jmuxer.destroy after providing no frames can cause "DEMUXER_ERROR_COULD_NOT_OPEN"

I've been using jmuxer to great effect in https://github.com/bp2008/ui3 but kept experiencing this intermittent error in Chrome. Once in a while when transitioning from one video stream to another, the HTML5 video element would throw a MediaError (via the video element's "error" event). The MediaError was always code 4 (MEDIA_ERR_SRC_NOT_SUPPORTED) with message "DEMUXER_ERROR_COULD_NOT_OPEN".

Today I tracked down the cause. It happens when I destroy the jmuxer instance before feeding the first frame to it. It is a little tricky to reproduce, because for the error to occur, a little time must pass between creating the jmuxer instance and destroying it. Possibly the Media Source needs to be ready first (it loads asynchronously). I noticed if I comment out this.mediaSource.endOfStream(); in jmuxer's destroy method, the MediaError won't be thrown.

I'm not sure what the ideal solution is for this. I am working around the problem by not destroying jmuxer instances which haven't been fed any frames yet.
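That workaround can be wrapped up so callers don't have to track it themselves. A sketch (createGuardedMuxer is a hypothetical wrapper, not part of jmuxer's API) that skips destroy() on an instance that was never fed:

```javascript
// createGuardedMuxer is an illustrative helper, not jmuxer API.
function createGuardedMuxer(muxer) {
  let fedFrames = 0;
  return {
    feed(data) {
      fedFrames += 1;
      muxer.feed(data);
    },
    destroy() {
      if (fedFrames > 0) {
        muxer.destroy(); // safe: the MediaSource has received data
        return true;
      }
      // Never fed: skip destroy() so endOfStream() is not called on an
      // empty MediaSource, which is what appears to trigger
      // DEMUXER_ERROR_COULD_NOT_OPEN.
      return false;
    },
  };
}
```

The wrapper leaks the underlying never-fed instance rather than destroying it, which matches the reporter's own workaround.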

Media resource blob could not be decoded

Hi,

I'm trying to stream rtsp cameras through a rust + gstreamer backend, to the browser and play it using jmuxer.

I've gotten it to a point where the first couple of frames appear in the video element, but then it stops with the following error:

Media resource blob:http://localhost:3000/c75f8714-b948-4f15-a675-240fd1cfe08a could not be decoded.
Media resource blob:http://localhost:3000/c75f8714-b948-4f15-a675-240fd1cfe08a could not be decoded, error: Error Code: NS_ERROR_DOM_MEDIA_FATAL_ERR (0x806e0005)

This is with the following config in the browser, in Firefox 80.0 (Chromium does not give any error, but the video freezes as well):

var jmuxer = new JMuxer({
  node: 'camera',
  mode: 'video',
  debug: false,
  flushingTime: 2,
  clearBuffer: false,
  fps: 30,
});
var ws = new WebSocket("ws://127.0.0.1:9000");

ws.binaryType = 'arraybuffer';

ws.addEventListener("message", (msg) => {
  jmuxer.feed({
    video: new Uint8Array(msg.data),
  });
});

If I set clearBuffer to true, I also get these errors (a lot of them):

Uncaught DOMException: An attempt was made to use an object that is not, or is no longer, usable jmuxer.js:2100
    initCleanup http://localhost:3000/jmuxer.js:2100
    clearBuffer http://localhost:3000/jmuxer.js:2506
    interval http://localhost:3000/jmuxer.js:2456
    (Async: setInterval handler)
    startInterval http://localhost:3000/jmuxer.js:2452
    JMuxmer http://localhost:3000/jmuxer.js:2230

Is this something you recognize? I'm a bit worried it could be the H264 encoding itself (these are not the best cameras), but I have gotten them streaming pretty reliably using hls.js, so a browser should be able to play the stream.

If you need any more info let me know,

Thanks in advance!

cannot feed data with correct raw samples

Hi

I want to feed data from a WebSocket, one NAL unit at a time, from a ws proxy server which forwards an RTP stream.

// index.html
this.wsSocket = new WebSocket("ws://localhost:9999");
this.wsSocket.binaryType = "arraybuffer";
this.wsSocket.addEventListener("message", this.feed);

feed(event) {
  let data = new Uint8Array(event.data);
  jmuxer.feed({
    video: data,
  });
},

I'm sending this sequence which works properly in broadway.js decoder.

// example payloads from server
<Buffer 00 00 01 21 9a 10 3c 00 70 87 c0>
<Buffer 00 00 01 21 9a 12 3c 00 70 87 c0>
<Buffer 00 00 00 01 27 42 80 28 95 a0 14 01 6e c0 40 78 91 35>
<Buffer 00 00 00 01 28 ce 02 5c 80>
<Buffer 00 00 01 25 88 80 00 01 ea b4 ff ff fc 3d 14 00 0d ff ff fe 4f ff fc 9f ff f9 3f ff f2 7f ff e4 ff ff c9 ff ff 93 ff ff 27 ff fe 4f ff fc 9f ff f9 3f ... 10315 more bytes>
<Buffer 00 00 01 21 9a 02 3c 00 70 87 c0>
<Buffer 00 00 01 21 9a 04 3c 00 70 87 c0>

I don't get any error messages, just no image. After a quick debug session, I found that no SPS and PPS frames are picked up. Are they essential to get things working, or is the fps option sufficient?

I've read your answer regarding reading SPS and PPS (#36 (comment)), but they aren't read in my example when I put a breakpoint on parseNAL().

The correct type is picked up if I read ntype in the nalu.js constructor.

This is what my converted buffer looks like when fed:
Uint8Array(17) [0, 0, 1, 39, 66, 128, 40, 149, 160, 20, 1, 110, 192, 64, 120, 145, 53] - SPS
Uint8Array(8) [0, 0, 1, 40, 206, 2, 92, 128] - PPS
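SPS and PPS carry the decoder configuration, so MSE playback generally cannot start without them. One thing worth trying (a sketch, not jmuxer's documented contract): buffer individual NAL units and feed SPS + PPS + IDR together as a single Uint8Array, so the parameter sets always arrive with, and before, the first coded slice. concatNalUnits is an illustrative helper:

```javascript
// Concatenate several NAL units (each with its own start code) into one
// contiguous buffer suitable for a single feed() call.
function concatNalUnits(units) {
  const total = units.reduce((n, u) => n + u.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const u of units) {
    out.set(u, offset);
    offset += u.length;
  }
  return out;
}
```

Usage would be along the lines of `jmuxer.feed({ video: concatNalUnits([sps, pps, idr]) })` once all three units have arrived.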

Tool to split videos?

Is there a tool that I can use to create chunks in the proper format needed for server.js? Also, which kind of video/audio codecs and container should I use?

Firefox: NS_ERROR_DOM_MEDIA_FATAL_ERR

With the latest version of jmuxer, I'm seeing Firefox (version 61.0.1 64 bit on Win10) throw this MediaError every time I destroy a jmuxer instance now after feeding it some frames.

The error code is 3 (MEDIA_ERR_DECODE)
The error message is:

NS_ERROR_DOM_MEDIA_FATAL_ERR (0x806e0005) - class RefPtr<class mozilla::MozPromise<class mozilla::media::TimeUnit,class mozilla::MediaResult,1> > __cdecl mozilla::MediaSourceTrackDemuxer::DoSeek(const class mozilla::media::TimeUnit &): manager is detached.

This did not occur with a jmuxer release from 2 months ago.

Recommended usage for minimizing memory usage

I've got 3 instances of JMuxer running on a webpage (a security system). After 1 hour of usage, the page's memory usage starts to approach 1 GB.

After 2 hours of usage: screenshot
(The high CPU usage in the screenshot is a result of me running video.src = '' as a test, not part of the bug here).

The 1.6 GB of memory in the screenshot is not a result of anything being stored in memory by the script (the page only consumes about 7 MB of heap); I suspect it's because MSE is storing a large amount of buffered video in memory. Running console.log(video.buffered.end(0)) results in 7500, i.e. around 2 hours of buffered content held in memory per instance. And I'm able to seek using video.currentTime = 3000.

This is my init:

        const muxer = new JMuxer({
            node: videoElement.current,
            mode: 'video',
            flushingTime: 0,
            clearBuffer: true,
            fps: 12.8,
            debug: false
        })

If this isn't a bug, would you happen to have any suggestions regarding how to manage memory usage?
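Not a jmuxer feature, but one common mitigation is to keep playback pinned near the live edge so old buffered ranges become eligible for eviction. A sketch, where secondsBehindLive is an illustrative helper and the threshold is an assumption to tune:

```javascript
// secondsBehindLive is an illustrative helper, not jmuxer API.
// `buffered` is a TimeRanges-like object (e.g. video.buffered).
function secondsBehindLive(buffered, currentTime) {
  if (buffered.length === 0) return 0;
  // Distance from the playhead to the end of the last buffered range.
  return buffered.end(buffered.length - 1) - currentTime;
}
```

Paired with something like `setInterval(() => { if (secondsBehindLive(video.buffered, video.currentTime) > 5) video.currentTime = video.buffered.end(video.buffered.length - 1) - 0.5; }, 1000);` this keeps the playhead from drifting hours behind, though whether the browser then reclaims the memory depends on its MSE eviction policy.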

1920x1080 stream with choppy video

Hi Samir, thanks for the great lib!

I have recently played with jmuxer, but experienced very choppy video when using it with ffmpeg desktop streaming and full-HD video. I put together a small demo to showcase the issue:

demo.js (node demo.js)

const WebSocket = require('ws');
const spawn = require('child_process').spawn;
const shell = "ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop "
	+ "-an -c:v h264 -preset ultrafast "
	+ "-f rawvideo -";

new WebSocket.Server({port:8080}).on('connection', function(ws) {
	spawn(shell, [], {shell:true}).stdout.on('data', data => {
		ws.send(data);
	});
});

demo.html

<!DOCTYPE html>
<html>
<head><script type="text/javascript" src="jmuxer.js"></script></head>
<body><video id="player" width="640" height="480" controls="false" autoplay muted></video></body>
<script>
window.onload = function startup() {
	const jmuxer = new JMuxer({node:'player', mode:'video', flushingTime:1});
	const ws = new WebSocket("ws://localhost:8080/");
	ws.binaryType = "arraybuffer";
	ws.addEventListener('message', event => {
		jmuxer.feed({video:new Uint8Array(event.data)});
	});
}
</script>
</html>

The rendered video looks as follows (screenshot: Clipboard01):

When feeding the same stream into ffplay (ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop -an -c:v h264 -preset ultrafast -f rawvideo - | ffplay -i -) it renders just fine.

Do you have any idea what can be done to make the rendering smooth?

It's me again!

Hi, it's me again! I'm sorry.
Your project runs well, but there is a problem with delay: it needs to collect enough frames before it can display pictures continuously, and it does not display them frame by frame. My project is not a continuous video stream, so sometimes there is only a single frame that needs to be displayed, and in that case your library displays nothing.
Can you help me, please?

Excellent demo

But can I ask how to serve an RTSP stream as a WSS protocol service?

colors are wrong, support for YUV420J?

I just tried out this library and it seems to work great, except that the colors are wrong. I am not using JS or FFmpeg to encode the image; I am using JCodec in Spring and converting my RGB source to YUV420J before encoding. Did anyone experience anything similar?

I checked whether my input is BGR, which doesn't seem to be the case. What colorspace is jmuxer expecting? Do I need to convert something first?

Thanks in advance.

Extract individual frames as images from h264 stream

Hi, first of all thank you for creating this amazing lib!

I have a use case where I need to feed an h264 stream and get individual frames back as images. Is it possible to do this with jmuxer? Otherwise, do you know a way I can play with the lib to get the needed information?

(I was already able to feed the stream into a video tag)

Thank you.
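jmuxer itself only remuxes H264 into MP4 for an MSE-backed video element; it does not decode to pixels. Once the stream is playing in a `<video>` tag, though, individual frames can be grabbed with a canvas. A browser-side sketch, where captureFrame is an illustrative helper:

```javascript
// Grab the current frame of a playing <video> element as a PNG data URL.
// Browser-only: relies on document and the 2D canvas API.
function captureFrame(video) {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d').drawImage(video, 0, 0);
  return canvas.toDataURL('image/png');
}
```

Calling this from a timeupdate or requestVideoFrameCallback handler yields one image per displayed frame, at the cost of a readback per capture.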

Why doesn't the frame change?

My WebSocket delivers raw h264 frame data, but only the first frame is shown; the subsequent frames are never displayed.

When an H264 decode error happens, it can't continue to decode and show video afterwards

The WebSocket URL ws://183.63.123.68:8071/4&type=H264_RAW gives an error, while ws://183.63.123.68:8071/3&type=H264_RAW is ok.

Below is index.html:

<title>JMuxer demo</title>

jMuxer Demo

Sample demo node server is running on heroku free hosting



<script>
var jmuxer;
var ws;
function parse(data) {
	var input = new Uint8Array(data),
		dv = new DataView(input.buffer),
		video;
	video = input.subarray(16); // here is different
	return {
		video: video
	};
}

window.onload = function() {
	jmuxer = new JMuxer({
		node: 'player',
		flushingTime: 150,
		mode: 'video'
	});
	ws = new WebSocket('ws://183.63.123.68:8071/4&type=H264_RAW');
	ws.binaryType = 'arraybuffer';
	ws.addEventListener('message', function(event) {
		var data = parse(event.data);
		jmuxer.feed(data);
	});
}
</script>
<script type="text/javascript" src="jmuxer.min.js"></script>

Managing video duration slider

Using the duration of each single frame, is it possible to move the slider of the <video> control so that I can seek backwards?
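jmuxer does not expose a seeking API of its own, but the element's buffered ranges tell you how far back MSE still holds data, which is the region a slider can usefully cover. A sketch, where clampToBuffered is an illustrative helper:

```javascript
// Clamp a requested seek time to what MSE still has buffered.
// `buffered` is a TimeRanges-like object (e.g. video.buffered).
function clampToBuffered(buffered, t) {
  for (let i = 0; i < buffered.length; i++) {
    if (t >= buffered.start(i) && t <= buffered.end(i)) return t;
  }
  // Outside every range: fall back to the end of the last range (or 0).
  return buffered.length ? buffered.end(buffered.length - 1) : 0;
}
```

A slider handler could then set `video.currentTime = clampToBuffered(video.buffered, requestedTime)` so seeks never land on evicted data.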

Invalid ADTS audio format

Input object must have video and/or audio property. Make sure it is not empty and valid typed array

I'm getting this error when running server-h264.js

The player does stream with server.js, though.

I'm using Windows 10 if that matters.

Could I feed frames discontinuously?

The client accepts frames from a WebSocket, and I have to do some operations to get more frames. However, the video tag seems to refuse to play new frames when I perform those operations.

Will it be easy to support web worker?

First, I want to say thank you for your sweet lib, which has really helped me a lot.
What I want is to display several 1080p videos on the page at one time.
It works well when there's only one. I checked the system monitor and found that one CPU core was fully used. I tried a few component-based approaches to run the jmuxer instance in an iframe, but it didn't work.
Is it easy to make JMuxer support web workers, or could you give me a hint so I can do it myself?
I know I have no right to ask you to do anything extra for me... Just want to say thank you. What you did has already helped me so much.

play only audio aac is still waiting

I use jmuxer to play AAC audio only, no video data. When playback reaches 53:27, the audio player goes into a waiting state even though I found there is enough data to play in the buffer. I tried other data, and it still enters the waiting state at 53:27. The player's readyState is 2, but the buffer has data. I cannot solve this.

Safari Memory Usage

Hello everyone,

First of all, @samirkumardas, thanks for this great library. I just implemented it to play raw H264 frames, and it works perfectly. I am getting data from a WebSocket and feeding the video element with jmuxer. There is no problem with Firefox and Chrome, but when I use Safari, memory usage increases dramatically. Checking the video element's controls, I noticed that Chrome and Firefox clear the video: the duration always stays between 2 and 3 seconds. In Safari, it keeps increasing.

Do you have any suggestions to make? I tried to debug it but I couldn't find any clue about this issue.

A strange question

It didn't run before because I hadn't called the play method of the video element. After it ran normally, I turned on your debug mode, and when the first log prints out (video remove range [0 - 6.097455)), playback stops at that point and doesn't continue.

Improvement - allow "live" streaming by ditching data

I would like to offer an improvement to support more real live streaming.

Assume the following :

  1. jMuxer is getting its data from some external source via WebSocket.
  2. jMuxer refreshes frequently and with constant 60 fps.
  3. the

The problem:
While all works fine and well, changing Tabs in chrome (for example) will cause the

Suggestions: (one of the following i assume)

  1. Skip to the end of the video (flag) when tab gets focus again /
  2. Ignoring any "data" (not config) NAL Units coming when tab isn't focused.
  3. Not appending any data (again not config) to the MSE while the

These are just some of the suggestions that came to my mind; I would be happy with anything that solves that "problem".

Many thanks.

Chunk extraction algorithm

In the server-h264.js demo, the chunks are sliced so the headers end up at the end of each chunk instead of the beginning. Is this normal (they're essentially footers instead of headers)?

This seems also to be the case for the NALs inside /raw/.

In my personal project, I'm transmitting the NALs with the 0x000001 or 0x00000001 start codes at the beginning of each NAL unit (as headers), but nothing is rendered (and there are no errors).

I'm using H264 with baseline profile, and I'm including PPS and SPS headers on every I-frame.
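For reference, the usual Annex-B convention is indeed that each NAL unit begins with its own start code. A splitter sketch (splitAnnexB is a hypothetical helper, not part of jmuxer) that cuts a buffer so every returned chunk starts with its 0x000001 or 0x00000001 prefix:

```javascript
// Split an Annex-B byte stream into NAL units, keeping each unit's
// start code (3- or 4-byte) at the beginning of the chunk.
function splitAnnexB(buf) {
  const starts = [];
  for (let i = 0; i + 3 <= buf.length; i++) {
    if (buf[i] === 0 && buf[i + 1] === 0 &&
        (buf[i + 2] === 1 || (buf[i + 2] === 0 && buf[i + 3] === 1))) {
      starts.push(i);
      i += buf[i + 2] === 1 ? 2 : 3; // skip past this start code
    }
  }
  const nals = [];
  for (let k = 0; k < starts.length; k++) {
    const end = k + 1 < starts.length ? starts[k + 1] : buf.length;
    nals.push(buf.subarray(starts[k], end));
  }
  return nals;
}
```

Comparing the output of something like this against the demo's chunking could confirm whether the "footer" slicing is what keeps the stream from rendering.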

Video is really choppy once fed to jmuxer

I created 1-second segments of an mp4 file. Concatenating them using ffmpeg works fine; however, feeding them into jmuxer results in really choppy playback. 2-second segments work fine, but I really want 1-second segments.
My guess is that the segments are overlapping, so it goes

[ 0=1=2=3=4=5 ] [=3=4=5=6=7=8=]
     seg 1           seg 2

If that makes sense at all?
Thanks!

jmuxer with h264 and opus

Hi Samir, first I want to congratulate you for developing this tool. It's awesome. The reason for this issue is to ask whether it is possible to adapt jmuxer so it can receive Opus audio instead of AAC. I think this feature would be incredibly useful, because we would be able to send the mediaStream from the video element straight through WebRTC, which doesn't support AAC right now.

Latency

Hello,

Firstly, your project is the only thing I have managed to get working for the scenario I need. Basically I am encoding camera frames into H.264 using lib/ffmpeg in C/C++ and pushing those over websockets to a browser client. I originally tried to push mp4 container data directly into MSE but with no luck so far.

The problem I am facing is there seems to be too much latency (even if I am directly connected over Ethernet). I am not sure if the delay is on the C++/encoding side, websockets, or the remuxing/browser side.

Do you recall what kind of latency you were able to achieve with jmuxer?

PS. I have set flushingTime to 1 ms, which helped compared to the default of 1500, but there still seems to be at least 1 second of latency or more (I have yet to profile to get an actual value; I am just going off the visual latency I see).

\jmuxer\example\jmuxer.min.js code

Hello, maybe it's very presumptuous, but can I look at the uncompressed version of \jmuxer\example\jmuxer.min.js? The readability of the compressed version is too low.
Thank you very much.

Wrong colors on Android emulator streaming. (API 30)

Hi, first of all thanks for this amazing project, it's a gem.

Secondly, I am using scrcpy to stream Android device data (raw H264 NAL units) through a WebSocket.
There is an issue where, for some emulators (and maybe real devices, I haven't checked yet), the picture shows wrong colors.
The emulator below is a Pixel 2, API 30; another device that showed this issue is a Pixel C, API 30.
The same devices with API 28 work fine.
All these emulators work fine in the original scrcpy client (which works a bit differently in terms of showing the raw data).

Decoding artefacts

First of all, thanks for this great library, I've got it working for one of my projects for live streaming video data. There's a couple of instances where I'm seeing a lot of decoding artefacts like this:

I'm not seeing any errors on the console or in chrome://media-internals. The data that generally exhibits this issue is at non-standard resolutions; e.g. the screenshot above was from video at 42x90. For video at 1680x1050 everything works great.

I've tried passing the raw h264 data (annex b, baseline) through ffmpeg to generate an mp4. This generates a video without any of the same encoding issues shown above, so I think the h264 data I have is good. I'm using this command for ffmpeg:

ffmpeg -framerate 10 -i /tmp/frames/all.h264 -c copy ./output.mp4

Any idea how I can resolve these decoding issues? Happy to provide sample data if that's helpful.

H265 not compatible

Is there any way to make this library work with h265, or can you suggest an h265 muxing library like JMuxer? JMuxer works brilliantly, thank you.

x265 support

I was wondering if it's possible to add x265 support, since the file format is very similar.
Also, can this project handle multiple audio tracks?

Minimal latency with a WebSocket raw H264 stream ?

Hello,

What is the minimal latency of a player implementation if I use a WebSocket raw H264 stream?
With Broadway it's very low, but it consumes more CPU power than a native HTML5 player.

Pascal

play only audio aac will auto pause

Playing only AAC audio auto-pauses: when doCleanup() runs at controller/buffer.js:41, playback pauses; if I remove doCleanup, it's ok.
The AAC format is channels = 1, freq = 8000.
I think it is not necessary to call doCleanup.

Live streaming with jmuxer

You mentioned that the purpose of developing this project was to make this possible:

It was needed to play raw H264 and AAC data coming from live streaming encapsulated into a custom transport container in a project

I would like to use a live stream from an enigma2 receiver source which supports H264/AAC/ADTS.

Are there any examples publicly available on how to use a stream resource instead of a static file or static chunks?

Anyway, thank you very much for your great work!

I'd like to ask you a question

Is it possible to transmit the h264 stream frame by frame through a WebSocket, with each frame fed to jmuxer?

Without audio, it can't play

In docs/index.html

function parse(data) {
  var input = new Uint8Array(data),
      dv = new DataView(input.buffer),
      duration,
      audioLength,
      audio,
      video;

  duration = dv.getUint16(0, true);
  audioLength = dv.getUint16(2, true);
  audio = input.subarray(4, (audioLength + 4));
  video = input.subarray(audioLength + 4);
  return {
    audio: audio,
    video: video,
    duration: duration
  };
}
When it returns { video: video, duration: duration } instead, it can't play the video.

Play live video delay

1. It cannot play when the first frame of data is an I-frame.
2. There is a delay of almost 2 seconds when playing the video in real time, but I can't find where the problem is.

Can not play without audio

I create jmuxer like this:
var jmuxer = new JMuxer({ node: 'player', mode: 'video' });
but I still get the error: Input object must have video and/or audio property. Make sure it is not empty and valid typed array
My browser is Google Chrome 80.0.3987.122 (Windows).

how to use with ffmpeg mpegts websocket

Hi, I'm working on a project where I use ffmpeg to stream mpegts segments to my local server, which then sends them to the frontend via WebSocket (using mpegjs).

I want to switch to jmuxer to check whether performance is better. Can you help me convert the video and audio and pass them over the WebSocket?

at the moment i'm using this command for ffmpeg

-y -threads 8 -thread_queue_size 8096 -re -f rawvideo -vcodec rawvideo -pixel_format bgra -video_size 1024x576 -i pipe:0 -i ${payload.source} -filter_complex [1:v][0:v]overlay[out] -map [out] -vcodec libx264 -map 1:a:0 -f mpegts -codec:v mpeg1video -b:v 700k -q 1 http://localhost:3000/mystream

Basically I send raw frames to the ffmpeg pipe, overlay them onto a video, and send it as mpegts using mpeg1video as the codec to my localhost server, which forwards the data to the WebSocket.

Has anyone ever done something similar using jmuxer? As I understand it, I need a way (codec) to translate into the jmuxer packet format:

| 2 bytes       | 2 bytes           |                  |                   |
| Duration (ms) | Audio Data Length | Audio Data (AAC) | Video Data (H264) |

thanks
Andrea
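The packet layout quoted above can be assembled on the Node side roughly like this. A sketch: packChunk is a hypothetical helper, and little-endian byte order is assumed to match the dv.getUint16(offset, true) reads in the demo parser:

```javascript
// Pack one chunk of the custom transport container:
// [duration:u16le][audioLength:u16le][AAC bytes][H264 bytes]
function packChunk(durationMs, aac, h264) {
  const header = Buffer.alloc(4);
  header.writeUInt16LE(durationMs, 0);  // 2 bytes: duration (ms)
  header.writeUInt16LE(aac.length, 2);  // 2 bytes: audio data length
  return Buffer.concat([header, Buffer.from(aac), Buffer.from(h264)]);
}
```

Each packed chunk could then be sent over the WebSocket as binary, with the browser-side parse() from the demo recovering duration, audio, and video.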
