jean343 / node-openmax
Node wrapper for the OpenMAX library
Home Page: https://www.npmjs.com/package/openmax
License: MIT License
I am currently struggling to get the camera component working. Could you provide a small example how to set it up?
I've been meaning to write yet-another-mediaplayer for Raspberry Pi, but I wanted to use Node and Electron/NW.js for it. I guess this goes a long way toward making that a feasible idea, since video decoding through the browser's built-in pipeline performs poorly otherwise.
So I'm definitely going to look into this, thanks!
First, great library and thank you for writing this! Using this library in C++ is painful even if it is a necessary evil at times.
Have you tried this with streaming H.264 video data? My use case for this library would be the streaming-video part of a wireless meeting project. I am sending out the individual NAL units of encoded data until the entire frame is sent, and then it is rendered. Would this project be able to handle that kind of data flow? If so, is there an example?
Thank you.
Hi Jean,
first of all thanks for this great project!
I am trying to play many videos in sequence. The problem is that resources used by VideoDecode/VideoRender do not get released, so Node crashes after a few videos. In Node's output I can see the COMPONENTTYPE() logs when the constructor is called, but the destructors are never called.
What's the best way to achieve releasing resources after a file was played in the 'finish' event?
Hi,
I'm trying to get JPEG images to display to screen, but my code just seems to hang somewhere after the image decode is initiated and nothing appears on screen. Should the following work?
"use strict";
var fs = require('fs');
var omx = require('openmax');

var ImageDecode = new omx.ImageDecode();
var VideoRender = new omx.VideoRender();

omx.Component.initAll([ImageDecode, VideoRender])
  .then(function () {
    ImageDecode.setInputFormat(omx.IMAGE_CODINGTYPE.IMAGE_CodingJPEG);
    fs.createReadStream("frame_01.jpg")
      .pipe(ImageDecode)
      .tunnel(VideoRender)
      .on('finish', function () {
        console.log("Done");
        process.exit();
      });
  });
Thanks,
Chris.
Hi, this code is fantastic! Thanks for building it.
I'm wondering if it's possible to use the VideoDecoder with a raw RGB888 format and then push this to the renderer? Looking at the docs for OpenMax, I think it is possible - it looks like it should be just a case of changing the flag in the following example:
VideoDecode.setVideoPortFormat(omx.Video.OMX_VIDEO_CODINGTYPE.OMX_VIDEO_CodingAVC);
But, I can't find anywhere that defines the flag for RGB888, or even BGR888, ARGB, etc. What should I set it to, or isn't this possible?
As an alternative, what could I send directly to the VideoRender object and bypass the VideoDecoder? Is it YUY2?
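For what it's worth, in the OpenMAX IL spec raw pixel layouts live in eColorFormat (an OMX_COLOR_FORMATTYPE value such as OMX_COLOR_Format24bitRGB888 or OMX_COLOR_Format24bitBGR888) rather than in eCompressionFormat, which is set to OMX_VIDEO_CodingUnused for uncompressed data. A hedged sketch following the readme's getParameter/setParameter pattern — note that the omx.Video.OMX_COLOR_FORMATTYPE enum name is an assumption about this wrapper, and whether the Pi's video_decode component accepts raw input at all is a separate question:

```javascript
// Sketch, not verified against this wrapper: per the IL spec, raw input means
// eCompressionFormat = "unused" and the pixel layout goes in eColorFormat.
var format = VideoDecode.component.getParameter(
  VideoDecode.component.in_port,
  omx.Index.OMX_INDEXTYPE.OMX_IndexParamVideoPortFormat);
format.eCompressionFormat = omx.Video.OMX_VIDEO_CODINGTYPE.OMX_VIDEO_CodingUnused;
// OMX_COLOR_Format24bitRGB888 / OMX_COLOR_Format24bitBGR888 come from the IL
// spec; whether omx exposes them under omx.Video.OMX_COLOR_FORMATTYPE is a guess.
format.eColorFormat = omx.Video.OMX_COLOR_FORMATTYPE.OMX_COLOR_Format24bitRGB888;
VideoDecode.component.setParameter(
  VideoDecode.component.in_port,
  omx.Index.OMX_INDEXTYPE.OMX_IndexParamVideoPortFormat,
  format);
```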
Thanks,
Chris.
Hi Jean,
I've been trying to use this library without success. I'm pretty sure I'm being dumb but can't figure out what :).
How do I run the examples? When I run
node examples/SimpleJPEGToVideo.js
I get "Cannot find module '../'". Presumably this is because the index is a TypeScript file at this point? (This could be rubbish.)
I've also tried to include openmax and run the ImageDecoder but when I do so I get
agricamifu:APP TypeError: this.init is not a function
    at Object.ImageDecode (/srv/opt/agricameraifu/lib/node_modules/openmax/dist/lib/components/ImageDecode.js:13:14)
    at VideoStream._write (/srv/opt/agricameraifu/lib/camerarecorder/motionDetector.js:91:36)
    at doWrite (_stream_writable.js:292:12)
    at writeOrBuffer (_stream_writable.js:278:5)
    at VideoStream.Writable.write (_stream_writable.js:207:11)
    at MotionStream.ondata (stream.js:31:26)
    at emitOne (events.js:90:13)
    at MotionStream.emit (events.js:182:7)
    at MotionStream.sendFrame (/srv/opt/agricameraifu/lib/camerarecorder/jpegMotionDetector.js:156:10) +169ms
Which is caused by this line:
that.imageDecode = omx.ImageDecode();
Any help would be greatly appreciated.
Thanks
Alan
How could I contribute to this project?
I have made a simple example that streams camera frames into PNGs very quickly. There are other applications online, but they are either bulky or very slow.
So I tried the PerfRender file you suggested to test out scaling and such, and that works!
I am running into a few issues, though:
- In its original configuration PerfRender.js was set to 3 rows and 4 columns. My screen isn't HD, so I had to set that to 2x2, otherwise I got a black screen after the first 2 renders started drawing (some kind of "out of bounds" issue?)
- PerfGLGrid.js is only showing a blue screen. The texture property of all the ws (WritableFilter) instances is always undefined :/ (And yeah, I also changed the rows & columns here.)
- When I try PerfGLGrid.js a few times in a row (only ever getting the dark blue screen), I get an Error: OMX_GetState() returned error: OMX_ErrorInsufficientResources. PerfRender.js will then only show 1 video instead of 4. To get everything back to normal I have to reboot the Pi.
Also:
- SimpleVideoDecoderRender.js plays the video 'till the end.
- SimpleVideoDecoderRenderTunner.js stops just about after the "presents..." title, never exiting the script. Seems like it's caused by tunnel-ing the data into the renderer instead of using pipe.
This happened while running the module here: https://npm.runkit.com/openmax
Error: Could not locate the bindings file. Tried:
 - /app/available_modules/1511633220000/openmax/build/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/build/Debug/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/build/Release/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/out/Debug/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/Debug/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/out/Release/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/Release/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/build/default/Node_OMX.node
 - /app/available_modules/1511633220000/openmax/compiled/8.9.0/linux/x64/Node_OMX.node
In the readme you wrote "Use the OMX OMX_GetParameter and OMX_SetParameter to change eCompressionFormat":

var format = VideoDecode.component.getParameter(VideoDecode.component.in_port, omx.Index.OMX_INDEXTYPE.OMX_IndexParamVideoPortFormat);
format.eCompressionFormat = omx.Video.OMX_VIDEO_CODINGTYPE.OMX_VIDEO_CodingAVC;
VideoDecode.component.setParameter(VideoDecode.component.in_port, omx.Index.OMX_INDEXTYPE.OMX_IndexParamVideoPortFormat, format);
I tried to do this in the SimpleVideoDecoderRenderBuffer.js file, after the initAll call:
omx.Component.initAll([VideoDecode, VideoRender])
  .then(function () {
    VideoDecode.setVideoPortFormat(omx.VIDEO_CODINGTYPE.VIDEO_CodingAVC);
    // out_port is undefined?
    console.log('out_port:', VideoDecode.component.out_port);
    var format = VideoDecode.component.getParameter(VideoDecode.component.out_port, omx.Index.OMX_INDEXTYPE.OMX_IndexParamPortDefinition);
    console.log('FORMAT:');
    console.log(format);
    return;
    fs.createReadStream("../../spec/data/video-LQ.h264")
      .pipe(VideoDecode)
      .pipe(TransformFilter)
      .pipe(VideoRender)
      .on('finish', function () {
        console.log("Done");
        process.exit();
      });
  });
But as you can see in the comment, out_port is undefined. What am I doing wrong here?
In case you're wondering: I'm trying to change the scale & position of the video being rendered. While that probably has to happen in the VideoRender (using OMX_IndexConfigCommonScale perhaps?), I'm just wondering why I'm having trouble with this parameter.
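As an aside on the scale-and-position goal: on the Raspberry Pi, Broadcom's video_render component is usually positioned through the vendor-specific OMX_IndexConfigDisplayRegion index (an OMX_CONFIG_DISPLAYREGIONTYPE with dest_rect, fullscreen, and a "set" bitmask), rather than OMX_IndexConfigCommonScale. A heavily hedged sketch, reusing the getParameter/setParameter pattern from the readme — whether this wrapper exposes that index (or a setConfig-style call) this way is an assumption, and the field names come from Broadcom's OMX_Broadcom.h:

```javascript
// Sketch only: position/scale the render window via Broadcom's
// OMX_CONFIG_DISPLAYREGIONTYPE. Every wrapper-level name below mirrors the
// README's eCompressionFormat example and is unverified for this index.
var region = VideoRender.component.getParameter(
  VideoRender.component.in_port,
  omx.Index.OMX_INDEXTYPE.OMX_IndexConfigDisplayRegion);
region.fullscreen = 0; // stop filling the whole screen
region.dest_rect = { x_offset: 100, y_offset: 100, width: 640, height: 360 };
// "set" is a bitmask naming which fields to apply (OMX_DISPLAY_SET_FULLSCREEN,
// OMX_DISPLAY_SET_DEST_RECT, ...); the exact values live in OMX_Broadcom.h.
VideoRender.component.setParameter(
  VideoRender.component.in_port,
  omx.Index.OMX_INDEXTYPE.OMX_IndexConfigDisplayRegion,
  region);
```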