opendatacam / opendatacam-mobile

OpenDataCam mobile app for Android
Home Page: https://play.google.com/store/apps/details?id=com.opendatacam
License: MIT License
Starting to gather some research / progress notes in GitHub issues... for future reference, and to help me organize the craziness in my head.
After benchmarking plenty of example apps / frameworks etc., I came to the conclusion, at this time of writing / state of my knowledge, that the fastest and most portable way to run YOLO on Android (and iOS, if the Android app is successful) is to use the NCNN framework from Tencent: https://github.com/Tencent/ncnn . Tencent is a company on the scale of Google in China, and the license is MIT.
There is a very nice example app showcasing several neural networks: https://github.com/cmdbug/YOLOv5_NCNN
Think of NCNN as a framework to run a neural network (like Darknet, TensorFlow or PyTorch), but super optimized for mobile phone CPU inference.

It is very, very optimized for Android and iPhone CPUs... I get 17 FPS for YOLOv4-tiny on a Xiaomi Mi 8 (a €200 phone). For comparison, with TensorFlow Lite I think I get 2 FPS, so this is crazy magic... The other interesting thing is that it also aims to support lots of platforms https://github.com/Tencent/ncnn#supported-platform-matrix , maybe something to watch for the future, to also run on Raspberry Pis, Jetsons, the web... Right now it is not very performant on GPUs, but they are working on it. Another very impressive demo is that you can ship NCNN on the web via WebAssembly: https://github.com/nihui/ncnn-webassembly-nanodet (here it is running NanoDet, a new lightweight YOLO-like neural network that is almost as accurate but faster: https://github.com/RangiLyu/nanodet ).
The other super great news about being able to run YOLOv4-tiny on mobile is that you can train custom weights the same way and then convert them to NCNN-compatible weights... plus we already know that YOLOv4-tiny is accurate enough.
The code from https://github.com/cmdbug/YOLOv5_NCNN is licensed under GPLv3, but I think the actual YOLOv4-tiny C++ code, which is the only part we need, is available as MIT code here: https://github.com/Tencent/ncnn/blob/master/examples/yolov4.cpp . So this is to be investigated to determine the future license of OpenDataCam mobile.
Here is how to integrate it on Android:
This sounds super easy, but I spent the last 2 weeks really understanding how to do this in practice, by studying the code of https://github.com/cmdbug/YOLOv5_NCNN , Android app development and the camera APIs.

The good news is that I have it mostly figured out. I ended up writing my own "glue" code using the latest version of the CameraX API, which simplifies things a bit and also supports things that are not supported in the example app, like camera / device orientation (it was only working in portrait). It is still a bit buggy though: the coordinates of the boxes are a bit off, and there is still some aspect-ratio magic I need to figure out.

I also put together a working WebView bridge using Capacitor (https://capacitorjs.com/), and I'm able to render an HTML canvas on top of the camera preview which draws the boxes...

The whole thing seems as performant as the native Android demo app, so I guess the "core" proof of concept is mostly under control now... Demo to try soon!
Right now we hardcode the app path on Android, for both NeDB and Next.js:

/data/data/com.opendatacam

This works if the app is installed on internal storage, but I think this can fail if it is installed on the SD card.

Need to see how to make this robust.
Hi @tdurand! Very nice! Here is some feedback for the nitty gritty details :)
General
Welcome
Live view
Counter
Data
Menu
Build command failed.
Error while executing process C:\Users\RD\AppData\Local\Android\Sdk\cmake\3.10.2.4988404\bin\ninja.exe with arguments {-C D:\AndroidStudioProjects\opendatacam-mobile\android\app.cxx\cmake\debug\arm64-v8a nodejsmobile yolov4}
ninja: Entering directory `D:\AndroidStudioProjects\opendatacam-mobile\android\app.cxx\cmake\debug\arm64-v8a'
ninja: error: 'D:/AndroidStudioProjects/opendatacam-mobile/android/app/src/main/cpp/libnode/bin/arm64-v8a/libnode.so', needed by 'D:/AndroidStudioProjects/opendatacam-mobile/android/app/build/intermediates/cmake/debug/obj/arm64-v8a/libnodejsmobile.so', missing and no known rule to make it
e.g. phone is on 192.168.0.157 ... now I would have expected that I can open https://192.168.0.157:8080/ in a browser on a different device and enjoy ODC there too. I guess due to the tight integration with Android e.g. cam preview this is not possible, or?
To benchmark https://github.com/xiang-wuu/ncnn-android-yolov7
Once the first alpha / beta is ready, gather a list of testers and build a list of "tested" devices with the FPS performance for each of them in a table.

I think Google Play has a nice way to do this, where I just push a new build of the app and it gets distributed to the beta tester list, with a form to fill in with feedback.
Hi, I need to use an external USB camera for input. Can you share where I can tweak this? Thank you.
This will be open source, but we need to figure out under which LICENSE... Mostly depends on #2 ...
Right now it is a bit blurry... need to work out the devicePixelRatio and the ctx.scale(), but it's not straightforward with the current implementation.
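For reference, the usual devicePixelRatio fix for a blurry canvas is to size the backing store in physical pixels while keeping the CSS size in logical pixels, then scale the context once. A sketch of that pattern (the canvas and ctx here would be the overlay canvas drawn on top of the camera preview):

```javascript
// Sketch: make the box-overlay canvas sharp on high-DPI screens.
function scaleCanvasForDpr(canvas, ctx, cssWidth, cssHeight, dpr) {
  canvas.width = Math.round(cssWidth * dpr);   // backing store in physical pixels
  canvas.height = Math.round(cssHeight * dpr);
  canvas.style.width = cssWidth + 'px';        // displayed size stays in CSS pixels
  canvas.style.height = cssHeight + 'px';
  ctx.scale(dpr, dpr);                         // drawing code keeps using CSS pixels
}
```

In the browser, dpr would be window.devicePixelRatio; the aspect-ratio mismatch with the camera frame still needs to be handled separately.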
Problem
Life is unfair.

I was happily running OpenDataCam and playing with it in the street... and after 2 minutes I noticed some FPS drops (from 20-17 FPS to 15 FPS), then from 15 to 12, and then from 12 to 9 FPS (this over 10 minutes). It was about 25 °C in the afternoon.

My first thought was "shit, the Node.js dependency is affecting long-term runs"... but then I did the test with the "native" benchmark app and got the same problem.

The issue is caused by CPU thermal throttling, which is well known to game developers on mobile too: the device overheats and then the CPU needs to throttle to avoid damage.

I ran a benchmark over 15 minutes with https://play.google.com/store/apps/details?id=skynet.cputhrottlingtest&hl=fr&gl=US this morning (temperature more like 16 °C outside), and it held up way better...
What can we do about it ?
Test, test and test, and get an idea of the real performance achievable depending on outside temperature / cooling / raw CPU performance.

Add an alert to the user in the app that performance is dropping because of CPU throttling and that they should cool the device to keep max performance. There is an API for this: https://developer.android.com/ndk/reference/group/thermal

Investigate NCNN settings / Android settings: is it possible to have a "constant FPS" mode that does not use the CPU at full speed but rather prioritizes being consistent for several hours? See Tencent/ncnn#1901 (comment)
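Until the NDK thermal API is wired up, a crude fallback for the alert could be to watch the measured FPS itself and flag sustained drops below a baseline. A sketch (the 0.7 threshold ratio is an arbitrary assumption, not a tuned value):

```javascript
// Sketch: flag probable thermal throttling from recent FPS samples.
// baselineFps is the FPS observed when the device is cool.
function isLikelyThrottling(fpsSamples, baselineFps, ratio = 0.7) {
  if (fpsSamples.length === 0) return false;
  const avg = fpsSamples.reduce((a, b) => a + b, 0) / fpsSamples.length;
  return avg < baselineFps * ratio; // sustained drop below 70% of baseline
}
```

The UI could call this with, say, the last 30 seconds of samples and show the "cool your device" hint when it returns true.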
Made some tests running opendatacam-beta-2.apk over hours with a Xiaomi Redmi Note 9 Pro.

The good news is that the phone doesn't get very hot, even over several hours without a fan attached. However, I still couldn't figure out how to run the ODC app for days without the FPS going down.
Time | FPS |
---|---|
Fri 16:00 | 10 |
Fri 16:30 | 8-9 |
Fri 17:40 | 9-10 |
Fri 18:45 | 9 |
Fri 19:35 | 9 |
Sat 7:30 | 6 |
Sat 9:30 | 6 |
Time | FPS |
---|---|
Sat 17:00 | 9-10 |
Sat 17:45 | 8-9 |
Sat 18:45 | 8 |
Sun 7:30 | 7-8 |
Sun 9:40 | 7 |
Time | FPS |
---|---|
Sat 17:00 | 8 |
Sat 20:00 | 8 |
Mon 7:30 | 6 |
Observations
Hi @tdurand, was playing around with the current alpha ... nice!
But I noticed that currently it is not possible to switch to the wide-angle lens of my Android phone, which is not ideal: e.g. it prevents ODC from detecting things on the sidewalk if the phone is mounted on the passenger seat (next to the driver), as the normal/default lens is too telephoto. Hence currently we could only address use cases in a vehicle which "look" ahead or behind, but not sideways.
Expose some settings:
later
See how those configurable settings at runtime could integrate with global OpenDataCam config
When running the first time after install, everything seems OK. But any further start ends with a crash of the application (the app has rights to use camera, location and storage).

Any plans to fix it, and also to support Android 13? In the Play Store it is shown as not supported on Android 13.

Regards,
Peter Dressel
This started as a casual question from @b-g: "are you gonna run Node.js on Android?", which I answered "no no, I will not, I don't think it is possible..."

And yet here I am, exploring the https://github.com/JaneaSystems/nodejs-mobile framework: https://code.janeasystems.com/nodejs-mobile

Turns out you can run Node.js on mobile. I'm evaluating it on the node-mobile branch: https://github.com/opendatacam/opendatacam-mobile/tree/node-mobile

After battling a lot to integrate it alongside the NCNN framework (I think I spent 6 hours on obscure and frustrating CMake build errors yesterday; it is also a C++ dependency), I got a first version running (not OpenDataCam yet, but Node.js).
The idea behind this is to:
Be able to reuse code from the main github.com/opendatacam/opendatacam, to really evaluate if this will be a win for future maintenance. The other option I am considering for now is to port all the Node.js code to the client side and have Next.js export a static app, which from what I see can also be done in a way where we maintain a common part... Either way, if we run Node.js on mobile we also need to diverge from the main OpenDataCam code / build adapters for mobile (not launching Darknet, not using MongoDB, etc.).

Use the mobile device with the OpenDataCam app as a "server", like we do with the Jetson. This is a big feature I didn't think about at first: if we can run the Node.js server on Android, then we can connect to OpenDataCam from another device on the same network and operate OpenDataCam remotely like we do now. This enables more use cases.

Be faster to deliver the app, as the first version deadline is a bit time sensitive. This is not really a clear upside; I think it is the same amount of work to port the Node.js code to client-only as to adapt the Node.js code to work on mobile...
Potential/ Confirmed Drawbacks (for now)
Adds 50 MB to the app size. Not that problematic I think, as NCNN and the YOLO weights are at least 50 MB already, so the app won't be lightweight anyway.

At app install, and for each subsequent update, we need to copy the app assets so nodejs-mobile can access them. From first tests it seems this is slow, as we are copying the whole node_modules folder: at least 30 s right now, but I'm sure this can be optimized.

Next.js is not really made for a nodejs-mobile deployment, so right now we need to hardcode the path where the files are copied in order to start. That was hard to figure out and I'm not sure if it is super portable; it would need more work.

More CPU consumption. Node.js running means less power for other things; not sure if this affects the YOLO performance.

If a web (client-side) version of OpenDataCam with the neural network running via WebAssembly is something we want to do at some point, already working on a client-side-only version of OpenDataCam would be a big step towards this.

More complexity, more problems... fewer dependencies is always best ;-)
Along with JSON + CSV, in order to display location data of where the things have been counted.
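For the map display, the location data could be exported as GeoJSON next to the existing JSON + CSV exports. A sketch, assuming a hypothetical record shape { lat, lng, class, counted } rather than the actual OpenDataCam schema (note GeoJSON puts longitude first in coordinates):

```javascript
// Sketch: turn counting records with a GPS fix into a GeoJSON
// FeatureCollection that map tools can render directly.
function toGeoJSON(records) {
  return {
    type: 'FeatureCollection',
    features: records.map((r) => ({
      type: 'Feature',
      geometry: { type: 'Point', coordinates: [r.lng, r.lat] }, // [lon, lat]
      properties: { class: r.class, counted: r.counted },
    })),
  };
}
```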
The recent post in #81 opendatacam/opendatacam#81 of the main repo made me wonder whether there is an API on Android to get z-depth infos for pixels via the in-built cam? Maybe we could have some rough 3d "thing" in Android out of the box?
As part of the challenge, we need to have a 75-word max description that will be on the mygalileosolution website... @b-g your help with this would be nice. We need to submit by the end of the week.
They pre-wrote one:
OpenDataCam is an open source tool to automatically count moving objects. The AI-powered solution can process and detect objects in any live video feed. With the help of a user-friendly interface, it's possible to count objects at a certain location or visualize the trajectory on which an object moved through the frame.
But I think this can be improved, as counting is just one of the many use cases, and to reflect that the project is now multi-platform.
My try at this:
OpenDataCam is an open source tool to quantify the world. It can detect, track and count objects in any video feed using AI. It is designed to be an accessible, affordable and energy-efficient solution running locally on smartphones, desktops and IoT devices, without requiring your data to be sent to an external server.
Right now we are displaying 6 mobility classes, but ideally it would be better to just display the 6 most counted classes depending on the data recorded... I'm testing on my desk and would love to see some laptop / coffee / chair OpenMojis ;-)
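Picking the displayed classes from the data could be as simple as sorting the per-class totals and keeping the top 6. A sketch, assuming the counts are already aggregated into a class → total map:

```javascript
// Sketch: return the n most counted class names from recorded data,
// instead of a hardcoded list of mobility classes.
function topClasses(counts, n = 6) {
  return Object.entries(counts)
    .sort((a, b) => b[1] - a[1])  // most counted first
    .slice(0, n)
    .map(([name]) => name);
}
```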
TODO:
Note for my future self:
I added the loading screen, so now we have a much cleaner experience. The transition is still not very good when we load the localhost:8080 view (the app) once the Node.js app has started: there is a quick one-second flicker where we jump from the "fake" loading screen (a simple index.html file) to the "real" one served by Node.

The solution for this is to entirely get rid of the "real" loading screen: by the time Node.js has started on Android, YOLO has already started, so we could directly render the main UI.
But because of some shortcuts I took in the implementation this is not super straightforward to do, as we expect the main UI to be rendered on the client and not on the server / at static generation time. There is some reliance on the window object that needs to be removed, and then we could bypass it.

See some window reliance: https://github.com/opendatacam/opendatacam/blob/master/utils/colors.js#L29
Then we should be able to set isListeningToYOLO to true at init: https://github.com/opendatacam/opendatacam/blob/master/statemanagement/app/AppStateManagement.js#L30 , and it should render the UI: https://github.com/opendatacam/opendatacam/blob/master/components/MainPage.js#L92
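The kind of change needed in utils/colors.js is the standard guard for browser-only globals, so the same module can run during SSR / static generation. A sketch:

```javascript
// Sketch: guard browser-only code so components can also render on the
// server or at static-generation time, where `window` does not exist.
function getDevicePixelRatio() {
  return typeof window !== 'undefined' ? window.devicePixelRatio : 1; // SSR fallback
}
```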
In order to use nodejs-mobile, you first need to copy the Node.js app files to the phone filesystem; you can't start Node.js directly from the APK file. (This has to be done when the app is first installed, and any time the Node.js app changes, i.e. each app update.)

Right now we are using the "recommended" method from nodejs-mobile, which is to ship the app + node_modules folder in the assets directory and recursively copy it to the phone filesystem: https://code.janeasystems.com/nodejs-mobile/android/node-project-folder#copy-the-nodejs-project-at-runtime-and-start-from-there .

The problem is that with a big node_modules folder with quite a lot of dependencies, like OpenDataCam's, this means copying 24,723 files, 4,709 folders and 142 MB. On top of adding 140 MB to the app size, copying lots of small files is slow: "out of the box", the first app install (and each update) takes 2 min 30 s.
Two things we are looking to improve here:
Turns out there was a low-hanging fruit here: instead of copying all the files recursively, which is slow, we create a zip of the Node.js app at build time and then unzip it on the phone (idea from Stack Overflow: https://stackoverflow.com/a/42415755 ).

I've tried that (c1fcca4) and we go from 2 min 30 s to around 20 seconds, so this is already a huge win.

The only downside is that we need to integrate the zipping into the build process. For now I'm doing it manually with this command:

zip -0 -r ../nodejs-project.zip .
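To integrate the zipping into the build process, one option could be an npm script that chains it after the Next.js build. The script name and the nodejs-project folder layout here are assumptions, not the actual build setup:

```json
{
  "scripts": {
    "build:mobile": "next build && cd nodejs-project && zip -0 -r ../nodejs-project.zip ."
  }
}
```

`-0` stores the files without compression, the same flag as the manual command above, which keeps unzipping on the device fast.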
Thinking about it, having 24,723 files and 4,709 folders for 142 MB in the node_modules folder is a bit crazy, when the client-side part of the app (actually loaded in the browser) is max 5 MB. The server-side part obviously has Node.js as a dependency, but that is not even included in those 142 MB (it is 142 MB + the 30-50 MB of Node.js)... My guess is that what we actually need might be only 5-20 MB in the end.

Normally we never care about this, because we do the npm install on the server and the size of the node_modules folder doesn't matter; what matters is how much you send to the client (see the explanation here: vercel/next.js#14339 ). Here we also want to reduce it.
One tool at our service for this is npm prune --production, which you can run after npm run build; it deletes all the node_modules dependencies listed as "devDependencies" in the package.json: https://github.com/opendatacam/opendatacam/blob/master/package.json#L43

Out of the box, doing this already reduces things a bit, down to 115 MB (121,226,417 bytes) and 22,298 files, 4,231 folders.
What I need to do next is to improve on this beyond npm prune (which is, I think, just a per-dependency copy / do-not-copy). Here is an analysis of the dependency tree: for example, our node-moving-things-tracker is 10 MB because it ships MOT17 benchmark files, while the Node.js app really uses 3 JavaScript files from it, maybe 100 kB max.

No, we won't run MongoDB on Android (for now ;-) )
https://github.com/louischatriot/nedb sounds perfect; we already used it for the Twitter bot. Lightweight, and it was working very well (@b-g recommended it back in the day).
TODO: