Kinect v1 Tracking (open_ptrack, 14 comments, CLOSED)

openptrack commented on August 16, 2024
Kinect v1 Tracking

Comments (14)

nanodust commented on August 16, 2024

what does

echo $KINECT_DRIVER

say?

if openni, before running OPT, make sure to

source ~/.bashrc

if that's not it... if it is at all possible to start from a clean installation of Ubuntu 14.04 without anything previously installed, and then follow the installation steps,
it should work just fine.
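
If you want to sanity-check the driver selection from a fresh shell before launching, something like this works (just a sketch; it assumes KINECT_DRIVER is the environment variable set in ~/.bashrc by the OpenPTrack install scripts, with values such as openni or freenect):

source ~/.bashrc
echo "KINECT_DRIVER is: ${KINECT_DRIVER:-<not set>}"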

mmunaro commented on August 16, 2024

Also check this issue, which is related to problems with the OpenNI driver:
#19

dolsem commented on August 16, 2024

It gave me freenect...
After a lot of pain I finally decided to reboot my PC, and it worked! Perhaps the old assignment was stuck somewhere in memory.
Now I'm on to the next step: trying to decipher the RViz output to make sure everything works before reading the JSON data. When I choose either detection or tracking history it says that N messages were received (where N > 0), but the 3D visualization window remains blank. It might be that I'm new to ROS and just haven't configured it as needed, but another thing is that when I look at the detection window I don't see any squares around humans, just the raw RGB output. Is this normal?

dolsem commented on August 16, 2024

Also, when I run it I get this in my console:

Ground plane initialization starting...
Automatic mode for ground plane estimation.
Ground plane coefficients: 0.0716385, 0.987616, 0.139582, -1.48269.

[ERROR] [1442550811.655085015]: transform exception: "world" passed to lookupTransform argument target_frame does not exist.

Does the "automatic mode" mean that my calibration failed?

And what does the second error mean?

mmunaro commented on August 16, 2024

The ground plane estimation seems to work for you, and it says that your camera is 1.48 meters above the ground (last ground plane coefficient).
The [ERROR] you get is normal at startup; it happens only once, because not all transforms between reference frames have been set yet.
The fact that you see no rectangles around humans in the image and no tracks in RViz is not normal.
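
For reference, the height falls out of the plane equation directly (a sketch, assuming the four coefficients are a, b, c, d of the plane ax + by + cz + d = 0, with (a, b, c) the plane normal in the camera frame): the camera-to-ground distance is |d| / sqrt(a^2 + b^2 + c^2). With the coefficients printed above the normal is already (almost) unit length, so the distance comes out to about 1.48 m; you can check it with:

python3 -c "import math; a,b,c,d = 0.0716385, 0.987616, 0.139582, -1.48269; print(abs(d)/math.sqrt(a*a + b*b + c*c))"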

Does the "automatic mode" mean that my calibration failed?

The "automatic mode" for ground plane estimation is set in the yaml configuration file for people detection. It does not mean that the calibration failed.

Which launch file are you running for tracking?
Are you following this guide?
https://github.com/OpenPTrack/open_ptrack/wiki
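
One way to narrow down where the pipeline stops is to check whether the detection and tracking topics are actually being published (a sketch; the exact topic names depend on your setup, so the name in the second command is a hypothetical placeholder - pick the real one from the list):

rostopic list | grep -i -E 'detect|track'
rostopic hz /detector/detections    # hypothetical topic name, substitute one from the list above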

dolsem commented on August 16, 2024

Yes, I've been following that guide step by step. The launch file I am using is detection_and_tracking.launch in the tracking directory. But there was one additional step I had to take. When I first tried to run the launch file, ROS told me it couldn't find it. So I went to the ROS directory and saw that none of the files mentioned in the compilation script output were there. I then found the launch files in ~/workspace/ros/catkin/source, so I just copied ~/workspace/ros/catkin/source/tracking into /opt/ros/indigo/share along with the other folders I found. After that, ROS stopped complaining.

mmunaro commented on August 16, 2024

From your directories, it seems you did not install OpenPTrack with the provided installation scripts.
You should have a directory tree like this:
~/workspace
-> ros
-> catkin
-> src (or source in your case)
-> open_ptrack
-> detection
-> opt_calibration
-> tracking
-> etc.

I think this is the reason why ROS cannot find OpenPTrack launch files.

Anyway, the launch file you are running is for performing tracking with a single Kinect v1 only, without using more than one sensor and without taking into account any extrinsic calibration of the camera network.
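
Rather than copying packages into /opt/ros/indigo/share, the usual fix is to source the workspace's setup file so ROS can find the OpenPTrack packages where they were built (a sketch, assuming the standard catkin layout shown above and that the package containing the launch file is named tracking, as the directory suggests; your workspace path may differ):

source ~/workspace/ros/catkin/devel/setup.bash
roslaunch tracking detection_and_tracking.launch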

mmunaro commented on August 16, 2024

It is not showing the indentation; this is what I meant:
~/workspace
----> ros
--------> catkin
------------> src (or source in your case)
----------------> open_ptrack
--------------------> detection
--------------------> opt_calibration
--------------------> tracking
--------------------> etc.

dolsem commented on August 16, 2024

I'm sorry, I totally messed up writing the directory paths... So yes, I had exactly the same tree as the one you showed, but it didn't work. I did install it using the scripts. So I actually moved the folders from ~/workspace/ros/catkin/src/open_ptrack/* to /opt/ros/indigo/share, and that's when it found them.
And yes, I'm using a single Kinect, connected to my main computer, running the program from the same computer.

dolsem commented on August 16, 2024

Okay guys, so I thought about what you said about the camera being 1.4 m above the ground, when it actually was more like 1.8, and I recalibrated it. Now it's tracking me just fine, so that's where the problem lay. Thanks for all the help!
Here's another thing: is it supposed to also track... a fan? I have a small black fan, 20 cm in diameter, positioned about 5 m from the camera, and it's tracking it as well for some reason.

dolsem commented on August 16, 2024

One final request. The wiki is quite vague about how the JSON data is delivered. From what I understand, the program records the whole tracking history in JSON, but that means this data gets old with every new frame. Does this mean that there is new JSON data every fraction of a second? Or is it updated, say, once a second? I'm trying to understand how to make my program respond to immediate changes in people's positions. So could you post another page there explaining how it works and what exactly the structure of its output is?

nanodust commented on August 16, 2024

Glad you got it working. It can be a bit sensitive out of the box.
You can now tune parameters for your environment: background subtraction, Haar thresholds...
In this way you can track people, and only people :)

The primary parameters are described in https://github.com/OpenPTrack/open_ptrack/wiki/Imager-Settings

rosrun rqt_reconfigure rqt_reconfigure allows you to make adjustments to them in real time to see what works best, before (manually) saving the new params to the file.
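
If you prefer the command line to the GUI, you can also list the nodes that expose dynamically reconfigurable parameters and change one on the fly (a sketch; the node and parameter names below are hypothetical placeholders - use the list/get commands to find the real ones for your setup):

rosrun dynamic_reconfigure dynparam list
rosrun dynamic_reconfigure dynparam get /your_detector_node
rosrun dynamic_reconfigure dynparam set /your_detector_node some_parameter 0.5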

nanodust commented on August 16, 2024

From what I understand the program records the whole tracking history in JSON,
no - it emits a new JSON object for every detection.

if you want the entire history, you'll have to consume and store that history.

Does this mean that there is new JSON data every fraction of a second?

yes

So could you post another page there explaining how it works and what the structure of its output exactly is?

the structure of the JSON message is given here, along with many examples of consumers in various languages.

what language are you using to consume?
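
If you just want to watch the raw stream before writing a consumer, you can listen on the UDP port from a shell (a sketch; the port number is whatever is set in your OpenPTrack UDP/JSON configuration, shown here as a placeholder):

nc -ul <your_udp_port>    # prints each incoming JSON datagram to the terminal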

nanodust commented on August 16, 2024

As you have Kinect 1 detection working, I'm closing this issue. Please open a new one if you need more help with UDP. Meanwhile, do try some of the examples first, to get a feel for it!
