ultraleap / openxrhandtracking
OpenXR API layer enabling XR_EXT_hand_tracking support using Ultraleap tracking
Home Page: https://www.ultraleap.com
License: Other
I'm trying to adjust the tilt, but there are no instructions on where to put the environment variable. I know I need to use ULTRALEAP_OPENXR_TILT_ANGLE, but I don't know where to set it.
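For reference, API-layer settings like this are ordinary process environment variables: set them in the shell (or in Windows system settings) before launching the OpenXR application from that same environment. A minimal sketch, where the value 10 is a placeholder rather than a recommendation:

```shell
# Hypothetical example: set the tilt angle for this shell session only, then
# launch the OpenXR application from the same shell. On Windows, use `set` in
# cmd, `$env:ULTRALEAP_OPENXR_TILT_ANGLE = "10"` in PowerShell, or the
# System > Environment Variables dialog for a persistent setting.
export ULTRALEAP_OPENXR_TILT_ANGLE=10
echo "tilt angle: ${ULTRALEAP_OPENXR_TILT_ANGLE}"
```

The key point is that the variable must be present in the environment of the process that loads the API layer, so it has to be set before the application starts.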
The API layer does not currently report the linear or angular velocity of the hands. If requested, the XrHandJointVelocitiesEXT structure will be returned with its validity bits unset. This is currently an optional part of the OpenXR XR_EXT_hand_tracking specification, but should be supported in a future release.
XrSystemHandTrackingPropertiesEXT.supportsHandTracking will always return XR_TRUE when this API layer is enabled, regardless of whether a device is connected. This is the correct behaviour as per the OpenXR specification, but it may not be desirable: an application may enable a hand-tracking interface based on this flag, yet the interface will be unusable without a tracker, leading to a poor user experience.
OpenXR currently has no way to indicate a hot-pluggable device. The closest is XrHandJointLocationsEXT.isActive, which indicates whether hand-tracking information is currently available for the requested hand tracker; however, it is only true while a hand is actively being tracked.
The same radius is reported for all the joints in the hand. This can lead to odd-looking visualisations of the hand if these radii are used to size finger proxy objects. This is currently a limitation of the underlying LeapC library, which returns the same value for all joints; possible solutions are being considered.
This API layer does not currently disable its function intercepts when the XR_EXT_hand_tracking extension has not been requested by the application. This should have no practical effect, since it is undefined behaviour for an application to call these functions in that scenario. However, it can cause OpenXR conformance-test failures when the layer is enabled, and should be resolved.
Hi, this isn't specific to Leap Motion's implementation, but I don't know where else would be best to ask. I just wanted some clarification on which coordinate space the joint poses are output in for the hand-tracking extension: are they all in world space, or in local space relative to each joint's parent?
From what I've observed, and from looking at a Microsoft sample, they seem to all be in world space, but relative to the palm joint?
When I call xrLocateHandJointsEXT, I'm passing my app space as the base space.
After getting Win32/WMR working just fine with your API layer, I tested the same code with UWP/WMR! It doesn't work right away; it looks like immediately after xrCreateInstance succeeds, these messages repeat constantly:
Exception thrown at 0x00007FF987D7A799 in StereoKitTest_UWP.exe: Microsoft C++ exception: std::runtime_error at memory location 0x0000000583CFFA50.
Exception thrown at 0x00007FF987D7A799 in StereoKitTest_UWP.exe: Microsoft C++ exception: Ultraleap::OpenXR::UltraleapTrackingError at memory location 0x0000000583CFFDE0.
I then get an error from xrCreateHandTrackerEXT: XR_ERROR_FEATURE_UNSUPPORTED. The application runs fine after that but continues to spam the error, and hand tracking doesn't work.
I am running OpenXR Runtime to play MSFS and DCS.
Will this work for allowing hand tracking with a Leap Motion device on my HP Reverb G2? So far I have been able to get this to work on MSFS under SteamVR but that is not ideal.
The user's virtual hands may appear to move relative to the head when the head moves quickly, even when the user's hands remain still. This is due to the temporal-warping settings and the fact that the hand position and view position are updated at different rates.
To account for this, two settings are exposed via the environment variables ULTRALEAP_OPENXR_TIME_WARP_HEAD and ULTRALEAP_OPENXR_TIME_WARP_VIEW, which control the adjustment to the timestamps used to locate the hands and the view respectively.
Sensible defaults have been chosen for each of these, but since the values can differ between headsets and OpenXR runtimes, per-headset tuning may be required for optimal results. Currently each value is set as an average of empirically found timings for a number of headsets.
An enhancement would be to include a lookup table based on the current runtime or headset to allow further fine-tuning.
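Both settings are ordinary environment variables read when the layer starts, so they must be in place before the application launches. A hypothetical tuning session; the values shown are placeholders, not recommendations, and sensible ranges are headset- and runtime-specific:

```shell
# Hypothetical per-headset tuning: override both time-warp adjustments for
# this shell session, then launch the OpenXR application from the same shell.
# On Windows, the equivalents are `set`/`setx` in cmd or $env: in PowerShell.
export ULTRALEAP_OPENXR_TIME_WARP_HEAD=0
export ULTRALEAP_OPENXR_TIME_WARP_VIEW=0
echo "head=${ULTRALEAP_OPENXR_TIME_WARP_HEAD} view=${ULTRALEAP_OPENXR_TIME_WARP_VIEW}"
```

Adjusting one variable at a time while moving the head with still hands is a practical way to isolate which timestamp adjustment needs tuning.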
Enabling the API layer with Unreal Engine 4.25 leads to a number of difficult-to-diagnose errors. This is because Unreal 4.25 ships with version 1.0.0 of the OpenXR loader, which has a known issue (KhronosGroup/OpenXR-SDK-Source#91) that prevents OpenXR API layers from functioning correctly. Because the bug causes function pointers to resolve to the wrong addresses, it produces a variety of error messages that are not indicative of the underlying cause.
This can be demonstrated to be the underlying cause by replacing the shipped OpenXR loader DLLs with newer versions (1.0.1 or greater).
This will be permanently resolved when Unreal Engine ships with a newer OpenXR loader, and should be closed when this is verified.
I just checked this with both v4 and the v5 preview installed; the wrist joint appears to show up somewhere well down the forearm instead of at the wrist. I'd definitely love to have this forearm information, but perhaps in some other way!
It's been sort of quiet for a year already. Thanks for the great product!
The installation stops at step "Log Permissions" after a translation exception.
Installing:
- OpenXR Directory: DONE
- API Layer DLLs: DONE
- Documentation: DONE
- Registry keys: DONE
- Uninstaller: DONE
- DLL Permissions: DONE
- Log Permissions: FAILED: Exception calling "Translate" with "1" argument(s): "Some or all identity references could not be translated."
There is a known issue (https://steamcommunity.com/app/250820/discussions/8/2523653167130760453/) with the SteamVR OpenXR runtime 1.13.9 not correctly honouring the XrSystemHandTrackingPropertiesEXT structure as an extension to the xrGetSystemProperties call. The next pointer on that call is nullified on return, whereas it should not be modified.
This API layer includes a workaround that detects this issue and corrects the behaviour in an application-transparent way. The workaround should be removed once the SteamVR versions that exhibit this behaviour are obsolete.
Hi there. Just a quick question from me. Hope that's OK.
When using this in conjunction with MSFS and the OpenXR Toolkit I find that the movements are extremely jittery and sensitive. Is there any way to apply/increase smoothing on the output from this layer prior to it being handed over to the OXR Toolkit layer?
I understand that this will trade off some accuracy/responsiveness, of course.
No worries if not and thanks for this great software :)
As there's no Linux build, I wonder if you are working on one? It would be sad if it died like the Orion Linux support.
The OpenXR developer tool for Windows MR is recommended for checking the installation of the API layer (I'm using version 109.2111.23003.0 on Windows 11). However, it reports an unknown OpenXR runtime, runtime manifest C:\Windows\system32\MixedRealityRuntime.json, exception XrResult failure [XR_ERROR_RUNTIME_FAILURE], origin: xrCreateInstance(&createInfo, instance.Put()).
I just went through the setup steps, but ran into a bit of trouble! The API layer fails to load, and I'm not certain why.
I'm running WMR and OpenXR Loader 1.0.9. xrEnumerateApiLayerProperties shows the XR_APILAYER_ULTRALEAP_hand_tracking layer as present within my project, and the WMR OpenXR Dev Tools also report the layer as present. But creating an instance with it in the layer list generates an error.
Here's the error:
Error [GENERAL | xrCreateInstance | OpenXR-Loader] : ApiLayerInterface::LoadApiLayers - failed to find layer XR_APILAYER_ULTRALEAP_hand_tracking
Error [GENERAL | xrCreateInstance | OpenXR-Loader] : Failed loading layer information
Any ideas? I'm still digging, but I figured I'd make note of it here.
Code for reference:
https://github.com/maluoi/StereoKit/blob/develop/StereoKitC/systems/platform/openxr.cpp#L100
Specific commit that adds API layers:
StereoKit/StereoKit@f42e48a