Body tracking API #87
Thanks for doing this. It makes sense as an analog to the hand tracking input module. I notice in the spec that the hand joints are also included in the XRBodyJoint list. Is that intentional?
Yes, that is intentional. If the user is holding a controller, the API will report emulated hand joints.
Would it make sense to consolidate body and hand tracking in a single API and avoid redundancy?
I don't think so. WebXR hands is designed for input and also has the radii.
I'm assuming this is based off the functionality in the https://registry.khronos.org/OpenXR/specs/1.0/html/xrspec.html#XR_FB_body_tracking OpenXR extension? That seems like a decent place to start, though in cases like this I'm always curious to gauge how well we think this model will extend to other devices in the future.

I'm especially curious about the "wrist-twist" bones: even after reading the OpenXR extension I wasn't quite sure what they were supposed to represent. Also, the proposed text that you wrote includes the line "The tip and wrist joints have no associated bones", and I'm wondering if you meant the wrist-twist joints or just the wrist. Is palm also included in that list of joints with no associated bones?

I'm also with Diego in wondering about the duplication between the hand tracking joints and this. I understand why it's desirable to have the hand joints be part of the skeleton queried here, so that you can get the full body pose in one go; it just feels unfortunate to have to duplicate so many symbols. Then again, hand tracking covers only a single hand, so its symbols aren't given left/right designations, which seems necessary here. I am happy to see that the order of hand joints lines up between the APIs; that will probably prove useful. I would request that we try to keep the verbiage the same between both APIs, though. Currently in the hand tracking API we have, for instance,
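To make the naming-duplication point above concrete: the single-hand joint names from the Hand Input module could, in principle, be derived into left/right-prefixed body joint names with a trivial helper. The `"<side>-hand-<joint>"` scheme below is an assumption made purely for illustration; the actual XRBodyJoint enum values in the proposal may be spelled differently.

```javascript
// Hypothetical helper: derive a body-tracking joint name from a WebXR Hand
// Input joint name plus a handedness. The "<side>-hand-<joint>" naming scheme
// is an assumption for this sketch, not the proposal's actual spelling.
function bodyJointNameFor(handedness, handJointName) {
  if (handedness !== "left" && handedness !== "right") {
    throw new Error(`unexpected handedness: ${handedness}`);
  }
  return `${handedness}-hand-${handJointName}`;
}

// e.g. the Hand Input joint "wrist" would map to "left-hand-wrist" or
// "right-hand-wrist" under this scheme.
```

If the two specs keep joint names and ordering aligned, a mapping like this is all an engine needs to reuse its hand-rig code against the body skeleton.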
Other things I'd like to understand and discuss about the proposal concern how it scales. Does an implementation have to be able to provide data (at least an estimate) for every joint in order to support the API? What if a system only supports upper body tracking? Is a system that estimates the full body with inverse kinematics acceptable to expose here? If so, do we need to inform the user so they can differentiate between an estimated pose and sensor-based tracking? I don't have answers for any of that, but I think it's worth coming to an agreement on before we expose an API like this.
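One way a page could cope with partial support, sketched under the assumption that the body is exposed as a Map-like object from joint name to joint space (as the proposal suggests), is to probe for the joints it needs and drive only what the system actually reports. The joint names used by the caller here are illustrative.

```javascript
// Sketch: filter a wanted-joint list down to what this system reports.
// `body` is assumed to behave like a Map from joint name to joint space;
// systems with upper-body-only tracking would simply omit the lower joints.
function availableJoints(body, wantedNames) {
  return wantedNames.filter((name) => body.has(name));
}

// A renderer could then animate only the returned subset, e.g. an
// upper-body avatar when the foot and leg joints are absent.
```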
If the expectation is that different systems can provide different or partial data, that could make the case for a single "Body Tracking API" or "WebXR Body Input Module" that includes hand tracking.
We could incorporate radii. I see the body as a form of input, the same as hands on their own.
Yes, it's a direct conversion from what this extension returns.
I can get more info on that. I'll reach out to the team
I think that language came from the hands spec.
My main reason is that we will expose an emulated hand if the user is holding controllers. WebXR hands won't expose information in that scenario (although I guess we could change that...)
Yes, I will make those changes.
@toji updated with a831680538590bb9ddd87b02a31584f7063ad948
A common feature request is to give developers access to the user's body.
I made a proposal here: https://cabanier.github.io/webxr-body-tracking/
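A minimal usage sketch of what the proposal describes, under stated assumptions: the feature is requested with a `"body-tracking"` descriptor, and each XRFrame exposes an iterable `body` attribute mapping joint names to spaces. Both the feature string and the `body` attribute shape are assumptions tracking the explainer, not a settled API; `XRFrame.getPose` is the standard WebXR method.

```javascript
// Request an immersive session with body tracking as an optional feature.
// The "body-tracking" feature descriptor is assumed from the explainer.
async function startSession(xr) {
  return xr.requestSession("immersive-vr", {
    optionalFeatures: ["body-tracking"],
  });
}

// Per frame, resolve each reported joint space to a pose. `frame.body` is
// assumed to be iterable as [jointName, jointSpace] pairs and may be null
// when body tracking is unavailable for this frame.
function collectJointPoses(frame, referenceSpace) {
  const poses = new Map();
  if (!frame.body) return poses;
  for (const [name, space] of frame.body) {
    const pose = frame.getPose(space, referenceSpace); // standard XRFrame.getPose
    if (pose) poses.set(name, pose.transform);
  }
  return poses;
}
```

Written this way, the null check also answers the partial-data question above at the per-frame level: a page degrades to no body pose rather than throwing.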
/agenda discuss body tracking