Work with hands using ARCore for Jetpack XR

ARCore for Jetpack XR provides information about the user's detected hands, including pose information for each hand and its associated joints. You can use this hand data to attach entities and models to the user's hands, for example, a tool menu.

Obtain a session

Access hand information through an Android XR Session. See Understand a Session's lifecycle to obtain a Session.
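
For example, a minimal sketch of creating a Session from an Activity (see that page for the full set of result types to handle):

when (val result = Session.create(this)) {
    is SessionCreateSuccess -> {
        // Keep a reference to the session for configuration and hand data.
        val session = result.session
    }
    is SessionCreatePermissionsNotGranted ->
        TODO(/* The required permissions in result.permissions have not been granted. */)
}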

Configure the session

Hand tracking is not enabled by default on XR sessions. To receive hand data, configure the session:

val newConfig = session.config.copy(
    handTracking = Config.HandTrackingMode.Enabled
)
when (val result = session.configure(newConfig)) {
    is SessionConfigureConfigurationNotSupported ->
        TODO(/* Some combinations of configurations are not valid. Handle this failure case. */)
    is SessionConfigurePermissionsNotGranted ->
        TODO(/* The required permissions in result.permissions have not been granted. */)
    is SessionConfigureSuccess -> TODO(/* Success! */)
}
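
If configuration fails with SessionConfigurePermissionsNotGranted, request the missing permissions listed in result.permissions and then call configure again. As a sketch, assuming hand tracking data is gated by the android.permission.HAND_TRACKING permission and using the standard ActivityCompat API:

private const val HAND_TRACKING_PERMISSION_CODE = 0 // Hypothetical request code.

fun requestHandTrackingPermission(activity: Activity) {
    // android.permission.HAND_TRACKING is assumed here; prefer the values reported
    // in result.permissions, then retry session.configure() once they are granted.
    ActivityCompat.requestPermissions(
        activity,
        arrayOf("android.permission.HAND_TRACKING"),
        HAND_TRACKING_PERMISSION_CODE,
    )
}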

Retrieve hand data

Hand data is available for left and right hands separately. Use each hand's state to access pose positions for each joint:

Hand.left(session)?.state?.collect { handState -> // or Hand.right(session)
    // Hand state has been updated.
    // Use the state of hand joints to update an entity's position.
    renderPlanetAtHandPalm(handState)
}

Hands have the following properties:

  • isActive: whether or not the hand is being tracked. Check this before reading joint data, as in the sketch after this list.
  • handJoints: a map of hand joint types to their poses. Hand joint poses are specified by the OpenXR standard.
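
For example, a sketch that hides the palm-attached entity (palmEntity, used again in the next section) whenever the hand isn't being tracked:

Hand.left(session)?.state?.collect { handState ->
    if (!handState.isActive) {
        // The hand is not currently tracked, so hide the attached entity.
        palmEntity.setHidden(true)
        return@collect
    }
    // The hand is tracked; read joint poses from handState.handJoints as shown below.
    renderPlanetAtHandPalm(handState)
}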

Use hand data in your app

You can use the positions of the user's hand joints to anchor 3D objects to their hands. For example, to attach a model to the left palm:

val palmPose = leftHandState.handJoints[HandJointType.PALM] ?: return

// The palm pose's down direction points in the same direction as the palm.
val angle = Vector3.angleBetween(palmPose.rotation * Vector3.Down, Vector3.Up)
palmEntity.setHidden(angle > Math.toRadians(40.0))

val transformedPose =
    session.scene.perceptionSpace.transformPoseTo(
        palmPose,
        session.scene.activitySpace,
    )
val newPosition = transformedPose.translation + transformedPose.down * 0.05f
palmEntity.setPose(Pose(newPosition, transformedPose.rotation))

Or, to attach a model to the tip of the user's right index finger:

val tipPose = rightHandState.handJoints[HandJointType.INDEX_TIP] ?: return

// The pose's forward direction points toward the fingertip.
val angle = Vector3.angleBetween(tipPose.rotation * Vector3.Forward, Vector3.Up)
indexFingerEntity.setHidden(angle > Math.toRadians(40.0))

val transformedPose =
    session.scene.perceptionSpace.transformPoseTo(
        tipPose,
        session.scene.activitySpace,
    )
val position = transformedPose.translation + transformedPose.forward * 0.03f
val rotation = Quaternion.fromLookTowards(transformedPose.up, Vector3.Up)
indexFingerEntity.setPose(Pose(position, rotation))

Detect basic hand gestures

Use the poses of the joints in the hand to detect basic hand gestures. Consult the Conventions of hand joints to determine what range of poses the joints should be in to register as a given gesture.

For example, to detect a pinch with the thumb and the index finger, use the distance between the two tip joints:

val thumbTip = handState.handJoints[HandJointType.THUMB_TIP] ?: return false
val thumbTipPose = session.scene.perceptionSpace.transformPoseTo(thumbTip, session.scene.activitySpace)
val indexTip = handState.handJoints[HandJointType.INDEX_TIP] ?: return false
val indexTipPose = session.scene.perceptionSpace.transformPoseTo(indexTip, session.scene.activitySpace)
return Vector3.distance(thumbTipPose.translation, indexTipPose.translation) < 0.05
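
Assuming the snippet above is wrapped in a helper such as detectPinch(session, handState) (a hypothetical name), you can evaluate it on every hand state update:

Hand.right(session)?.state?.collect { handState ->
    if (detectPinch(session, handState)) {
        // React to the pinch, for example by selecting the entity near the fingertips.
    }
}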

An example of a more complex gesture is the "stop" gesture. For this gesture, each finger should be outstretched, that is, the joints in each finger should point in roughly the same direction:

val threshold = toRadians(angleInDegrees = 30f)
fun pointingInSameDirection(joint1: HandJointType, joint2: HandJointType): Boolean {
    val forward1 = handState.handJoints[joint1]?.forward ?: return false
    val forward2 = handState.handJoints[joint2]?.forward ?: return false
    return Vector3.angleBetween(forward1, forward2) < threshold
}
return pointingInSameDirection(HandJointType.INDEX_PROXIMAL, HandJointType.INDEX_TIP) &&
    pointingInSameDirection(HandJointType.MIDDLE_PROXIMAL, HandJointType.MIDDLE_TIP) &&
    pointingInSameDirection(HandJointType.RING_PROXIMAL, HandJointType.RING_TIP)

Keep the following points in mind when developing custom detection for hand gestures:

  • Users may have a different interpretation of any given gesture. For example, some may consider a "stop" gesture to have the fingers splayed out, while others may find it more intuitive to have the fingers close together.
  • Some gestures may be uncomfortable to maintain. Use intuitive gestures that don't strain a user's hands.

Determine the user's secondary hand

The Android system places system navigation on the user's primary hand, as specified by the user in system preferences. Use the secondary hand for your custom gestures to avoid conflicts with system navigation gestures:

val handedness = Hand.getHandedness(activity.contentResolver)
val secondaryHand = if (handedness == Hand.Handedness.LEFT) Hand.right(session) else Hand.left(session)
val handState = secondaryHand?.state ?: return
detectGesture(handState)