**HandVector** calculates the similarity between different static gestures on visionOS and comes with a macOS utility class that lets you use hand tracking in the visionOS simulator as well.

HandVector 2.0 is a major update, bringing improved **Cosine Similarity** matching and the new **FingerShape** feature for easier customization.

> Note: HandVector 2.0 has significant API changes and is not compatible with older versions.

You can run the demo in this package to see how to use it, or try the Vision Pro apps on the App Store that use `HandVector` for gesture matching:

1. [FingerEmoji](https://apps.apple.com/us/app/fingeremoji/id6476075901): FingerEmoji lets your fingers dance with emoji; hit an emoji card by making the same gesture with your hand.
2. [SkyGestures](https://apps.apple.com/us/app/skygestures/id6499123392): an innovative app that uses hand gestures to control DJI Tello drones via the Vision Pro platform. It is now [open source](https://github.com/zlinoliver/SkyGestures).

`HandVector 2.0` supports two kinds of gesture matching methods, which differ in their calculation principles and are suitable for different scenarios. They can also be used together in one project:

* **Cosine Similarity**: This method matches each joint of the specified fingers precisely, using each joint's transform matrix relative to its parent joint, resulting in high accuracy. Advantages: high precision, applicable to fingers and wrist. Disadvantages: poor interpretability, and the matching range is difficult to adjust.
* **FingerShape**: Referencing Unity's [XRHands](https://docs.unity3d.com/Packages/com.unity.xr.hands@1.5/manual/index.html) framework, this method simplifies the finger shape into five parameters: `baseCurl` (curl at the base of the finger), `tipCurl` (curl at the tip of the finger), `fullCurl` (overall curl of the finger), `pinch` (distance of pinching with the thumb), and `spread` (degree of separation from the adjacent outer finger). Advantages: the values are easy to understand and convenient to control and adjust. Disadvantages: it does not fully utilize joint pose information, so it is less precise, and it applies only to the five fingers.

### 1. Cosine Similarity Gesture Matching
`HandVector` supports matching built-in gestures as well as recording and saving custom gestures for later use. Currently, there are 8 built-in gestures: 👆✌️✋👌✊🤘🤙🫱🏿🫲🏻

#### a. Match built-in gestures

```swift
import HandVector

// Get the current hand info from `HandTrackingProvider` and convert it to `HVHandInfo`
for await update in handTracking.anchorUpdates {
    switch update.event {
    case .added, .updated:
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        let handInfo = latestHandTracking.generateHandInfo(from: anchor)
    case .removed:
        ...
    }
}

// Load the built-in gestures from their JSON file
let builtinHands = HVHandInfo.builtinHandInfo

// Calculate the similarity to the built-in gestures; `.fiveFingers` matches
// only the 5 fingers, ignoring the wrist and palm
builtinHands.forEach { (key, value) in
    leftScores[key] = latestHandTracking.leftHandVector?.similarity(of: .fiveFingers, to: value)
    rightScores[key] = latestHandTracking.rightHandVector?.similarity(of: .fiveFingers, to: value)
}
```

The score is in `[-1.0, 1.0]`: `1.0` means a full match where both hands have the same chirality (both left or both right), `-1.0` means a full match where one is a left hand and the other is a right hand, and `0` means no match.
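
To display the result as a percentage, you can take the absolute value and scale it, as the demo did in earlier versions. In the sketch below, `leftScores` is the dictionary filled in the matching loop above, and the `"ok"` key name is an assumption:

```swift
// Convert a raw similarity in [-1.0, 1.0] into a 0...100 display value.
// "ok" is an assumed key for the built-in OK gesture; adjust to the real key names.
let leftOKScore = leftScores["ok"] ?? 0
let displayScore = Int(abs(leftOKScore) * 100)
```
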
#### b. Record custom gesture and match it
Record a custom gesture and save it as a JSON string using `HVHandJsonModel`:
```swift
if let left = model.latestHandTracking.leftHandVector {
    let para = HVHandJsonModel.generateJsonModel(name: "YourHand", handVector: left)
    jsonString = para.toJson()
    // Save jsonString to disk or over the network
    ...
}
```

Next, convert the saved JSON string into the `HVHandInfo` type for gesture matching:
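
A minimal sketch of that conversion, assuming `HVHandJsonModel` is `Codable` and exposes a converter to `HVHandInfo` (the `convertToHVHandInfo()` name is an assumption; check the package for the exact API):

```swift
// Decode the saved JSON string back into an HVHandJsonModel,
// then convert it to HVHandInfo for matching.
// `convertToHVHandInfo()` is an assumed helper name.
if let data = jsonString.data(using: .utf8),
   let jsonModel = try? JSONDecoder().decode(HVHandJsonModel.self, from: data) {
    let recordedHand = jsonModel.convertToHVHandInfo()
    let score = latestHandTracking.leftHandVector?.similarity(of: .fiveFingers, to: recordedHand) ?? 0
}
```
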
### 2. FingerShape Parameter Matching

This method draws significant reference from Unity's well-known XR gesture framework, [XRHands](https://docs.unity3d.com/Packages/com.unity.xr.hands@1.5/manual/index.html).

The definitions of the related parameters are similar:
* **baseCurl**: The degree of curl at the root joint of the finger. For the thumb this is the `IntermediateBase` joint, and for the other fingers it is the `Knuckle` joint; range 0 to 1.
* **tipCurl**: The degree of curl at the upper joints of the finger. For the thumb this is the `IntermediateTip` joint, and for the other fingers it is the average of the `IntermediateBase` and `IntermediateTip` joints; range 0 to 1.
* **fullCurl**: The overall curl of the finger; range 0 to 1.
* **pinch**: How close the fingertip is to a pinch with the thumb; range 0 to 1.
* **spread**: The degree of separation from the adjacent outer finger; range 0 to 1.
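
To make the five parameters concrete, here is a small self-contained sketch (plain Swift, not the HandVector API) of how thresholds on these values can describe a gesture such as pointing:

```swift
// The five FingerShape parameters as plain values, each in 0...1.
struct FingerShapeValues {
    var baseCurl: Float   // curl at the finger's root joint
    var tipCurl: Float    // curl at the finger's upper joints
    var fullCurl: Float   // overall curl of the finger
    var pinch: Float      // closeness to a pinch with the thumb
    var spread: Float     // separation from the adjacent outer finger
}

// A thresholded "pointing" check: index finger extended, middle finger curled.
// The 0.2/0.7 thresholds are illustrative, not values from the package.
func isPointing(index: FingerShapeValues, middle: FingerShapeValues) -> Bool {
    index.fullCurl < 0.2 && middle.fullCurl > 0.7
}
```
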
The test method of `HandVector` is inspired by [VisionOS Simulator hands](https://github.com/BenLumenDigital/VisionOS-SimHands); it allows you to test hand tracking in the visionOS simulator:

It uses 2 things:
1. A macOS helper app, with a Bonjour service
2. A Swift class for your visionOS project which connects to the Bonjour service (it comes with this package and automatically receives the JSON hand data and converts it into the corresponding gestures; HandVector 2.0 updates the mathematical "black magic" behind the new matching algorithm)
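
As a rough illustration of the discovery side of this setup (not the package's actual code; the service type string is a placeholder):

```swift
import Network

// Browse for a Bonjour service like the one the macOS helper advertises.
// "_handvector._tcp" is a placeholder; the real type is defined by the helper app.
let browser = NWBrowser(for: .bonjour(type: "_handvector._tcp", domain: nil), using: .tcp)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        print("Found helper:", result.endpoint)
    }
}
browser.start(queue: .main)
```
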
#### macOS Helper App
To go further, take a look at the documentation and the demo project.

To integrate using Apple's Swift Package Manager, without Xcode integration, add the following as a dependency to your `Package.swift`:
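
For example (the repository URL and version below are assumptions; point them at the actual package source):

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "YourApp",
    dependencies: [
        // URL and version are assumed; adjust to the real HandVector repository.
        .package(url: "https://github.com/XanderXu/HandVector.git", from: "2.0.0")
    ],
    targets: [
        .target(name: "YourApp", dependencies: ["HandVector"])
    ]
)
```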