An iOS augmented reality project using the Leap Motion Controller on an iPhone 6 to move virtual furniture around through hand gestures while wearing a smartphone-compatible head-mounted device. Check the Readme for more info (in German).
Instead of detecting a direct collision between the hand and the furniture, another approach uses raycasting to manipulate objects farther away. Outlining the furniture hit by a ray cast from one hand makes it visibly clear what the user is targeting. While the ray intersects the furniture and the user performs and holds the grab gesture, the furniture is attached to the point of collision, so it moves with the ray at a distance determined by the hand.
- make raycast from hand
- make outline react to raycast collision
- make furniture lerp smoothly to the initial point of collision on the ray
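The steps above could be sketched roughly as follows. This is a minimal Python sketch of the math only, not project code; all names (`point_on_ray`, `grab_distance`, the sample coordinates) are illustrative assumptions:

```python
# While the grab is held, the furniture stays at a fixed distance along the
# ray from the hand (the distance at which the ray first hit it), and its
# position is lerped toward that point instead of snapping.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def point_on_ray(origin, direction, distance):
    d = normalize(direction)
    return tuple(o + distance * c for o, c in zip(origin, d))

def lerp(a, b, t):
    """Linearly interpolate between points a and b (t in [0, 1])."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# On grab start: remember how far along the ray the collision happened.
hand_pos = (0.0, 1.0, 0.0)
ray_dir = (0.0, 0.0, 1.0)
grab_distance = 2.5            # distance from hand to the collision point

# Each frame while the grab is held: the attach point follows the ray,
# and the furniture moves smoothly toward it.
target = point_on_ray(hand_pos, ray_dir, grab_distance)
furniture_pos = lerp((0.0, 0.0, 0.0), target, 0.2)
```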
The user should be able to scale the targeted object with a specific gesture. The suggested gesture for this is a 2-handed grab gesture. Similar to how the user can already change the distance of a target object (see ticket #15 ), the distance between the hands is used to determine the scale of the object, i.e. the closer the hands are while performing the gesture, the smaller the object becomes, and vice versa.
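One way to map hand distance to scale is to make the scale proportional to the ratio of the current to the initial hand distance. A minimal sketch under that assumption (function and variable names are illustrative, not from the project):

```python
def hand_distance(left, right):
    """Euclidean distance between the two hand positions."""
    return sum((a - b) ** 2 for a, b in zip(left, right)) ** 0.5

def scaled(initial_scale, initial_dist, current_dist):
    # Moving the hands apart grows the object; bringing them together shrinks it.
    return initial_scale * (current_dist / initial_dist)

d0 = hand_distance((-0.2, 1.0, 0.3), (0.2, 1.0, 0.3))   # 0.4 m at gesture start
d1 = hand_distance((-0.3, 1.0, 0.3), (0.3, 1.0, 0.3))   # 0.6 m now
new_scale = scaled(1.0, d0, d1)                          # grows by ~1.5x
```

Using the ratio rather than the raw distance keeps the object's current size as the baseline each time the gesture starts, so repeated gestures compound naturally.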
It should be investigated how the communication with the Leap Motion between a Win/Mac PC and the smartphone could work. One option would be the approach proposed by Coloreality. Maybe this works with NodeJS.
After a VuMarker has been recognized by the app, it should be possible to pick up the displayed furniture object with a grab gesture in order to position it in the room.
The user should be able to rotate the targeted object with a specific gesture. The suggested gesture for this is a 2-handed pinch gesture. While it is performed, the positions of both hands are tracked, and depending on where they are relative to each other in 3D space, the targeted object changes its rotation in relation to the hand positions.
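One concrete interpretation of "rotation from the relative hand positions" is to take the yaw angle of the vector between the two pinch positions and apply its change to the object. A minimal sketch under that assumption (all names and coordinates are illustrative):

```python
import math

def yaw_between_hands(left, right):
    """Angle (degrees) of the left->right hand vector around the vertical axis,
    i.e. the direction between the hands projected onto the horizontal plane."""
    dx = right[0] - left[0]
    dz = right[2] - left[2]
    return math.degrees(math.atan2(dz, dx))

# Gesture start: hands side by side -> reference angle ~0.
start = yaw_between_hands((-0.2, 1.0, 0.3), (0.2, 1.0, 0.3))
# Right hand moved forward: the object turns by the angle delta (~45 degrees).
now = yaw_between_hands((-0.2, 1.0, 0.3), (0.2, 1.0, 0.7))
delta = now - start   # applied to the furniture's rotation each frame
```

Applying the delta (rather than the absolute angle) means the object keeps its current orientation when the gesture begins, instead of snapping to match the hands.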
In Unity, a stereoscopic display is easy to implement, but in combination with Vuforia this is still relatively difficult on iOS. Maybe it works better with Android? I am currently working through this tutorial: (Link)
To move furniture away from or closer to the user (along the z-axis), another gesture is required. This could work by positioning the furniture with one hand while the other manages its distance. By performing 2 grab gestures at the same time, the distance can be manipulated: bringing the two grabbing hands closer together decreases the distance, while moving them apart increases it.
- add an additional grab gesture for manipulating the distance of the target furniture
- calculate the distance in relation to the distance between both grab gestures
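To avoid simply duplicating the scale gesture's ratio mapping, the distance could instead follow the *change* in hand spacing with a tunable gain. A minimal sketch under that assumption (function names and the gain value are illustrative):

```python
def hand_spacing(a, b):
    """Euclidean distance between the two grabbing hands."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def adjusted_distance(initial_distance, initial_spacing, current_spacing, gain=2.0):
    # Moving the hands apart pushes the furniture away; bringing them
    # together pulls it closer. The gain maps hand metres to scene metres,
    # and the distance is clamped so the furniture never goes behind the user.
    return max(0.0, initial_distance + gain * (current_spacing - initial_spacing))

s0 = hand_spacing((-0.15, 1.0, 0.3), (0.15, 1.0, 0.3))   # spacing at grab start
s1 = hand_spacing((-0.30, 1.0, 0.3), (0.30, 1.0, 0.3))   # hands moved apart
new_dist = adjusted_distance(2.0, s0, s1)                # ~2.6 m, was 2.0 m
```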
This probably won't happen. Hardware-wise we are not equipped for it, because we would need either one of the newer smartphone models or an external device for depth perception. At the moment only the rotation would work.