This year has seen the first integration of the Myo armband with visualisation in the HoloLens. Using Bluetooth communication, arm movements can be used to manipulate virtual objects in Augmented Reality. By attaching a virtual object (such as a model of a hand or a lightsabre) to the readings of the gyroscope and accelerometer in the armband, the object can be made to move around the user in a natural and believable way. The position of the centre of this rotation comes from the positional tracking of the HoloLens, effectively giving the headset a new appendage.
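As a rough sketch of the idea (not the lab's actual code), the hologram's rotation can be driven by the armband's fused orientation while its pivot follows a HoloLens-tracked anchor. Here, armRotation is assumed to be updated each frame by whatever Myo binding is in use, and the anchor transform and forearm length are illustrative:

```csharp
using UnityEngine;

// Sketch: a hologram whose rotation follows the armband's IMU and whose
// centre of rotation follows a HoloLens-tracked anchor point.
public class ArmAttachedHologram : MonoBehaviour
{
    // Assumed to be filled each frame by a Myo binding (hypothetical):
    // the armband's fused gyroscope/accelerometer orientation.
    public Quaternion armRotation = Quaternion.identity;

    // Pivot placed via the HoloLens' own positional tracking,
    // e.g. an estimate of the elbow or shoulder position.
    public Transform anchor;

    // Assumed distance from the pivot to the hologram, in metres.
    public float forearmLength = 0.3f;

    void Update()
    {
        // Orientation comes from the armband's sensors...
        transform.rotation = armRotation;
        // ...while the position is the world-anchored pivot plus the
        // rotated forearm offset.
        transform.position = anchor.position
                           + armRotation * (Vector3.forward * forearmLength);
    }
}
```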
An integrated peripheral such as this is thought to have significant potential as a platform for interaction design in AR; the possible applications are as numerous and varied as the uses of our hands in everyday life. We will investigate applications that allow interaction with the AR spatial mapping, as well as those that record, recognise and report on the use of a person's hands using multi-stream mining and classification-driven machine learning tools.
Since the armband is not equipped with a magnetometer, it suffers from yaw drift, to the extent that a hologram eventually ends up no longer aligned with the wearer's arm. The HoloLens' hand-tracking ability was therefore used to re-align and re-position the object before returning control to the armband, whose resolution and sensitivity far exceed those of the HoloLens' hand-tracking software.
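A minimal sketch of such a correction, under the assumption that the hand tracker occasionally supplies a hand position and that the rotation pivot is known: the yaw error between the heading the armband predicts and the heading actually observed is measured in the horizontal plane (only yaw drifts unboundedly without a magnetometer) and applied to subsequent readings.

```csharp
using UnityEngine;

// Sketch of yaw-drift correction: when the HoloLens hand tracker reports a
// position fix, measure the yaw error of the armband's prediction and fold
// it into every later reading until the next fix.
public class YawDriftCorrector
{
    private float yawOffset; // accumulated yaw correction, in degrees

    // Called whenever the HoloLens hand tracker gives a position fix.
    public void Recalibrate(Quaternion rawMyoRotation,
                            Vector3 anchorPos, Vector3 observedHandPos)
    {
        Vector3 predicted = rawMyoRotation * Vector3.forward;
        Vector3 observed = observedHandPos - anchorPos;

        // Compare headings in the horizontal plane only.
        float predictedYaw = Mathf.Atan2(predicted.x, predicted.z) * Mathf.Rad2Deg;
        float observedYaw  = Mathf.Atan2(observed.x,  observed.z)  * Mathf.Rad2Deg;
        yawOffset = Mathf.DeltaAngle(predictedYaw, observedYaw);
    }

    // Applied to every raw armband reading between recalibrations.
    public Quaternion Correct(Quaternion rawMyoRotation)
    {
        return Quaternion.AngleAxis(yawOffset, Vector3.up) * rawMyoRotation;
    }
}
```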
The fusion was explored by Will Guest of the Performance Augmentation Lab, using Unity 3D and the Windows 10 APIs. A major hurdle was making asynchronous tasks (built on Microsoft's SDK) work within Unity's non-thread-safe scripting environment. The result is a natural feel and an immersive representation: the lightsabre moves intuitively and, of course, features hyper-realistic sound effects.
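One common pattern for this problem (a sketch of the general technique, not necessarily the implementation used here) is to let asynchronous continuations enqueue their work onto a thread-safe queue, which a MonoBehaviour then drains on Unity's main thread:

```csharp
using System;
using System.Collections.Concurrent;
using UnityEngine;

// Sketch: async callbacks from the Bluetooth/Windows SDK run on worker
// threads, so they enqueue work here instead of touching Unity objects;
// Update() drains the queue on Unity's main thread.
public class MainThreadDispatcher : MonoBehaviour
{
    private static readonly ConcurrentQueue<Action> queue =
        new ConcurrentQueue<Action>();

    // Safe to call from any thread, e.g. an awaited task's continuation.
    public static void Enqueue(Action action) => queue.Enqueue(action);

    void Update()
    {
        // Unity's API may only be used here, on the main thread.
        while (queue.TryDequeue(out var action))
            action();
    }
}
```

A sensor callback would then hand its result over with something like `MainThreadDispatcher.Enqueue(() => hologram.rotation = latestRotation);`, keeping all scene updates on the main thread.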