uSens Brings AI-Based Hand Tracking SDK For Smartphones

With new and exciting VR and AR apps making their way to fame every day, uSens has focused its efforts on bringing its product solutions to wider audiences. Hand tracking is one such effort: with the new SDK, developers will be able to add 3D hand motion recognition and tracking to their smartphone AR applications.

uSens Hand Tracking SDK
Source: uSens

“This opens a whole new world of possibilities for developers, enabling them to create a truly one-of-a-kind experience for a mainstream audience.”

Anli He, Co-Founder & CEO of uSens

uSens revealed the beta version at this year's Augmented World Expo and bills its Hand Tracking SDK as the push that will take Augmented Reality to the next level. The computer vision company achieves this by combining a standard RGB camera with deep learning and computer vision to track not just the fingertips but the complete skeletal dynamics of the hand.
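The article does not describe the SDK's actual API, but consuming per-frame skeletal hand data from a tracker of this kind typically looks something like the sketch below. All type and method names here (HandTracker, HandFrame, Joint, process) are hypothetical placeholders for illustration, not the real uSens SDK surface.

```kotlin
// Hypothetical sketch only: these types are NOT from the uSens SDK; they
// illustrate the general shape of a per-frame RGB-camera hand-tracking API.

data class Joint(val x: Float, val y: Float, val z: Float)          // one skeletal joint in camera space (meters)
data class HandFrame(val joints: List<Joint>, val isLeft: Boolean)  // full skeleton for one tracked hand

interface HandTracker {
    // Feed one RGB camera frame and receive the hands tracked in it.
    fun process(rgbFrame: ByteArray, width: Int, height: Int, timestampNs: Long): List<HandFrame>
}

fun onCameraFrame(tracker: HandTracker, frame: ByteArray, w: Int, h: Int, tNs: Long) {
    val hands = tracker.process(frame, w, h, tNs)
    for (hand in hands) {
        // Because the tracker returns the whole skeleton rather than only fingertips,
        // app logic can be driven from any 3D joint position.
        val fingertip = hand.joints.lastOrNull() ?: continue
        if (fingertip.z < 0.3f) {  // simple "hand reaches toward the camera" check
            println("${if (hand.isLeft) "Left" else "Right"} hand is close to the camera")
        }
    }
}
```

The key point the sketch captures is that the only input is an ordinary RGB frame: the depth and skeletal structure come from the deep-learning model, not from extra hardware.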

Even with Google's and Apple's groundbreaking technologies, AR content has largely been constrained by touchscreen interfaces. What sets uSens apart is that it is working with developers and content creators so that the same experiences can be enjoyed on both low-end and high-end devices, simply by using hand gestures in the air.

Although hand tracking from a single RGB camera may not match the performance of dedicated external hardware, uSens has reached this milestone by relying on machine learning. For now, the breadth of supported use cases compensates for the comparatively smaller field of view.

Since mainstream adoption of AR is still in its infancy, any step toward interface innovation counts as a significant development. Hand-tracking technology opens up new ways of interacting with video games and Augmented Reality apps, and, particularly when paired with the capabilities of ARCore and ARKit, it could raise the existing standards considerably.
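As a rough illustration of that pairing on Android, the sketch below maps a pinch gesture detected by a hand tracker to placing content with ARCore. The pinch screen coordinates are assumed to come from a hand-tracking SDK (not shown here); Session.update, Frame.hitTest, and HitResult.createAnchor are real ARCore calls, while the surrounding wiring is an assumption.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Session

// pinchScreenX/pinchScreenY are assumed to be the 2D screen position of a pinch
// gesture reported by a hand-tracking SDK; ARCore then anchors content there.
fun placeAnchorOnPinch(session: Session, pinchScreenX: Float, pinchScreenY: Float): Anchor? {
    val frame: Frame = session.update()                      // latest ARCore camera frame
    val hits = frame.hitTest(pinchScreenX, pinchScreenY)     // ray-cast into the tracked scene
    return hits.firstOrNull()?.createAnchor()                // anchor content at the first hit, if any
}
```

In this setup ARCore keeps doing what it already does (world tracking and hit testing), while the hand tracker simply replaces the touchscreen tap as the trigger for interaction.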
