Recently, the "rain control" and "flower control" effects on Douyin (TikTok) have become favorites of many young users: a single gesture triggers large-scale special effects in photos and videos. Behind this simple, playful effect is AR technology. Recently, uSens added gesture recognition to its mobile AR solution, opening up more possibilities for mobile AR games.
In an official video released by uSens, a blue butterfly flutters above a desk; when the user reaches out to interact with it, the butterfly follows the motion of the hand. The butterfly then disappears and a flame appears in the palm, likewise tracking the hand as it moves. The hands in the video are real, while the butterfly and flame are computer-generated animations. Thanks to accurate positioning, the hand and the virtual objects blend convincingly, producing a realistic effect.
uSens has long focused on 3D human-computer interaction technology, and gesture recognition is its strong suit. Its Fingo gesture interaction module, released in 2016, uses two infrared cameras for 26-degree-of-freedom (26-DOF) gesture tracking and marker-based position tracking. Embedded in a VR headset, it turns the user's hands into a mouse that can click, drag, and so on. In 2017, uSens released a new version of the Fingo SDK that upgraded the original one-handed operation to support two-handed interaction.
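The Fingo SDK itself is not shown here, but as a rough illustration of what 26 degrees of freedom can mean for a hand, one common decomposition is 6 DOF for the global wrist pose (3 for translation, 3 for rotation) plus 4 joint angles for each of the five fingers. The data structure below is purely hypothetical, not uSens's actual API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch (not the Fingo SDK): one way to represent a
# 26-DOF hand pose as wrist pose + per-finger joint angles.

@dataclass
class HandPose:
    wrist_position: tuple                       # (x, y, z) in metres
    wrist_rotation: tuple                       # (roll, pitch, yaw) in radians
    finger_angles: dict = field(default_factory=dict)  # finger -> 4 joint angles

    def dof_count(self) -> int:
        # 6 DOF for the wrist, plus one DOF per finger joint angle.
        return 6 + sum(len(a) for a in self.finger_angles.values())

pose = HandPose(
    wrist_position=(0.0, 0.1, 0.4),
    wrist_rotation=(0.0, 0.0, 0.0),
    finger_angles={f: [0.0, 0.0, 0.0, 0.0]
                   for f in ("thumb", "index", "middle", "ring", "pinky")},
)
print(pose.dof_count())  # 26
```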
Its gesture recognition technology has been adopted in cooperation with many VR/AR headset manufacturers, and uSens is gradually expanding into sub-areas such as smart home and in-vehicle gesture interaction. Now, mobile AR is also a focus for uSens.
At WWDC in June 2017, Apple announced the ARKit development platform, supporting a range of iPhones and marking a major breakthrough for AR on phones. In August 2017, Google followed with its own AR development tool, ARCore, cancelling the Tango project to concentrate on spreading ARCore across Android. More recently, Huawei announced its self-developed AR Engine, and OPPO has begun adding AR features to its phones. However, because AR effects place real demands on phone performance, Apple's ARKit targets only iOS devices, Google works with a selection of high-end phones for ARCore, and Huawei targets its own handsets.
Last year, in addition to upgrading its gesture recognition technology, uSens saw the potential of SLAM technology on mobile phones. It released a mobile inside-out positional tracking solution based on its SLAM algorithm that supports low-end phones (on both iOS and Android), so that good AR effects are no longer limited to high-end devices. At CES 2018 in the United States, uSens announced an AR camera application for Spreadtrum's entry-level (sub-1,000-yuan) Android platform.
How does uSens achieve this? It chose the path of algorithm middleware. More efficient algorithms and broader compatibility are essential for its mobile and mobile SoC (system-on-chip, the core computing chip in smartphones and smart hardware) customers, so algorithm generality is a key concern for uSens. Apple's ARKit relies on a powerful GPU and, in the latest models, an AI coprocessor; Google needs the top-end Hexagon DSP of Qualcomm's flagship platforms, or even its own custom imaging chip. uSens's algorithm, by contrast, runs on the CPU alone: it works on mainstream CPUs and on SoCs that support camera and IMU synchronization, which covers the vast majority of phones currently on the market. For small and medium chip makers, developing SLAM and gesture interaction in-house means a large investment, a long cycle, and uncertain returns. For these reasons, uSens has become a partner of Intel and Spreadtrum; Spreadtrum's SC9853I chip for the entry-level phone market integrates uSens's AR directly, giving users and developers ARKit/ARCore-like features.
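The requirement that the SoC support camera and IMU synchronization is worth unpacking. In visual-inertial SLAM pipelines generally (this is a toy illustration, not uSens's actual implementation), each camera frame must be aligned with the inertial readings taken at almost the same instant; given hardware-synchronized timestamps, gyroscope data can be interpolated to a frame's exact capture time:

```python
import bisect

def interpolate_imu(imu_times, gyro, frame_time):
    """Linearly interpolate a gyro sample at a camera frame's timestamp.

    imu_times: sorted IMU timestamps (microseconds)
    gyro:      (wx, wy, wz) angular-rate tuples, one per timestamp
    """
    i = bisect.bisect_left(imu_times, frame_time)
    if i == 0:                      # frame earlier than all IMU samples
        return gyro[0]
    if i == len(imu_times):         # frame later than all IMU samples
        return gyro[-1]
    t0, t1 = imu_times[i - 1], imu_times[i]
    a = (frame_time - t0) / (t1 - t0)   # interpolation weight in [0, 1]
    return tuple(g0 + a * (g1 - g0) for g0, g1 in zip(gyro[i - 1], gyro[i]))

# IMU sampled at 200 Hz (every 5000 us); a camera frame arrives at
# t = 12500 us, halfway between two IMU samples.
times = [0, 5000, 10000, 15000, 20000]
rates = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0),
         (3.0, 0.0, 0.0), (4.0, 0.0, 0.0)]
print(interpolate_imu(times, rates, 12500))  # (2.5, 0.0, 0.0)
```

Without synchronized clocks, `frame_time` and `imu_times` drift apart, and the pose estimate degrades; that is why the SoC-level support matters more than raw compute here.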
Now uSens has added gesture recognition to its mobile AR solution. Imagine taking a photo with your favorite cartoon character: instead of just standing side by side, you can reach out and interact with it, which makes picture-taking much more fun.
Of course, photos and videos are only part of what AR can do on a phone. In education, students and parents need only point a mobile device at a picture for a 3D image to appear on screen, making learning more engaging. While traveling, tourists can pull out their phones and get not just real-time navigation but also virtual cartoon guides and on-screen introductions to nearby stores; this digital information directs them quickly and efficiently. When shopping for items of uncertain size for the home, AR can help customers take measurements. In navigation, games, and other fields, AR brings the three-dimensional world to us through the phone.
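The AR measuring use case reduces to simple geometry once the tracking system reports real-world coordinates for points the user taps: the size is just the Euclidean distance between them. A minimal sketch (the point values and sofa scenario below are invented for illustration):

```python
import math

def distance_m(p1, p2):
    """Euclidean distance between two 3D points, in metres."""
    return math.dist(p1, p2)

# Two points an AR framework might report after the user taps
# opposite edges of a sofa (values are made up).
corner_a = (0.00, 0.00, -1.20)
corner_b = (1.80, 0.00, -1.20)
print(f"{distance_m(corner_a, corner_b):.2f} m")  # 1.80 m
```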
From early gesture recognition to mobile SLAM, and now the combination of the two, uSens keeps pushing its technology forward to deliver a better experience. In 2018, uSens plans to complete its strategic transformation from a technology company into a product company, launching a concrete product for each market segment.