Gesture recognition has become a popular human-computer interaction method in recent years. Thanks to its novel capabilities and convenient operation, it has been adopted in many fields, particularly smart cars, where it supports the broader push toward technology and intelligence across the automotive industry.
At present, Audi, Mercedes-Benz, BMW, Volkswagen, Ford, and other well-known OEMs have added gesture recognition to their production or concept vehicles, letting occupants answer calls, adjust the volume, skip songs, control navigation, and perform other operations with different gestures, which is both novel and engaging.
Globally, uSens Ling Sen is one of the few representative companies in the field of gesture recognition, and it has also made major breakthroughs in the trillion-scale automotive market. From June 22 to June 24, uSens presented its Lingzhi gesture recognition solution at the 2018 trade fair.
Some may ask: now that voice interaction technology and products are quite mature, why use gesture recognition? Speech recognition has clear limitations for continuous commands (adjusting the volume, adjusting audio or video playback speed and progress, and so on) and for commands that are hard to quantify, whereas these are easy to control with gestures. Of course, gesture interaction has its own constraints; each modality has its advantages and disadvantages. The best arrangement is for the two to complement each other, with each playing to its strengths in the situations it suits.
The core technology of uSens' Lingzhi in-vehicle gesture recognition solution is skeletal hand tracking with 22 key points and 26 degrees of freedom: deep learning algorithms recognize the skeleton of the hand and enable natural bare-hand interaction. Compared with other companies' "hand control" approaches, joint-level tracking is more accurate and stable.
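The figures quoted here (22 key points, 26 degrees of freedom) are consistent with a classical kinematic hand model: 6 DOF for the global wrist pose plus 4 joint angles per finger. As an illustration only, here is a minimal Python sketch of such a model; the names and layout are assumptions for explanation, not uSens' actual data format or API:

```python
from dataclasses import dataclass

# Hypothetical 26-DOF hand model (illustrative assumption, not uSens' API).
# A common formulation: 6 DOF for global wrist pose (position + orientation)
# plus 4 joint angles per finger, giving 6 + 5 * 4 = 26 degrees of freedom.

GLOBAL_DOF = 6                 # wrist x, y, z + roll, pitch, yaw
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
DOF_PER_FINGER = 4             # base-knuckle flexion + abduction, plus 2 further joints

@dataclass
class HandPose:
    wrist_pose: list           # 6 values: position + orientation
    finger_angles: dict        # finger name -> list of 4 joint angles (radians)

    def total_dof(self) -> int:
        # Count every tracked parameter in the pose.
        return len(self.wrist_pose) + sum(
            len(angles) for angles in self.finger_angles.values()
        )

# A neutral (all-zeros) pose with the assumed layout.
pose = HandPose(
    wrist_pose=[0.0] * GLOBAL_DOF,
    finger_angles={f: [0.0] * DOF_PER_FINGER for f in FINGERS},
)
print(pose.total_dof())  # 26
```

The 22 key points quoted by uSens would correspond to the tracked joint positions of such a skeleton (wrist plus per-finger joints and fingertips); the exact point set is not specified in the source.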
For in-vehicle gesture interaction, uSens LingSense currently supports both a ToF (time-of-flight) depth module and a monocular IR module. Supported scenarios cover interaction by the driver, the front passenger, and rear passengers.
These scenarios include the driver interacting with the central control screen and the HUD, the front passenger interacting with the central control screen, and rear passengers interacting with the rear-seat screens.
The main ways of interaction are:
• Gesture commands: answering calls, adjusting the volume, selecting songs, etc.
• 2D dynamic interaction: operating a 2D graphical interface for entertainment, social, and work applications
• 3D dynamic interaction: operating a 3D graphical interface for entertainment and social applications
Having entered the smart car field early, the team holds a first-mover advantage, and its talent and core technical strength place it at the top of the industry. On cost, uSens says it can offer the lowest module price for comparable performance and features.
At present, in addition to its existing partnership with BaTeng Automotive, uSens has been in discussions with other major OEMs and Tier 1 suppliers and expects further cooperation.
Building on its gesture recognition technology, uSens has expanded into smart cars, VR/AR, smart homes, and other areas. Going forward, its focus is on refining products, polishing the technology, sorting out customer needs, and expanding application scenarios.
Automotive Innovation Technology Forum
Format: Summit Dialogue
Guest: Chen Jingbo, General Manager, Ling Sen Technology China