Passion for Technology
Understanding how gesture control works
This article readout is part of The Quintessence magazine. The current issue explores the latest trends in technology and offers valuable insights into the fascinating world of Human Machine Interfaces. Access it free of charge here: https://library.ebv.com/link/140915/
In this episode, we examine the mechanisms behind gesture control technology, a Human Machine Interface that enables device interaction through body movements without physical contact. We discuss various detection methods, including wearables equipped with motion sensors, camera-based systems utilizing 2D and 3D imaging, infrared sensors that capture thermal radiation, and radar technology capable of detecting minute movements. Each approach offers unique advantages and challenges, contributing to the expanding role of gesture control across industries such as gaming, automotive, and healthcare.
Join us as we explore how these technologies interpret human gestures, enhancing the way we interact with machines in an increasingly touchless world.
Control via gesture
Thumbs up, waving, the open hand as a stop sign – gestures are a natural form of communication for humans. Thanks to significant advancements in sensor technology and artificial intelligence in recent years, it is now possible to control machines and devices through gestures. This development is projected to result in an average annual market growth of 20.6 percent for the gesture recognition industry, leading to an expected market size of 88.3 billion US dollars by 2031.
The breakthrough came with the introduction of Nintendo’s Wii console in 2006 and Microsoft’s Kinect motion control in 2010. Both solutions were developed for the gaming market, and entertainment electronics still dominate the gesture control market today. According to market analysts from Grand View Research, the segment had a revenue share of 59.4 percent in 2022.
However, other industries are also discovering the benefits of gesture control for operating devices and machinery: for example, both the automotive industry and healthcare sector have placed great emphasis on adopting gesture recognition. This technology makes it easy and intuitive for users to interact with computers and other devices. The COVID-19 pandemic has further focused attention on gesture control, as it enables contactless and thus hygienic operation.
Various technologies are used to detect user movements. One option is special wearables, such as bracelets or rings, equipped with motion sensors that capture the rotation rate or acceleration of the wrist. An intelligent algorithm recognises which gesture has been performed and issues the corresponding command.
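To make the wearable approach more concrete, here is a minimal sketch of how a threshold-based classifier for wrist rotation might look; the sensor values, axis, threshold and gesture names are invented for illustration, and real products use considerably more sophisticated models:

```python
import numpy as np

# Hypothetical burst of gyroscope readings (rad/s) around one axis of the
# wrist, captured at a fixed rate by a wearable's inertial sensor.
GYRO_X = np.array([0.1, 0.3, 1.2, 2.8, 3.1, 2.4, 0.9, 0.2, -0.1, 0.0])

def classify_wrist_gesture(gyro_x, threshold=2.0):
    """Toy rule: a rotation-rate peak above the threshold is read as an
    outward wrist rotation, one below the negative threshold as an inward
    rotation; anything else is treated as no gesture."""
    peak = gyro_x[np.argmax(np.abs(gyro_x))]
    if peak > threshold:
        return "rotate_outward"
    if peak < -threshold:
        return "rotate_inward"
    return "no_gesture"

print(classify_wrist_gesture(GYRO_X))  # -> rotate_outward
```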
Another approach is camera-based systems. In principle, 2D cameras can capture and interpret movements. However, the algorithms used have difficulty correctly distinguishing movements in front of the screen, because precise capture of distance – the third dimension – is missing. For this reason, 3D cameras or image sensors are increasingly being used for gesture control. They have become more affordable in recent years and can be integrated into almost any device due to their small size. These systems complement 2D image data with depth information, mostly obtained through Time-of-Flight technology, which measures the travel time of a light pulse reflected by an object to determine the distance to the camera. Today’s image sensors can detect not only general hand movements but even the movements of each individual finger.
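The calculation behind Time-of-Flight is simple: the measured round-trip time of the light pulse is converted into a one-way distance. A short worked example (the 4-nanosecond value is purely illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object: the light pulse travels to the
    object and back, so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A round-trip time of 4 nanoseconds corresponds to roughly 0.6 metres.
print(f"{tof_distance(4e-9):.2f} m")  # -> 0.60 m
```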
However, camera-based systems require adequate lighting to reliably recognise gestures. This problem does not affect infrared sensors: they either detect the infrared radiation emitted by the human body (passive sensors) or, as active sensors, emit infrared radiation themselves and capture the reflection. The corresponding algorithms then analyse the patterns and movements of this radiation. The sensors can also generate a depth image, so a variety of gestures can be recognised depending on the predefined movement patterns and algorithms. Nevertheless, systems based on infrared sensors tend to be more suitable for simple gestures. Since they are relatively cost-effective, they are used in many industrial, consumer and automotive applications.
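As a rough sketch of how such pattern analysis might work on infrared data, the following example tracks the warm region in a hypothetical low-resolution thermal array across a few frames and reads a consistent horizontal shift as a swipe; the array size, temperatures, threshold and gesture names are assumptions made for illustration:

```python
import numpy as np

def frame_with_hand(col):
    """Hypothetical 8x8 thermal frame: ambient background with a warmer
    patch where the hand currently is."""
    frame = np.full((8, 8), 22.0)       # ambient temperature in °C
    frame[3:5, col:col + 2] = 30.0      # warmer pixels covered by the hand
    return frame

# Three consecutive frames in which the hand moves from left to right.
frames = [frame_with_hand(c) for c in (0, 2, 4)]

def detect_swipe(frames, warm_threshold=26.0):
    """Track the horizontal centre of the warm region frame by frame and
    report a swipe if it shifts consistently in one direction."""
    centroids = []
    for frame in frames:
        warm = frame > warm_threshold
        if warm.any():
            centroids.append(np.argwhere(warm)[:, 1].mean())
    if len(centroids) < 2:
        return "no_gesture"
    shift = centroids[-1] - centroids[0]
    if shift > 1.0:
        return "swipe_right"
    if shift < -1.0:
        return "swipe_left"
    return "no_gesture"

print(detect_swipe(frames))  # -> swipe_right
```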
Unaffected by lighting conditions, resistant to contaminants, and with high resolution, radar is increasingly conquering the field of gesture control. Even the smallest movements can be detected by a radar device, with the latest systems offering a resolution of just one millimetre. Radar sensors measure the speed, direction of movement, distance and angular position in real time to detect changes in the position of objects. This makes it possible to track and depict movements of persons or specific motion patterns. And for those who associate radar with the large rotating antennas on ships – the radar sensors needed for gesture recognition fit on a microchip.
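One of the quantities a radar sensor derives is radial speed, obtained from the Doppler shift of the reflected signal. A small worked example, assuming a hypothetical 60 GHz sensor:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def radial_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed of a reflecting object from the Doppler shift of the
    returned signal: v = f_d * c / (2 * f_carrier)."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

# For a hypothetical 60 GHz radar, a Doppler shift of 400 Hz corresponds
# to a hand moving towards or away from the sensor at about 1 m/s.
print(f"{radial_speed(400.0, 60e9):.2f} m/s")  # -> 1.00 m/s
```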
No matter which technology is used for gesture control, one challenge remains: everyone performs gestures in a different manner. This means that the systems must be able to recognise numerous variations of a gesture. Artificial intelligence and machine learning are highly useful in this regard: through complex signal evaluation, gestures can be clearly identified and classified. To process sensor data in real time and achieve the fast response times necessary for device operation, machine learning algorithms are increasingly being executed locally on the chip, close to the sensor itself – typically referred to as the edge.
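As a minimal sketch of the kind of lightweight classification that could run at the edge, the following nearest-centroid model assigns a feature vector extracted from sensor data to the closest gesture prototype; the features, values and labels are invented for illustration, and real systems typically rely on trained neural networks:

```python
import numpy as np

# Hypothetical training examples: short feature vectors summarising a
# gesture (e.g. motion energy, dominant horizontal direction, duration),
# grouped by gesture label. Values are invented for illustration.
TRAINING = {
    "swipe_left":  np.array([[0.8, -1.0, 0.30], [0.9, -0.8, 0.40]]),
    "swipe_right": np.array([[0.8,  1.0, 0.30], [0.7,  0.9, 0.35]]),
    "push":        np.array([[1.2,  0.0, 0.20], [1.1,  0.1, 0.25]]),
}

# Each gesture is represented by the average of its examples, which gives
# the classifier some tolerance for person-to-person variation. The model
# is tiny, so it could comfortably run on an edge device next to the sensor.
CENTROIDS = {label: samples.mean(axis=0) for label, samples in TRAINING.items()}

def classify(features: np.ndarray) -> str:
    """Assign a new feature vector to the gesture with the closest centroid."""
    distances = {label: float(np.linalg.norm(features - c))
                 for label, c in CENTROIDS.items()}
    return min(distances, key=distances.get)

print(classify(np.array([0.85, 0.95, 0.30])))  # -> swipe_right
```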