The computer interface has taken many forms, including keyboards, mice, trackballs, styluses, and touch screens. Now, virtual and augmented reality are among the new interfaces entering the consumer market. Another emerging interface likely to grow in availability and popularity is gesture control combined with ubiquitous display.
Gesture control, also called the natural user interface, is unique in that it requires no equipment other than the user's hands and fingers to navigate with air-touch gestures, such as pointing to make selections and sweeping to turn pages. Ubiquitous display technology allows almost any surface, such as a wall, floor, ceiling or table, even a textured surface, to become the background for a projected computer interface and air-touch interaction.
Applications for gesture control combined with ubiquitous display are virtually limitless, because they can appear anywhere a computer and a projector can be installed. Possible applications include interactive catalogs, games, presentations, maps, product designs, educational lessons, smart home controls, virtual pianos or synthesizers, and more.
iNTERPLAY, developed by Taiwan’s Industrial Technology Research Institute (ITRI) and available for licensing later this year, is one example of a gesture-control interface integrating ubiquitous display technology. One of the most interesting, fun and practical applications of iNTERPLAY, especially for those with limited living space, is a virtual piano for playing and for learning to play the piano. As shown in the photograph below, the keyboard is projected on a table or, optionally, on other surfaces, and the player’s key selections are instantaneously tracked, mapped to the corresponding note and rhythm, and converted to sound. The application can also project the musical score and indicate which keys to press to play a piece. With this application, users can even bring their iNTERPLAY piano to events, and those with limited living space can no longer make the excuse that they don’t have room for a piano. Other potential applications of iNTERPLAY include a smart identification and advertisement system, and a bookstore search application.
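The core of such a virtual piano is mapping a tracked touch position on the projected keyboard to a note. The sketch below is a minimal illustration of that idea; the key width, keyboard origin, and note-naming scheme are assumptions for the example, not details of ITRI's actual iNTERPLAY implementation.

```python
# Illustrative sketch: map a touch x-coordinate (in mm, along the
# projected keyboard) to a white-key note name. Key geometry here is
# an assumption (roughly standard 23.5 mm white keys), not iNTERPLAY's.

NOTE_NAMES = ["C", "D", "E", "F", "G", "A", "B"]

def touch_to_note(x_mm, keyboard_origin_mm=0.0, key_width_mm=23.5, octave_start=4):
    """Return the note name for a touch on the projected white keys."""
    key_index = int((x_mm - keyboard_origin_mm) // key_width_mm)
    if key_index < 0:
        raise ValueError("touch is left of the projected keyboard")
    octave = octave_start + key_index // 7
    return f"{NOTE_NAMES[key_index % 7]}{octave}"

print(touch_to_note(10.0))   # first white key -> C4
print(touch_to_note(170.0))  # eighth white key -> C5
```

A full system would also track touch timing to recover rhythm and route each note to a synthesizer, but the geometric mapping above is the essential step.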
Gesture-control and ubiquitous-display systems combine a 3D depth camera and a high-performance gesture-recognition algorithm with image projection to offer an innovative user-machine interface. Some systems support multi-touch, which distinguishes input based on how many fingers are used, as on a computer touchpad, and adjusts its behavior and tasks accordingly. Others include object recognition and 3D-scanning functions, which enable annotations and other interactions on a physical object, such as an advertisement draft page, or on a scanned virtual object, such as a design prototype or a product in an online catalog.
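The multi-touch behavior described above amounts to dispatching on finger count. The sketch below shows the pattern; the particular gesture-to-action pairings are illustrative assumptions, not a documented iNTERPLAY mapping.

```python
# Illustrative sketch: adjust behavior based on how many fingers
# touched the surface, touchpad-style. The action names are assumed
# examples, not a specification of any particular system.

def dispatch_gesture(finger_count):
    """Return the action associated with a given number of fingers."""
    actions = {
        1: "select",     # single-finger point: select an item
        2: "scroll",     # two-finger drag: scroll content
        3: "page-turn",  # three-finger sweep: turn the page
    }
    # Unrecognized counts (e.g., a resting palm) are ignored.
    return actions.get(finger_count, "ignore")

print(dispatch_gesture(1))  # -> select
print(dispatch_gesture(3))  # -> page-turn
```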
Advanced gesture-control systems feature touch/tap detection accuracy within 5 millimeters, and auto-adjust detection for thin or thick fingers, light or heavy touch, and different finger orientations on the touch surface. The more the system learns about how people use their fingers to interact with the surface and objects, the more accurately it performs, and the better it accommodates multiple users interacting with the same surface simultaneously.
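Conceptually, depth-based touch detection reduces to checking whether a fingertip sits within a small threshold of the surface plane, and the auto-adjustment described above amounts to adapting that threshold per user. The sketch below is a simplified illustration under those assumptions; the threshold logic and adaptation rule are invented for the example.

```python
# Illustrative sketch: depth-based touch detection with a ~5 mm
# threshold, plus a simple per-user threshold adaptation. Both the
# threshold handling and the adaptation rule are assumptions for
# illustration, not a real system's algorithm.

def is_touch(fingertip_depth_mm, surface_depth_mm, threshold_mm=5.0):
    """A fingertip within threshold_mm of the surface plane counts as a touch."""
    return abs(surface_depth_mm - fingertip_depth_mm) <= threshold_mm

def calibrated_threshold(base_mm, observed_offsets_mm):
    """Adapt the threshold from a user's observed touch offsets.

    Uses the mean observed fingertip-to-surface offset plus a 1 mm
    margin, never dropping below the base threshold.
    """
    if not observed_offsets_mm:
        return base_mm
    mean_offset = sum(observed_offsets_mm) / len(observed_offsets_mm)
    return max(base_mm, mean_offset + 1.0)

print(is_touch(998.0, 1000.0))            # 2 mm away -> True
print(is_touch(990.0, 1000.0))            # 10 mm away -> False
print(calibrated_threshold(5.0, [6.0, 8.0]))  # heavy-offset user -> 8.0
```

Maintaining one calibrated threshold per tracked hand is one plausible way to let several users share the same surface simultaneously.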
Although integrated-gesture control and ubiquitous-display systems are not common yet, research institutions such as ITRI are working to bring this marriage of technologies to life and change the way users interact with information.
Te-Mei Wang joined Taiwan’s Industrial Technology Research Institute (ITRI) Electronic and Optoelectronic System Research Laboratories in 2011. She is now the manager of the 3D Interaction System Department of the Intelligent Vision System Division of ITRI. The 3D Interaction System Department’s research includes 3D gesture recognition, 3D object recognition and depth sensing for 3D interaction systems.