
Jenifer Miehlbradt demonstrates the torso strategy developed at EPFL. Credit: EPFL / Alain Herzog
The classic joystick method of piloting drones may soon be replaced by new technology that allows users to steer unmanned aerial vehicles using just the motion of their bodies.
Ecole Polytechnique Fédérale de Lausanne (EPFL) researchers have developed a system using wearable markers that allows users to pilot a drone with movements of their torso, which could open the door to a number of new applications for piloting drones or even planes of the future.
“Our aim was to design a control method which would be easy to learn and therefore require less mental focus from the users so that they can focus on more important issues, like search and rescue,” lead author Jenifer Miehlbradt of EPFL’s Translational Neuroengineering Laboratory, said in a statement. “Using your torso really gives you the feeling that you are actually flying. Joysticks, on the other hand, are of simple design but mastering their use to precisely control distant objects can be challenging.”
The researchers began by monitoring the body movements of 17 individuals, using 19 markers placed on their upper bodies, along with their muscular activity. Each volunteer followed the actions of a virtual drone through simulated landscapes viewed through virtual reality goggles.
The scientists identified recurring motion patterns and quickly established torso-based strategies for piloting drones. The team found that only four markers, all located on the torso, were needed to effectively pilot flight simulators and real drones through a circuit of obstacles.
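To get a feel for how such a data-driven reduction might work, consider the sketch below. It uses principal component analysis (PCA) over recorded marker trajectories to expose the dominant motion patterns and rank markers by how much they contribute to them. The array shapes, the random stand-in data, and the use of scikit-learn are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

N_MARKERS = 19   # upper-body markers used in the recording sessions
N_FRAMES = 5000  # motion-capture frames (hypothetical)

# Each frame flattens the 3-D positions of all markers: (x, y, z) x 19.
# Random data stands in for real recordings in this sketch.
rng = np.random.default_rng(0)
marker_data = rng.normal(size=(N_FRAMES, N_MARKERS * 3))

# Fit PCA to find the few motion patterns that explain most variance.
pca = PCA(n_components=4)
pca.fit(marker_data)

# Rank markers by how strongly they load onto the leading components;
# the markers with the largest loadings are the best candidates to keep.
loadings = pca.components_.reshape(4, N_MARKERS, 3)
marker_weight = np.abs(loadings).sum(axis=(0, 2))
top_markers = np.argsort(marker_weight)[::-1][:4]

print("Most informative markers:", top_markers)
print("Variance explained:", pca.explained_variance_ratio_.sum())
```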
Overall, the scientists compared their torso strategies with joystick control in 39 individuals and found that torso control outperformed the joystick in precision and reliability after only minimal training.
“Data analysis allowed us to develop a very simple and intuitive approach which could also be used with other populations, machines, and operations,” Bertarelli Foundation Chair Silvestro Micera said in a statement. “The approach significantly improves the teleoperation of robots with non-human mechanical attributes.”
Most current robotic control interfaces rely on mappings between the operator’s and the robot’s actions that are determined by the design and characteristics of the interface. These mappings can prove challenging for the operator to master.
“The accurate teleoperation of robotic devices requires simple, yet intuitive and reliable control interfaces,” the study states. “However, current human–machine interfaces [HMIs] often fail to fulfill these characteristics, leading to systems requiring an intensive practice to reach a sufficient operation expertise.
“Here, we present a systematic methodology to identify the spontaneous gesture-based interaction strategies of naive individuals with a distant device, and to exploit this information to develop a data-driven body–machine interface [BoMI] to efficiently control this device. We applied this approach to the specific case of drone steering and derived a simple control method relying on upper-body motion.”
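The quoted passage describes deriving a simple control method from upper-body motion. A hedged illustration of what such a torso-to-drone mapping might look like is sketched below: torso tilt on two axes is scaled into pitch and roll commands, with a small dead zone to filter out involuntary movement. The gain values, type names, and axis assignments are assumptions for the sketch; the study derives its actual mapping from the recorded data.

```python
from dataclasses import dataclass

@dataclass
class TorsoPose:
    lean_forward: float  # forward/backward torso tilt, radians
    lean_side: float     # left/right torso tilt, radians

@dataclass
class DroneCommand:
    pitch: float
    roll: float

PITCH_GAIN = 1.5  # hypothetical scaling from tilt to command
ROLL_GAIN = 1.5
DEADZONE = 0.05   # ignore tiny involuntary movements, radians

def torso_to_command(pose: TorsoPose) -> DroneCommand:
    """Map torso tilt to pitch/roll commands with a small dead zone."""
    def shape(tilt: float, gain: float) -> float:
        return 0.0 if abs(tilt) < DEADZONE else gain * tilt
    return DroneCommand(
        pitch=shape(pose.lean_forward, PITCH_GAIN),
        roll=shape(pose.lean_side, ROLL_GAIN),
    )

# Example: a gentle forward-left lean becomes a pitch/roll command.
print(torso_to_command(TorsoPose(lean_forward=0.2, lean_side=-0.1)))
```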
The researchers next plan to make the torso strategy completely wearable for piloting flying objects, replacing the body markers and external motion detectors the technology currently requires.
The study was published in the Proceedings of the National Academy of Sciences of the United States of America.