Research & Development World

New Interface Helps Users Operate Robots More Effectively

By Kenny Walter | April 26, 2017

A new, simpler and more efficient interface for controlling robots allows laypeople to operate them without significant training time.

Traditionally, robots are controlled from a computer screen with a mouse using a ring-and-arrow system that is often difficult to operate and error-prone. In the new system, designed by researchers at the Georgia Institute of Technology, the user simply points and clicks on an item and then chooses a specific grasp, and the robot moves into position to pick up the chosen object.

“Instead of a series of rotations, lowering and raising arrows, adjusting the grip and guessing the correct depth of field, we’ve shortened the process to just two clicks,” Sonia Chernova, the Georgia Tech assistant professor in robotics and advisor to the research team, said in a statement.
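
To make that two-click workflow concrete, the sketch below shows, in hypothetical Python, what such an interaction might look like: the first click selects the object in the camera image, the system proposes a few candidate grasps, and the second click picks one. The names (propose_grasps, execute_grasp, two_click_pick) and the stubbed logic are illustrative assumptions, not the Georgia Tech team's actual software.

    # Hypothetical two-click "point and grasp" workflow (illustrative only).
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Grasp:
        """A candidate gripper pose: position (x, y, z) in metres plus approach angle."""
        position: Tuple[float, float, float]
        approach_deg: float

    def propose_grasps(clicked_pixel: Tuple[int, int]) -> List[Grasp]:
        """Stand-in for the perception step: given the pixel the user clicked,
        return candidate grasps for the object under that pixel."""
        # A real system would derive these from the object's 3D surface geometry.
        return [
            Grasp(position=(0.42, 0.10, 0.08), approach_deg=0.0),   # top-down grasp
            Grasp(position=(0.42, 0.10, 0.12), approach_deg=90.0),  # side grasp
        ]

    def execute_grasp(grasp: Grasp) -> None:
        """Stand-in for commanding the robot arm to the chosen gripper pose."""
        print(f"Moving gripper to {grasp.position} at {grasp.approach_deg} degrees")

    def two_click_pick(clicked_pixel: Tuple[int, int], chosen_index: int) -> None:
        """Click 1 selects the object; click 2 selects one of the proposed grasps."""
        candidates = propose_grasps(clicked_pixel)
        execute_grasp(candidates[chosen_index])

    if __name__ == "__main__":
        # e.g., the user clicks the medicine bottle at pixel (320, 240),
        # then picks the top-down grasp (index 0).
        two_click_pick((320, 240), 0)

In a sketch like this the user never rotates virtual dials or guesses a depth of field; the only decisions are which object to pick and which of the proposed grasps to use.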

The researchers had college students test both the traditional method and the new point-and-click program, and found that the new system produced significantly fewer errors and allowed participants to complete tasks more quickly and reliably. Each student used both methods for two minutes and averaged only one mistake per task with the point-and-click method, compared with nearly four mistakes per task using the traditional method.

“Roboticists design machines for specific tasks, then often turn them over to people who know less about how to control them,” David Kent, the Georgia Tech Ph.D. robotics student who led the project, said in a statement. “Most people would have a hard time turning virtual dials if they needed a robot to grab their medicine. But pointing and clicking on the bottle? That’s much easier.”

The traditional method is cumbersome because it requires two screens so the user can adjust the virtual gripper and command the robot exactly where to go and what to grab. While this gives the user maximum control and flexibility, the size of the workspace can be burdensome and contribute to an increased number of errors.

The new method, by contrast, does away with 3D mapping. It provides only a single camera view and lets the robot's perception algorithm analyze an object's 3D surface geometry to determine where the gripper should be placed.

“The robot can analyze the geometry of shapes, including making assumptions about small regions where the camera can’t see, such as the back of a bottle,” Chernova said. “Our brains do this on their own — we correctly predict that the back of a bottle cap is as round as what we can see in the front.

“In this work, we are leveraging the robot’s ability to do the same thing to make it possible to simply tell the robot which object you want to be picked up,” she added.  
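
As a rough illustration of that idea, the following hypothetical Python snippet completes the unseen back of a round object by assuming symmetry: the visible front width of a bottle cap is taken as its full diameter, which is then checked against an assumed gripper opening. The constant and helper names are assumptions made for this example, not the researchers' actual perception algorithm.

    # Hypothetical single-view geometry completion via a symmetry assumption.
    from typing import List, Tuple

    GRIPPER_MAX_OPENING_M = 0.085  # assumed parallel-gripper opening, in metres

    def estimate_diameter(visible_edge_points: List[Tuple[float, float]]) -> float:
        """Assume the object is round and the camera sees its front half,
        so the visible width equals the full diameter."""
        xs = [x for x, _ in visible_edge_points]
        return max(xs) - min(xs)

    def gripper_fits(diameter_m: float) -> bool:
        """Can the gripper close around the inferred diameter?"""
        return diameter_m <= GRIPPER_MAX_OPENING_M

    if __name__ == "__main__":
        # Front-facing edge points of a bottle cap, in metres (camera frame).
        cap_points = [(-0.015, 0.0), (-0.010, 0.011), (0.0, 0.015),
                      (0.010, 0.011), (0.015, 0.0)]
        d = estimate_diameter(cap_points)
        print(f"Inferred cap diameter: {d:.3f} m; graspable: {gripper_fits(d)}")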

 
