Soft body locomotion
Computer-generated characters have become so lifelike in appearance and movement that the line separating reality from fiction is almost imperceptible at times. “The Matrix” sequels messed with audiences’ perceptions of the real world (in more ways than one) with action scenes mixing CG characters and real actors. Almost a decade later, superheroes and blue aliens dominate the multiplex. But while bipeds and quadrupeds have reigned supreme in CG animation, attempts to create and control their skeleton-free cousins using similar techniques have proved time-consuming and laborious.
Georgia Tech researchers have found a possible solution to this challenge by developing a way to simulate and control the movement of computer-generated characters that have no skeletal structure: anything from starfish and earthworms to an elephant’s trunk or the human tongue.
Their modeling techniques have the potential to give amateur animators and even young children unparalleled control of digital creatures, letting them simply point and click on a screen to make the creatures move the way they want. One can imagine aspiring animators soon having the tools to build a more boisterous Bob, the resident blob of “Monsters vs. Aliens,” or an updated Ursula from “The Little Mermaid,” her sinister tentacles used to full effect with computer graphics.
The researchers’ work targets the simulation and control of soft body locomotion (the movement of characters without a skeletal structure), something rarely explored in animation, according to Karen Liu, one of the researchers and an associate professor in the School of Interactive Computing at Georgia Tech.
Eschewing the traditional use of skeletons with moving joints as the basis for animation control, the Georgia Tech research simulates soft body computer models and controls their movement directly, without an underlying joint hierarchy. Liu and fellow researchers Jie Tan and Greg Turk will present their research paper “Soft Body Locomotion” at SIGGRAPH 2012, the ACM international conference on computer graphics and interactive techniques, in Los Angeles, Aug. 5-9.
The computer models used in the research, jello-like alphabet letters, mimicked nature’s soft body organisms and were created using “muscle fibers to control a volume-preserving finite element mesh.” In short, just as a hacky sack or bean bag keeps the same volume no matter how it is squashed, the computer models preserve their volume as they deform.
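For readers curious what “volume-preserving” looks like in practice, here is a minimal, hypothetical Python sketch (not taken from the paper) that measures how far a single tetrahedral element of a finite element mesh has strayed from its rest volume; a volume-preserving solver would drive such a penalty toward zero for every element in the mesh.

    import numpy as np

    def tet_volume(verts):
        # Signed volume of a tetrahedron from its four corner points (4x3 array).
        a, b, c, d = verts
        return np.dot(np.cross(b - a, c - a), d - a) / 6.0

    def volume_penalty(rest_verts, deformed_verts, stiffness=1.0):
        # Quadratic penalty that grows as the element departs from its rest volume.
        v0 = tet_volume(rest_verts)
        v = tet_volume(deformed_verts)
        return 0.5 * stiffness * (v - v0) ** 2

    # Toy example: squash a unit tetrahedron to half its height.
    rest = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
    squashed = rest * np.array([1.0, 1.0, 0.5])
    print(volume_penalty(rest, squashed))  # nonzero: the squash changed the volume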
The soft body ABCs were able to perform a wide array of motions that users specified with simple point-and-click commands. The researchers developed algorithms that accept “high-level goals”: specific movements such as walking from one point to another, or jumping and then regaining balance. Before this technique, getting soft body characters to perform meaningful movement might have required thousands of computer simulation trials just to bring the soft body close to a functional motion, Liu says.
“In this project we ‘physically simulated’ or created lifelike movements in the soft body models that don’t require much user intervention. We’ve built a framework where the user or the animator can just click on a point of the soft body and direct the type of movement he or she wants.”
Jie Tan, a Ph.D. candidate in Computer Science, took on primary animation duties and implemented the muscle types that produce different motions in the models. Users need only pick the muscles for their creatures and the movement they want, then watch as the algorithm determines the muscle forces needed to fulfill the action.
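As a rough illustration of the kind of computation involved, the hypothetical Python sketch below (not the researchers’ actual algorithm) treats the physics as a black-box forward model and searches for the muscle activations that move a toy body toward a clicked target; the paper’s real controller works on a full soft body simulation.

    import numpy as np
    from scipy.optimize import minimize

    # Toy stand-in for the simulator: each muscle pulls the body's center of
    # mass along a fixed direction, in proportion to its activation.
    MUSCLE_DIRECTIONS = np.array([[1.0, 0.0], [0.0, 1.0], [-0.7, 0.7]])

    def simulate(activations):
        # Hypothetical forward model: muscle activations -> resulting displacement.
        return MUSCLE_DIRECTIONS.T @ activations

    def objective(activations, target):
        # High-level goal: end up at the clicked target, with as little effort as possible.
        error = simulate(activations) - target
        return error @ error + 0.01 * activations @ activations

    target = np.array([2.0, 1.0])               # the point the user clicked
    result = minimize(objective, np.zeros(3), args=(target,),
                      bounds=[(0.0, 5.0)] * 3)  # muscles can only pull, up to a limit
    print(result.x)                             # activations chosen by the solver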
Greg Turk, a professor in the School of Interactive Computing, says the techniques could become an important part of an animator’s toolbox for creating graphics-based characters that need to be more flexible or bendable.
Central to the research was working out how soft body characters maintain balance while they move. Characters with skeletal support can rely on relatively unchanging contact points with the ground to maintain balance (foot size doesn’t change), but soft body characters without legs might have to lengthen their bodies and slide (expanding surface contact) or jump (breaking surface contact). The researchers actively exploited these different types of contact strategies to achieve control goals, including balance.
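A hypothetical sketch of that reasoning (again in Python, and not drawn from the paper): a soft body is statically balanced only while its center of mass projects inside its ground-contact region, so a controller can respond to a failing balance test by expanding or breaking contact.

    def com_inside_support(contact_xs, com_x):
        # Static balance test along one axis: the center of mass must project
        # inside the span of the ground-contact points.
        return min(contact_xs) <= com_x <= max(contact_xs)

    def choose_strategy(contact_xs, com_x):
        # Toy decision rule, not the paper's controller.
        if com_inside_support(contact_xs, com_x):
            return "shift weight within the current contact region"
        return "lengthen the body to expand contact, or break contact and hop"

    print(choose_strategy([0.0, 0.4], com_x=0.9))  # the body is tipping past its contact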
“We believe this research contribution is one that can apply broadly to other problems in animation control,” Liu says.
A video of the researchers’ soft body models in action can be found here: http://www.cc.gatech.edu/~jtan34/project/softBodyLocomotion.html
Source: Georgia Institute of Technology