Cutting-edge software gives soldiers the training they need before heading to the real “playing field.”
Battle simulations, in some form or another, have long been practiced. Their humble yet important beginnings can be traced back to as early as 1,000 B.C. and the creation of Wei Hai—a strategy-based game invented by Art of War author Sun Tzu—which used simple colored stones to depict armies and their movements. Around the same period, Chaturanga, a four-sided board game from India, used a roll of the dice to determine each player’s next maneuver.
In the 1600s, the German Koenigspiel, a type of “war chess” game, introduced some of the underlying characteristics of modern military modeling and simulation (M&S) routines: a single piece signifying a unit, terrain representation, rules for movement and conflict resolution, and reliance on assumptions.
Today, these playing boards have been replaced by carefully rendered, computer-generated “playing fields,” with multiprocessors—not dice—now dictating the next strategic move. For its part, the U.S. government is taking stock, actively enlisting some of the premier computer generated image (CGI) software companies to produce its next wargame.
It pays to train virtually
According to Frost & Sullivan, the U.S. military simulation and training market totaled $3.73 billion in 2003. Why this focus? Quite simply, M&S is a very cost-effective way to train.
“Simulation saves the Army a lot of money in fuel, ammunition, maintenance, and environmental costs,” says Michael Waldier, Project Site Manager of the U.S. Army’s Battle Projection Center, 85th Division, Arlington Heights, Ill. “There is also a safety aspect. While the Army is very safety conscious and requires risk assessments to be made for all operations, including training, there is always the possibility of accidents occurring when you place large numbers of soldiers and equipment in the field.”
Another reason is the widely accepted concept that “experiential learning” is the most effective way to learn. When being taught, people theoretically retain 20% of what they hear, 40% of what they see, 80% of what they actually do—and over 90% of what they simultaneously see, hear, and do.
Additionally, the entertainment aspects of good M&S programs draw in users and keep them engaged in the learning experience. “The number of military personnel who are members of the ‘video gaming’ generation is constantly growing,” says David Versaw, CFO at WILL Interactive, Inc., Potomac, Md. “Young soldiers (and officers) grew up playing video games and watching movies. To prepare soldiers to prevail in conflict, you need to recognize and leverage their strengths.”
What’s out there?
In recent years, private companies have been teaming up with different military branches to create realistic and thorough training simulations.
With funding provided by the U.S. Army Research, Development, and Engineering Command and the Joint Advanced Distributed Learning Co-Laboratory, Orlando, Fla., the simulation products developed by Engineering & Computer Simulations, Inc. (ECS), Orlando, are built around a licensed commercial game engine. “All of the assets inside of our simulations are tagged with meta-data to allow them to be stored inside a content repository,” says Brent Smith, CTO at ECS. “These not only include models and textures, but also animation cycles and simulated behaviors for all of our characters and equipment models. This allows our military to re-use and re-purpose any assets that may be needed in future applications.”
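The repository scheme Smith describes can be sketched in a few lines. The class and field names below are hypothetical, shown only to illustrate how metadata tags let models, textures, animation cycles, and behaviors be stored and retrieved for reuse:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of tagging simulation assets with metadata so they
# can be indexed in a content repository and re-used in later applications.
@dataclass
class SimAsset:
    asset_id: str
    kind: str                              # e.g. "model", "texture", "animation"
    tags: list = field(default_factory=list)

class ContentRepository:
    def __init__(self):
        self._assets = {}

    def store(self, asset: SimAsset):
        self._assets[asset.asset_id] = asset

    def find(self, kind=None, tag=None):
        """Return every asset matching the given kind and/or tag."""
        return [a for a in self._assets.values()
                if (kind is None or a.kind == kind)
                and (tag is None or tag in a.tags)]
```

A later project could then call, say, `repo.find(tag="vehicle")` to pull every vehicle-related asset rather than rebuilding it from scratch.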
Military, medical, and general industries are already using game-based training and simulation software from BreakAway, Ltd., Hunt Valley, Md. One of the company’s many products is NetStrike, a CGI-based intelligence fusion simulation created for the Dept. of Defense, based on BreakAway’s white paper “Unified Theory for Modeling the Revolution in Military Affairs.”
Longtime military contractor Lockheed Martin, Bethesda, Md., is also participating in the M&S arena with NxTrain, its Next Generation Training Solution. NxTrain allows team members to recognize, interpret, and make decisions quickly and correctly in complex and dynamic situations. The training focuses on simulated learning scenarios that provide experience and performance feedback in the cognitive skills needed to perform mission operations.
“The one aspect that differentiates our simulations from all others is scale,” says Wayne Civinskas, Manager of the Advanced Simulation Centers for Lockheed Martin Simulation, Training, and Support. “Unlike most games, which tend to confine the player to a small area, our simulations represent vast, complex spaces based on real-world data. Our goal is to be able to represent multiple theaters simultaneously; ultimately representing the entire earth at a level of detail suitable for training our future military command.”
In contrast to CGI-based software is the Virtual Experience Immersive Learning Simulations (VEILS) products from WILL Interactive, which use live-action video instead of animation. The platform allows users to become the lead character(s) in interactive movies in which they face real-life situations under real-life pressures, make real-life decisions, and experience real-life consequences.
“A unique ‘slice-of-life’ context is added to create a realistic computer-based learning experience,” says WILL’s Versaw. “VEILS then branches to different pathways and outcomes based on user decisions,” similar to the choose-your-own-adventure book series.
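The branching structure Versaw describes amounts to a decision graph: each user choice selects the next scene and, eventually, an outcome. A minimal sketch follows, with scene names and choices invented purely for illustration:

```python
# Toy branching scenario in the choose-your-own-adventure style described
# in the article. All scene names and choices here are hypothetical.
scenario = {
    "checkpoint": {
        "prompt": "A vehicle approaches the checkpoint at speed.",
        "choices": {"signal_stop": "vehicle_halts", "open_fire": "escalation"},
    },
    "vehicle_halts": {"prompt": "Driver complies; search proceeds.", "choices": {}},
    "escalation":    {"prompt": "Incident escalates; mission review.", "choices": {}},
}

def play(scenario, start, decisions):
    """Follow the user's decisions through the scene graph and return
    the sequence of scenes (the pathway) they experienced."""
    path = [start]
    scene = start
    for choice in decisions:
        nxt = scenario[scene]["choices"].get(choice)
        if nxt is None:          # no such branch from this scene: stop
            break
        path.append(nxt)
        scene = nxt
    return path
```

Different decision sequences yield different pathways and consequences, which is the core of the VEILS-style experience the article describes.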
WILL’s offerings include Gator Six, created for the U.S. Army, which helps improve artillery battery commanders’ critical thinking, judgment, agility, adaptability, and decision-making skills, while its Level III Anti Terrorism / Force Protection, developed for the Joint Chiefs of Staff, is used to train commanders in all services.
To properly immerse trainees in a realistic simulation, detailed, timely, and accurate pictures of the battlefield are required. Many sources of information are needed to build a “picture”—archival data, roadmaps, geographical information, and databases, all of which are static. Sensor information from mobile agents at different times and locations is also required, since the scene itself varies, with people and objects constantly moving. So, time itself must also be a component in battlefield visualizations. All of these factors give rise to the need to generate 4-D models—three dimensions determining space and the fourth giving time.

Enter the Visualization MURI, a Multidisciplinary University Research Initiative (VisMURI), which aims to “develop the methodologies and tools required for the design and analysis of time-critical visualization systems, using augmented reality and 4-D dynamic models of the environment.” Researchers from the Univ. of California (UC), Berkeley; the Univ. of Southern California, Los Angeles; Georgia Institute of Technology, Atlanta; Syracuse Univ., N.Y.; and the Univ. of California, Santa Cruz, are jointly working toward this goal.

Led by UC Berkeley’s Avideh Zakhor, a professor of electrical engineering, the team has already created a data acquisition system and post-processing software that use collected data to build fast, automated, photo-realistic 3-D models of urban environments. However, the VisMURI modeling system differs from existing approaches in a few ways. For instance, most commercial systems use airborne imagery to create 3-D models of cities, with human operators then matching features between two images by clicking on the point in each image that corresponds to the same spot. But this approach is slow, not cost effective, and does not result in a precise or accurate ground-based model of the city.
Modeling through the use of passive devices, such as cameras, is also plagued by the same problems as airborne matching approaches—in that it is too labor intensive.

“Our system, as far as I know, is the first one that is fast in acquisition,” says Zakhor. “This is because we do ‘drive-by scanning’ and do not acquire the data in a stop-and-go fashion. Also, the processing of the acquired data is done entirely by laser scanners instead of cameras to compute 3-D depth. Furthermore, because we also take pictures for texture both from ground and airborne, our approach results in photorealistic models that you can then walk, drive, or fly through.

“The biggest challenge in all of these is to register all the imagery and laser data from both ground and airborne sources together so everything ‘stitches’ together the right way. So far, we’ve managed to do this without the use of any GPS (global positioning systems).”
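The 4-D idea in the sidebar—three spatial dimensions plus time—can be illustrated with a toy model: static geometry plus moving objects whose positions are time-stamped samples. The class below is an assumption for illustration only (simple linear interpolation between samples), not the VisMURI team’s actual method:

```python
import bisect

# Toy 4-D model: a moving object's (x, y, z) position is sampled at
# known timestamps; querying an arbitrary time linearly interpolates.
# This interpolation scheme is an illustrative assumption.
class DynamicObject:
    def __init__(self):
        self.times = []      # sorted sample timestamps
        self.positions = []  # (x, y, z) tuple at each timestamp

    def add_sample(self, t, pos):
        i = bisect.bisect(self.times, t)
        self.times.insert(i, t)
        self.positions.insert(i, pos)

    def position_at(self, t):
        """Estimate the object's position at time t by interpolating
        between the two nearest time-stamped samples."""
        if t <= self.times[0]:
            return self.positions[0]
        if t >= self.times[-1]:
            return self.positions[-1]
        i = bisect.bisect(self.times, t)
        t0, t1 = self.times[i - 1], self.times[i]
        w = (t - t0) / (t1 - t0)
        return tuple(a + w * (b - a)
                     for a, b in zip(self.positions[i - 1], self.positions[i]))
```

A battlefield visualization would hold many such objects alongside the static terrain, letting a viewer scrub the fourth dimension—time—to see how the scene evolves.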
How’d they do that?
To create highly detailed simulations with realistic terrain and sound effects, M&S designers must turn to different tools and their own talent pool for new ways to create these effects (with some researchers even looking to 4-D distributed modeling and visualization—see sidebar).
Most M&S companies use commercial off-the-shelf (COTS) tools, such as 3D Studio Max, Maya, and SoftImage. These tools generate realistic models, environments, and animation cycles for characters and vehicles.
“To develop a realistic terrain skin, you need to use a package that is more specific to terrain generation, such as Terrex,” says ECS’s Smith. “This type of package can bring in government-supplied digital terrain elevation data or digital elevation map files to create accurate, geo-specific terrains of real-world locations.”
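The terrain-skin step Smith describes—turning a regular grid of elevation posts, as digital terrain elevation data files supply, into renderable geometry—can be sketched as follows. This is not Terrex; the function and the post-spacing value are illustrative assumptions:

```python
# Illustrative sketch: convert a regular grid of elevation posts (meters)
# into a triangle mesh a game engine could render as a terrain skin.
# The 30 m default post spacing is an assumption, not a Terrex setting.
def grid_to_mesh(elevations, post_spacing=30.0):
    """elevations: 2-D list of heights on a regular grid.
    Returns (vertices, triangles): vertices are (x, y, z) tuples and
    triangles are vertex-index triples, two triangles per grid cell."""
    rows, cols = len(elevations), len(elevations[0])
    vertices = [(c * post_spacing, r * post_spacing, elevations[r][c])
                for r in range(rows) for c in range(cols)]
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))             # upper-left
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower-right
    return vertices, triangles
```

Real terrain pipelines add texture, level-of-detail, and geo-referencing on top of this basic gridded-mesh idea, but the elevation-to-geometry step is the core of a geo-specific terrain skin.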
BreakAway’s visualization tool Trex uses satellite data to generate accurate, synthetic urban environments within 24 hr. The resulting terrain can be used to create tactical simulations for en route training that can be played on a COTS console.
Warfighter Simulation (WARSIM) from Lockheed Martin relies on real-world data provided by its customers to create the realism needed in simulations. “The effect that is important in the simulation, such as maximum range of a weapon system, is the accurate maximum range provided by the customer,” says Ed Payne, WARSIM’s Program Director at Lockheed. The models in the simulation also take into account terrain and the personnel needed to perform a task to determine the performance of a system or organization.
Programmers for military simulations—including Lockheed’s—receive support from the National Imagery and Mapping Agency, now the National Geospatial-Intelligence Agency, Bethesda, Md., which provides realistic worldwide 3-D terrain modeling.
But it’s not really “real”
While M&S systems provide very effective training applications, it could be said that a user cannot experience the full psychological impact of urban combat in a simulation.
“I often say that we can’t teach a person how to shoot a gun, but we can teach them when to shoot it,” says ECS’s Smith. “In other words, we may not be able to realistically simulate the complexities of urban combat, but we can simulate it well enough to teach a student certain aspects of it.”
Per Waldier, the Army still most often uses live simulations—where instrumented engagement systems are combined with “live” troops and actual equipment—to train individuals and small units, except in high-cost areas, such as combat vehicle crew and air defense.
Whether it is in the classroom or “live” on the field, the U.S. government is making sure its young men and women are ready for their battles ahead—without just relying on a roll of the dice.
—Lorraine Joyce
Resources
BreakAway, Ltd., 410-683-1702, www.breakawayltd.com
Dept. of Defense’s “Models & Simulations Operation Handbook”
ECS, Inc., 877-823-9991, www.ecsorl.com
Lockheed Martin, www.lmco.com
Univ. of California, Berkeley, Video and Image Processing Lab, www.video.eecs.berkeley.edu
U.S. Army’s Battle Projection Center
WILL Interactive, 301-983-6006, www.willinteractive.com