Research & Development World


Military Apps Drive Simulation Tools

By R&D Editors | June 15, 2005

Note: The Tactical Combat Casualty Care simulator is currently being developed by ECS for the Army’s combat medics, focusing on treating wounded soldiers during urban combat. The program lets students who fall short in certain areas of live training practice those areas further.

Battle simulations, in one form or another, have long been practiced. Their humble yet important beginnings can be traced back to as early as 1,000 B.C. and the creation of Wei Hai—a strategy-based game invented by Art of War author Sun Tzu—which used simple colored stones to depict armies and their movements. Around the same period, a four-sided board game from India, Chaturanga, used a roll of the dice to determine each player’s next maneuver.

In the 1600s, the German Koenigspiel, a type of “war chess” game, introduced some of the underlying characteristics of modern military modeling and simulation (M&S) routines: a single piece signifying a unit, terrain representation, rules for movement and conflict resolution, and reliance on assumptions.

Today, these playing boards have been replaced by carefully rendered, computer-generated “playing fields,” with multiprocessors—not dice—now dictating the next strategic move. For its part, the U.S. government is taking stock, actively enlisting some of the premier computer-generated imagery (CGI) software companies to produce its next wargame.

It pays to train virtually

According to Frost & Sullivan, total U.S. military spending on the simulation and training market was $3.73 billion in 2003. Why this focus? Quite simply, M&S is a very cost-effective way to train.

Note: In WILL Interactive’s Gator Six – Battery Command Virtual Experience, the trainee becomes a U.S. Army Battery Commander deployed to an East Asian hot spot. Critical decisions that determine the user’s fate and the fate of soldiers under his/her command are required.

“Simulation saves the Army a lot of money in fuel, ammunition, maintenance, and environmental costs,” says Michael Waldier, Project Site Manager of the U.S. Army’s Battle Projection Center, 85th Division, Arlington Heights, Ill. “There is also a safety aspect. While the Army is very safety conscious and requires risk assessments for all operations, including training, there is always the possibility of accidents when you place large numbers of soldiers and equipment in the field.”

Another reason is the widely accepted principle that “experiential learning” is the most effective way to learn. When being taught, people theoretically retain 20% of what they hear, 40% of what they see, 80% of what they actually do—and over 90% of what they simultaneously see, hear, and do.

Additionally, the entertainment aspects of good M&S programs draw in users and keep them engaged in the learning experience. “The number of military personnel who are members of the ‘video gaming’ generation is constantly growing,” says David Versaw, CFO at WILL Interactive, Inc., Potomac, Md. “Young soldiers (and officers) grew up playing video games and watching movies. To prepare soldiers to prevail in conflict, you need to recognize and leverage their strengths.”

What’s out there?

In recent years, private companies have been teaming up with different military branches to create realistic and thorough training simulations.

With funding provided by the U.S. Army Research, Development, and Engineering Command and the Joint Advanced Distributed Learning Co-Laboratory, Orlando, Fla., the simulation products developed by Engineering & Computer Simulations, Inc. (ECS), Orlando, are built around a licensed commercial game engine. “All of the assets inside of our simulations are tagged with metadata to allow them to be stored inside a content repository,” says Brent Smith, CTO at ECS. “These not only include models and textures, but also animation cycles and simulated behaviors for all of our characters and equipment models. This allows our military to re-use and re-purpose any assets that may be needed in future applications.”

Military, medical, and general industries are already using game-based training and simulation software from BreakAway, Ltd., Hunt Valley, Md. One of the company’s many products is NetStrike, a CGI-based intelligence fusion simulation created for the Dept. of Defense, based on BreakAway’s white paper “Unified Theory for Modeling the Revolution in Military Affairs.”

Longtime military contractor Lockheed Martin, Bethesda, Md., is also participating in the M&S arena with NxTrain, its Next Generation Training Solution. NxTrain allows team members to recognize, interpret, and make decisions quickly and correctly in complex and dynamic situations. The training focuses on simulated learning scenarios that provide experience and performance feedback in the cognitive skills needed to perform mission operations.

“The one aspect that differentiates our simulations from all others is scale,” says Wayne Civinskas, Manager of the Advanced Simulation Centers for Lockheed Martin Simulation, Training, and Support. “Unlike most games, which tend to confine the player to a small area, our simulations represent vast, complex spaces based on real-world data. Our goal is to be able to represent multiple theaters simultaneously; ultimately representing the entire earth at a level of detail suitable for training our future military command.”

In contrast to CGI-based software are the Virtual Experience Immersive Learning Simulations (VEILS) products from WILL Interactive, which use live-action video instead of animation. The platform allows users to become the lead character(s) in interactive movies in which they face real-life situations under real-life pressures, make real-life decisions, and experience real-life consequences.

“A unique ‘slice-of-life’ context is added to create a realistic computer-based learning experience,” says WILL’s Versaw. “VEILS then branches to different pathways and outcomes based on user decisions,” similar to the choose-your-own-adventure book series.

WILL’s offerings include Gator Six, created for the U.S. Army, which helps improve artillery battery commanders’ critical thinking, judgment, agility, adaptability, and decision-making skills, while its Level III Anti-Terrorism/Force Protection course, developed for the Joint Chiefs of Staff, is used to train commanders in all services.

3-D modeling on-the-go

To properly immerse trainees in a realistic simulation, detailed, timely, and accurate pictures of the battlefield are required. Many sources of information are needed to build a “picture”—archival data, roadmaps, geographical information, and databases, all of which are static. Sensor information from mobile agents at different times and locations is also required, since the scene itself varies, with people and objects constantly moving. So, time itself must also be a component in battlefield visualizations. All of these factors give rise to the need to generate 4-D models—three dimensions determining space and the fourth giving time.
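One minimal way to picture a 4-D model is to keep static geometry separate from time-stamped observations of moving objects, and interpolate an object's position at any query time. The sketch below is an illustration of that idea only; the names and the linear-interpolation choice are assumptions, not the VisMURI design:

```python
from bisect import bisect_left

class Track:
    """Time-stamped 3-D positions of one moving object.

    Positions between samples are linearly interpolated; queries before the
    first sample or after the last clamp to the nearest observation.
    """

    def __init__(self):
        self.times: list[float] = []
        self.points: list[tuple[float, float, float]] = []

    def observe(self, t: float, xyz: tuple[float, float, float]) -> None:
        # Observations are assumed to arrive in time order.
        self.times.append(t)
        self.points.append(xyz)

    def position_at(self, t: float) -> tuple[float, float, float]:
        i = bisect_left(self.times, t)
        if i == 0:
            return self.points[0]
        if i == len(self.times):
            return self.points[-1]
        t0, t1 = self.times[i - 1], self.times[i]
        a = (t - t0) / (t1 - t0)
        p0, p1 = self.points[i - 1], self.points[i]
        return tuple(p0[k] + a * (p1[k] - p0[k]) for k in range(3))

truck = Track()
truck.observe(0.0, (0.0, 0.0, 0.0))
truck.observe(10.0, (100.0, 0.0, 0.0))
print(truck.position_at(5.0))
```

Static scene data (buildings, terrain) needs no time axis; only the tracks of mobile agents carry the fourth dimension, which keeps the model compact.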

Note: For photo-realistic walk-throughs, the VisMURI team created a real-time system mounted on a truck, which moves under normal traffic conditions. A vertical 2-D laser scanner (used to acquire geometry), a horizontal 2-D scanner (used to track the truck’s pose), and a digital camera (used to acquire texture) are linked to a high-end PC, which performs the complex timing synchronization.

Enter the Visualization MURI, a Multidisciplinary University Research Initiative (VisMURI), which aims to “develop the methodologies and tools required for the design and analysis of time-critical visualization systems, using augmented reality and 4-D dynamic models of the environment.” Researchers from the Univ. of California (UC), Berkeley, Univ. of Southern California, Los Angeles, Georgia Institute of Technology, Atlanta, Syracuse Univ., N.Y., and the Univ. of California, Santa Cruz, are jointly working toward this goal.

Led by UC Berkeley’s Avideh Zakhor, a professor of electrical engineering, the team has already created a data acquisition system and post-processing software that use the collected data to build fast, automated, photo-realistic 3-D models of urban environments. The VisMURI modeling system differs from existing approaches in a few ways. For instance, most commercial systems use airborne imagery to create 3-D models of cities, with human operators matching features between two images by clicking on the point in each image that corresponds to the same physical location. This approach is slow, not cost-effective, and does not yield a precise or accurate ground-based model of the city.

Modeling with passive devices, such as cameras, is plagued by the same problem as the airborne matching approach—processing the data is too labor-intensive. In addition, these systems collect data in a stop-and-go fashion, which makes acquisition slow.

“Our system, as far as I know, is the first one that is fast in acquisition,” says Zakhor. “This is because we do ‘drive-by scanning’ rather than acquiring the data in a stop-and-go fashion. Also, 3-D depth is computed entirely from laser scanners instead of cameras. Furthermore, because we also take pictures for texture from both the ground and the air, our approach results in photorealistic models that you can then walk, drive, or fly through.

“The biggest challenge in all of this is to register all the imagery and laser data from both ground and airborne sources so everything ‘stitches’ together the right way. So far, we’ve managed to do this without any GPS (global positioning system) receivers, simply because accurate ones are too expensive and don’t always work in urban environments.”
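The article does not give the registration math, but the core geometric step in drive-by scanning—turning each 2-D laser range sample into a 3-D world point using the vehicle's estimated pose—can be sketched as below. The function, its assumed scan-plane orientation, and the example pose values are all illustrative, not the VisMURI implementation:

```python
import math

def scan_to_world(ranges, angles, pose):
    """Project one vertical 2-D laser scan into 3-D world coordinates.

    ranges/angles: polar samples within the scanner's vertical plane
    (angle 0 = horizontal, pi/2 = straight up).
    pose: (x, y, heading) of the vehicle when the scan was taken; the scan
    plane is assumed perpendicular to the direction of travel, and the
    scanner is assumed at ground height for simplicity.
    """
    x, y, heading = pose
    # Unit vector perpendicular to the direction of travel: this is the
    # horizontal axis of the vertical scan plane in world coordinates.
    px, py = -math.sin(heading), math.cos(heading)
    points = []
    for r, a in zip(ranges, angles):
        lateral = r * math.cos(a)   # horizontal offset within the scan plane
        height = r * math.sin(a)    # vertical offset
        points.append((x + lateral * px, y + lateral * py, height))
    return points

# Vehicle at the origin heading along +x; one sample straight sideways,
# one sample straight up.
pts = scan_to_world([5.0, 5.0], [0.0, math.pi / 2], (0.0, 0.0, 0.0))
print(pts)
```

As the truck drives, successive scans sweep out the third dimension: each scan contributes one vertical "slice," and the vehicle's changing pose places consecutive slices side by side to form building facades.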

How’d they do that?

To create highly detailed simulations with realistic terrain and sound effects, M&S designers turn to a variety of tools and their own talent pools for new ways to create these effects (with some researchers even looking to 4-D distributed modeling and visualization; see “3-D modeling on-the-go” above).

Most M&S companies use commercial off-the-shelf (COTS) tools, such as 3D Studio Max, Maya, and SoftImage. These tools generate realistic models, environments, and animation cycles for characters and vehicles.

“To develop a realistic terrain skin, you need to use a package that is more specific to terrain generation, such as Terrex,” says ECS’s Smith. “This type of package can bring in government-supplied digital terrain elevation data or digital elevation map files to create accurate, geo-specific terrains of real-world locations.”
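Terrex's internals are not described here, but the basic operation such terrain packages perform—turning a gridded elevation file (DTED or DEM) into a continuous terrain surface—amounts to bilinear interpolation over the height grid. The grid values below are invented for illustration:

```python
def terrain_height(grid, x, y):
    """Bilinearly interpolate elevation at fractional grid coordinates (x, y).

    grid: rows of elevation samples (e.g. loaded from a DEM/DTED file),
    indexed as grid[row][col]. x moves along columns, y along rows.
    """
    i, j = int(x), int(y)
    # Clamp to the last full cell so points on the far edges still resolve.
    i = min(i, len(grid[0]) - 2)
    j = min(j, len(grid) - 2)
    fx, fy = x - i, y - j
    h00, h10 = grid[j][i], grid[j][i + 1]
    h01, h11 = grid[j + 1][i], grid[j + 1][i + 1]
    bottom = h00 + fx * (h10 - h00)   # interpolate along the lower edge
    top = h01 + fx * (h11 - h01)      # interpolate along the upper edge
    return bottom + fy * (top - bottom)

# Tiny invented 3x3 elevation grid, meters above sea level.
dem = [
    [100.0, 110.0, 120.0],
    [105.0, 115.0, 125.0],
    [110.0, 120.0, 130.0],
]
print(terrain_height(dem, 0.5, 0.5))  # center of the first cell
```

Real terrain tools add much more—coordinate projections, level-of-detail meshing, texture draping—but geo-specific accuracy ultimately rests on sampling the government-supplied elevation grid this way.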

BreakAway’s visualization tool Trex uses satellite data to generate accurate, synthetic urban environments within 24 hr. The resulting terrain can be used to create tactical simulations for en route training that can be played on a COTS console.

Warfighter Simulation (WARSIM) from Lockheed Martin relies on “real world” data provided by its customers to create the realism needed in simulations. “The effect that is important in the simulation, such as maximum range of a weapon system, is the accurate maximum range provided by the customer,” says Ed Payne, WARSIM Program Director at Lockheed. The models in the simulation also take into account terrain and the personnel needed to perform a task to determine the performance of a system or organization.

Programmers for military simulations—including Lockheed’s—receive support from the National Geospatial-Intelligence Agency (formerly the National Imagery and Mapping Agency), Bethesda, Md., which provides realistic worldwide 3-D terrain modeling.

But it’s not really “real”

While M&S systems provide very effective training, a user arguably cannot experience the full psychological impact of urban combat in a simulation.

“I often say that we can’t teach a person how to shoot a gun, but we can teach them when to shoot it,” says ECS’s Smith. “In other words, we may not be able to realistically simulate the complexities of urban combat, but we can simulate it well enough to teach a student certain aspects of it.”

According to Waldier, the Army still most often uses live simulations—where instrumented engagement systems are combined with “live” troops and actual equipment—to train individuals and small units, except in high-cost areas such as combat vehicle crew and air defense training.

Whether it is in the classroom or “live” on the field, the U.S. government is making sure its young men and women are ready for their battles ahead—without just relying on a roll of the dice.

—Lorraine Joyce

Resources (archived): 

  • BreakAway, Ltd., 410-683-1702, www.breakawayltd.com
  • Dept. of Defense’s “Models & Simulations Operation Handbook”
  • ECS, Inc., 877-823-9991, www.ecsorl.com
  • Lockheed Martin, www.lmco.com
  • Univ. of California, Berkeley, Video and Image Processing Lab, www.video.eecs.berkeley.edu
  • U.S. Army’s Battle Projection Center
  • WILL Interactive, 301-983-6006, www.willinteractive.com
