Modeling and simulation are now a necessity for nearly all scientists. Software vendors are under pressure to make their products measure up to demand.
Scientific software is now inseparable from most R&D efforts. Few engineers can imagine designing even something as simple as a bearing without the help of several design and analysis programs. The question these days is not whether to use software, it’s how many different software tools must be purchased and learned.
Productivity is crucial because nearly everyone uses this software. According to a recent study by computer scientists at the Univ. of Oslo, Norway, and the Univ. of Toronto, Canada, only about 4.3% of scientists, regardless of area of specialty, never use scientific software. More than 90% responded that scientific software is crucial to their work.
More than 80% of the 2,000 scientists surveyed from a variety of R&D fields spend at least 60% of their time using scientific software on desktop computers.
But this same survey also noted that almost as many respondents spent 60% or more of their time developing that software as well. Clearly, software is in a state of flux as constant improvements are engineered by both vendors and users. Simulation, analysis, modeling: all of these now tend to be delivered as modules that are easily modified and connected with other software products. Most researchers and engineers who use desktop computer-based systems don’t have the time to become experts in multiple tools, yet circumstances often demand that they do so.
Magnetic prospecting is a method of geological exploration applicable to certain types of iron ore deposits, in particular those made up of magnetite and hematite. This model, built in COMSOL Multiphysics, is intended for aerial prospecting and is based on topographic data taken from a U.S. Geological Survey database. The color plot indicates the depth of the iron ore in relation to the crustal surface, while the streamlines indicate the magnetic flux. Image: COMSOL
Multiple operating systems and multicore machines must now be accounted for in the development of this software, and several major vendors are upgrading their offerings to give the average researcher or engineer access to an integrated, smooth software platform. These vendors include COMSOL, of Burlington, Mass., which in September will introduce a new product that advances the ease of use of its Multiphysics platform; SIMULIA, of Providence, R.I., whose Abaqus finite-element analysis (FEA) tools are now successfully employed by users outside of traditional design sciences; Maple, of Waterloo, Ont., Canada, which has expanded its mathematical engine toolkit with the introduction of a simulation engine; and NEi Nastran, a long-time FEA and simulation provider that is integrating parallel processing and automation routines to make its solver faster than ever.
A move toward integration
The most common engineering solution, and the most familiar to the public, is computer-aided design (CAD). Geometric renderings in both 2-D and 3-D anchor the great majority of today’s manufacturing design efforts, and researchers also depend heavily on a CAD solution to establish the geometry for a given problem.
Though capable of saving a tremendous amount of time and effort in areas such as prototyping, CAD can’t provide design answers for many of the dynamic questions that face engineers. Even the most complex solution for a system in equilibrium does not provide the answers that developers need, and for these questions engineers often turn to mesh-based numerical methods like FEA.
The approach depends on separating the domain of interest into discrete regions, more commonly called elements, of various polygonal shapes, over which the governing partial differential equations are approximated. The equations can either be solved directly for a steady-state solution or reduced to numerically stable ordinary differential equations in time, giving designers the opportunity to greatly improve the model’s analysis of a complex domain. Before FEA, it was extremely difficult to model the dynamic forces on a complex piece of equipment, such as a car crashing into a wall or electronics exposed to rapid thermal changes. The result for an FEA user is a more precise final model, although the demands on computing resources can be large. The demands on the user can be large, too.
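As a generic illustration of this discretization (a minimal sketch, not any vendor's solver), the following Python program assembles and solves a one-dimensional model problem, -u'' = f on the interval [0, 1] with u fixed to zero at both ends, using linear elements.

```python
# Minimal 1-D finite-element sketch (illustrative only):
# solve -u'' = f on [0, 1] with u(0) = u(1) = 0 using linear elements.
import numpy as np

def fem_1d_poisson(f, n_elements=10):
    nodes = np.linspace(0.0, 1.0, n_elements + 1)
    h = np.diff(nodes)                                 # element lengths
    K = np.zeros((n_elements + 1, n_elements + 1))     # global stiffness matrix
    F = np.zeros(n_elements + 1)                       # global load vector

    for e in range(n_elements):                        # assemble element by element
        i, j = e, e + 1                                # the two nodes of element e
        ke = (1.0 / h[e]) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
        xm = 0.5 * (nodes[i] + nodes[j])               # midpoint quadrature for the load
        fe = f(xm) * h[e] / 2.0 * np.array([1.0, 1.0])
        K[np.ix_([i, j], [i, j])] += ke
        F[[i, j]] += fe

    # Dirichlet boundary conditions: u = 0 at both ends
    free = np.arange(1, n_elements)
    u = np.zeros(n_elements + 1)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
    return nodes, u

if __name__ == "__main__":
    # f = pi^2 * sin(pi x) has the exact solution u = sin(pi x)
    nodes, u = fem_1d_poisson(lambda x: np.pi**2 * np.sin(np.pi * x), 20)
    print(np.max(np.abs(u - np.sin(np.pi * nodes))))   # small discretization error
```

Refining the mesh (raising n_elements) shrinks the error, which is exactly the accuracy-versus-cost trade-off that full 3-D solvers face on a much larger scale.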
The mathematical strategies underlying FEA, and related analysis techniques such as the finite difference method, have diversified tremendously, and this is reflected in the great number of toolkits available to software users. Companies that develop FEA solutions are continually adapting to include these advances. But, again, they must integrate them so that the user can easily make use of them.
This simulation of a bird strike on jet engine turbine blades uses an explicit Nastran finite-element analysis (FEA) code. Explicit FEA is used for simulating high speed impact and crash, large deformation, large strain, very complex contact, or models with material failure or material deletion. Image: Femap with NEi Explicit by NEi Software.
FEA was first applied to solve complex elasticity and structural analysis problems in aeronautics in the 1950s and soon spread to civil engineering as computing became more widespread. In the 1960s, other dynamic solutions began to mature, including computational fluid dynamics (CFD), which is based on the Navier-Stokes equations describing single-phase fluid flow. Again, discretization of the domain, in the case of CFD a continuous fluid, provides the ability to execute an algorithm that solves equations such as those for motion or heat transfer.
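For reference, one common form of the incompressible Navier-Stokes equations that CFD codes discretize (assuming a Newtonian fluid with constant density and viscosity) is

\[
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right) = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0,
\]

where \(\mathbf{u}\) is the velocity field, \(p\) the pressure, \(\rho\) the density, \(\mu\) the dynamic viscosity, and \(\mathbf{f}\) any body force; the second equation enforces conservation of mass.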
Numerical solvers built on these discrete methods can now run easily on the average desktop. But for many years they remained a separate solution, distinct from the more easily accessible geometry products and often incompatible with respect to interface, operating system, hardware requirements, and file type.
But now the emergence of multiphysics engines and cross-platform solutions is helping change this trend. In just the last five years, the number of researchers using sophisticated modeling and analysis software has increased tremendously, becoming a multi-billion dollar industry. Many of these same researchers started their research or design work using a CAD tool.
“This is something people have seen coming and everyone is aware of. The real benefit to us as a company is the huge market that uses CAD. CAE is smaller but growing,” says David Kan, vice president of COMSOL, Burlington, Mass. COMSOL provides analysis solutions for designers, often providing a crucial tandem solution with geometry packages such as CAD or CAE tools. Its multiphysics engine, with the support of a number of dynamic modules, is capable of simulating fluid flow, heat transfer, structural mechanics, electromagnetics, and other physical phenomena.
“There are a lot of designs out there that need models. Every analysis model needs a geometry, so it’s always a part of the design considerations,” says Kan. “In fact, the first version of COMSOL included its own geometry engine and we still have one today. CAD really is part of the whole process.”
The tools available to engineers who need precise 3-D models took off in the late 1980s and early 1990s, as engineers began leveraging the increasing power of low-cost computing.
The growth of multicore systems capable of multiphysics simulation has coincided with a tremendous reservoir of existing CAD files, and companies like COMSOL have to connect to them. This has prompted some fast-growing analysis companies to add geometry resources to their toolchains, sometimes developed in-house and sometimes gained through acquisition.
COMSOL took its products in this direction by building them to directly support its Multiphysics platform.
“With Solidworks and Autodesk, for example, you begin to have a file-based bi-directional interface, in which both programs operate simultaneously,” Kan says. This is the direction for COMSOL’s primary software products, too. Ease of use remains a high priority for software designers, and the solution, clearly, is a work environment that combines these disparate tools into a unified environment: the one-window interface.
It’s a tall order; CAD programs are generally built on a very different platform than an analysis engine, which relies on an often unique mathematical foundation. The approach to solutions can be linear or nonlinear, and finite element methods can widely differ depending on the equations used and the meshing performed.
Still, a do-everything program could greatly speed the development process. For example, says Kan, an automotive company designing a door frame will need far more than a 3-D model of the product. It will want to look at the dynamic contact of the seals as they come together, and also the hyperelastic response of the seals.
Another example is in the oil and gas services sector. Sensors are often placed underground to locate petroleum deposits, but there is a significant challenge to design them so that they function properly under difficult environmental conditions. It’s not only the mechanical considerations of high pressure and/or high heat, says Kan, it’s also the electromagnetic profile that must be carefully considered during the design phase. Most of these sensors rely on electromagnetic signatures to perform properly.
“COMSOL benefits from integration with CAD, but in the near future CAD benefits from its relation with us,” says Kan, pointing to Autodesk’s recent acquisitions, which have added analysis capability to that company’s CAD tools. “In the near future, CAD programs themselves will be able to execute optimization routines.”
From solvers to simulation
Maple’s core product is a general purpose mathematics tool, but it has in the past year added connectivity to CAD systems to enhance the engineering design process. On the modeling front the company has released MapleSim, now in its second version, to provide an integrated platform that allows designers to quickly model results they have developed in the mathematics engine.
“Our goal is to make that power accessible to a broader base of researchers. We want it to be easy to use, feature interactive assistance, present math in a natural way, and allow users to enter mathematics in a user-friendly application,” says Laurent Bernardin, chief scientist and vice president of R&D, Maple.
The basis of Maple is a sophisticated computation engine built on symbolic and numeric solvers that have been steadily refined over the years. This is supplemented by thousands of add-on modules from across science and engineering. In addition to the recent release of the 13th iteration of the Maple engine, the company has also launched MapleSim 2, a program that interprets the solutions delivered by Maple and renders them visually through a graphical interface that obviates the need to memorize textual syntax.
“With MapleSim 2 we took that idea one step further, which still allows you to get access to the advanced math engine, but also lets you build physical simulation models just by dragging and dropping components and connecting them up,” says Bernardin. “It’s one step further. You are getting simulation results without having to deal with equations.”
Users can take an assembly from the CAD system, bring it into MapleSim and generate 3-D animations.
“On the one hand, you have the traditional way of analyzing the behavior of a CAD model with finite-element analysis or CFD. But even today FEA takes several hours to run in the best case. There is definitely a need to do system-level simulation on these CAD models to quickly get a sense of the overall behavior,” says Bernardin.
An added advantage for Maple was the discovery that its well-developed mathematical engine helped speed the simulation process by processing the equations directly before the numeric solving stage. The engine itself has also been refined such that it is very different now than it was even four or five years ago, says Bernardin, and that has as much to do with ease of use as with fundamental performance.
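The symbolic-before-numeric idea can be illustrated outside Maple's own engine. The sketch below is a minimal stand-in using the open-source SymPy and SciPy libraries (an assumption of this example, not the tools Maple uses): the equation of motion of a pendulum is manipulated symbolically, compiled into a plain numeric function, and only then handed to a time integrator.

```python
# Illustrative only: symbolic preprocessing before numeric simulation,
# here with SymPy/SciPy rather than Maple's own engine.
import numpy as np
import sympy as sp
from scipy.integrate import solve_ivp

# Symbolic stage: simple pendulum, theta'' = -(g/L) * sin(theta)
t = sp.symbols("t")
g, L = sp.symbols("g L", positive=True)
theta = sp.Function("theta")(t)
eom = sp.Eq(theta.diff(t, 2), -(g / L) * sp.sin(theta))

# Solve symbolically for the highest derivative, then compile it to a
# plain numeric function before the simulation loop ever runs.
rhs = sp.solve(eom, theta.diff(t, 2))[0]
th = sp.symbols("th")
accel = sp.lambdify((th, g, L), rhs.subs(theta, th), "numpy")

def pendulum(t, y, g_val=9.81, L_val=1.0):
    ang, vel = y
    return [vel, accel(ang, g_val, L_val)]

sol = solve_ivp(pendulum, (0.0, 10.0), [0.5, 0.0], max_step=0.01)
print(sol.y[0][-1])   # pendulum angle at t = 10 s
```

The symbolic step is trivial here, but on a large multibody model the same preprocessing (simplifying and eliminating equations before integration) is where a strong math engine pays off.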
A more user-friendly FEA
FEA has made inroads in a wide, and sometimes unexpected, range of industries, from pure R&D to Hollywood.
Used to design a more effective face mask for general purpose industrial use, these visualizations represent contact pressure contours as an estimate of sealing effectiveness of a dust mask on a face at various points in time. Image: SIMULIA
Aerospace is the most obvious high-value end use. NEi Software was one of the original companies to apply FEA, and its software is based on code written for NASA. In the late 1980s, the company’s founder, Dave Weinberg, re-wrote the original NASTRAN code so that it would run on a PC. In addition to using more modern programming languages, Weinberg also introduced a modern, modular code architecture, which allowed changes and modifications to be made faster and more easily. The NEi Nastran finite-element analysis solver has since served the company well, according to Dennis Sieminski, director of marketing for NEi Nastran, and has been greatly developed since 1990.
“The strong foundation in precision, the modularity and modernity of the code are still advantages today along with the fact that the code ‘pedigree’ has been maintained, as opposed to being an amalgamation of disparate company codes as can happen with mergers and acquisitions. This has allowed for integration of linear and nonlinear analysis in one program, good execution speed, and again the ability to easily add new simulation technology,” says Sieminski.
While by the 1990s PCs were up to the task of processing complex FEA solutions, the analyses were still time-intensive, which limited their practical use to accuracy-critical applications. That is no longer true, says Sieminski, and NEi’s capabilities are profitably applied to everything from medical devices to sporting goods. There is pressure now, he continues, for OEMs to use FEA, particularly those companies using expensive materials.
And FEA has also entered the non-scientific realm, finding use in architecture, video gaming, and moviemaking. Even the design of a dust mask, accomplished by Kimberly-Clark Corp., Neenah, Wisc., has benefited greatly from the FEA resources of Abaqus, a simulator from SIMULIA under parent company Dassault Systèmes, Paris, France.
The process used to design the mask was derived from the facial modeling technology developed for use in Hollywood films. Meshes have long been used to simulate the complex movements of the human face, but few approaches delivered the accuracy needed to convey a convincing simulation. In a widely used marker-based motion-capture approach, 30 to 200 reflective markers are applied to an actor’s face and an array of cameras captures facial movements, triangulating to determine each marker’s location. But this method doesn’t provide much resolution.
The design challenge with the mask was to make it comfortable and at the same time maintain an airtight seal against the changing shape of the face.
“We’re not worried about the strain of the materials in the product,” says Chris Pieper, associate research fellow at Kimberly-Clark. “However, it’s crucial that the mask conform to the face, and the contact pressure between the mask and the face is very important to the proper function of the product and the comfort of the user.”
Pieper would once have been forced to cobble together incompatible software resources to get the project done. But the toolchains for such a process have converged to the point that just three tools were needed. Contour Reality Capture from Mova LLC, Palo Alto, Calif., was used to capture 100,000 3-D data points at 1-mm accuracy. These were defined as nodes for finite element definition using surfacing software called Geomagic; then the nodes and elements were written to an Abaqus import file using a Python program.
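That final hand-off can be a very short script. The sketch below is a hypothetical, stripped-down stand-in for that kind of translation step (the actual Kimberly-Clark program is not public): it writes node coordinates and triangle connectivity into the keyword format that Abaqus imports.

```python
# Hypothetical sketch of the translation step: write nodes and elements
# captured from surfacing software into an Abaqus keyword (.inp) file.
# The real Kimberly-Clark script is not public; names here are illustrative.

def write_abaqus_inp(path, nodes, elements, element_type="S3"):
    """nodes: {node_id: (x, y, z)}; elements: {elem_id: (n1, n2, n3)}."""
    with open(path, "w") as f:
        f.write("*NODE\n")
        for nid, (x, y, z) in sorted(nodes.items()):
            f.write(f"{nid}, {x:.6f}, {y:.6f}, {z:.6f}\n")
        f.write(f"*ELEMENT, TYPE={element_type}\n")
        for eid, conn in sorted(elements.items()):
            f.write(f"{eid}, " + ", ".join(str(n) for n in conn) + "\n")

if __name__ == "__main__":
    # Two triangular shell elements sharing an edge
    nodes = {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0),
             3: (1.0, 1.0, 0.0), 4: (0.0, 1.0, 0.0)}
    elements = {1: (1, 2, 3), 2: (1, 3, 4)}
    write_abaqus_inp("face_mesh.inp", nodes, elements)
```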
“We look to these simulations to help us narrow the field of design possibilities, so that when we do testing with human subjects, we are only looking at the design finalists,” says Pieper, who is still in the process of validating the mask design. “That can really shrink the product design cycle.”
The applications of FEA, then, are expanding. But what about the underlying mechanics of FEA? How will it develop to meet these widely varying needs? Because of the nature of this type of analysis, says Sieminski, accuracy is an ever-present goal. It can be improved with a better quality mesh and more elements, which in turn require more time to solve and more processing power.
“We are always working on ways to achieve better accuracy,” he says. NEi Nastran recently added a new triangular plate bending element, for example, which is significantly more accurate for a coarser mesh.
“People in general want large models with more complexity to run in a faster period of time. Ideally, they would take a complex CAD model, specify the materials, loads, and boundary conditions and run it with results occurring instantaneously. We are working on better and smarter meshing technology in NEi Fusion, adaptive meshing, which automatically adds more elements in areas with higher stress gradients,” says Sieminski.
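The adaptive-meshing idea Sieminski describes boils down to a simple loop: estimate a per-element error or gradient, refine where it is large, and re-solve. The sketch below shows only the flagging step in generic form; it is an illustration, not NEi's implementation, and the threshold rule is an assumption.

```python
# Generic illustration of adaptive-refinement flagging; not NEi's code.
# Elements whose stress gradient exceeds a fraction of the maximum are
# marked for subdivision on the next meshing pass.
import numpy as np

def flag_elements_for_refinement(element_stress, neighbors, fraction=0.5):
    """element_stress: per-element stress values.
    neighbors: neighbors[i] lists the element ids adjacent to element i.
    Returns the indices of elements to subdivide."""
    n = len(element_stress)
    gradient = np.zeros(n)
    for i in range(n):
        if neighbors[i]:
            diffs = [abs(element_stress[i] - element_stress[j]) for j in neighbors[i]]
            gradient[i] = max(diffs)
    threshold = fraction * gradient.max()
    return np.where(gradient >= threshold)[0]

if __name__ == "__main__":
    stress = np.array([1.0, 1.1, 5.0, 5.2, 1.05])
    neighbors = [[1], [0, 2], [1, 3], [2, 4], [3]]
    print(flag_elements_for_refinement(stress, neighbors))  # high-gradient elements
```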
The company is also using routines to access parallel processing in areas where it is effective to further enhance speed.
Multiphysics meets multi-core
Finite-element analysis, which first answered tough questions in aerospace design, has proved so useful it is now relied on for consumer product designs like this dust mask. Software vendors are striving to make their analysis products easy to use for these types of applications. Image: SIMULIA
An interesting twist in the rapidly transforming simulation field is the presence of processing chips with multiple cores. Dual and quad cores are now common, and chip companies are poised to continue this progression.
This puts software vendors on notice. The old way of processing programs sequentially is being thrown out the window for certain types of problems. Because efficient processing is crucial for simulators, they are among the first commercial software tools to take advantage of the newest semiconductor architecture.
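In practice, a solver uses extra cores by spreading independent work, such as computing per-element matrices, across them. The sketch below shows that pattern generically with Python's standard multiprocessing module; it illustrates the idea only and is not any vendor's parallel code.

```python
# Generic illustration of spreading independent element computations across
# CPU cores; commercial solvers use their own parallel implementations.
import numpy as np
from multiprocessing import Pool

def element_stiffness(length):
    # 1-D linear element stiffness matrix; stands in for any expensive
    # per-element computation that has no dependence on other elements.
    return (1.0 / length) * np.array([[1.0, -1.0], [-1.0, 1.0]])

if __name__ == "__main__":
    element_lengths = np.full(10_000, 0.01)
    with Pool() as pool:                       # one worker per available core
        stiffnesses = pool.map(element_stiffness, element_lengths)
    print(len(stiffnesses), stiffnesses[0])
```

The same embarrassingly parallel structure is why assembly and explicit time-stepping were among the first solver stages to benefit from multicore chips.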
“The ability to solve 2-D problems has been around for 10 years. Then with the ability to solve 3-D problems, it really kind of pushed the limit as far as what 32-bit computer platforms could do,” says Kan. The advent of 64-bit and multicore systems has ushered in the ability to perform heavy-duty analysis programs.
“For doing more and more complicated problems, that progress won’t stop,” he says.
At Maple, the push to multi-core has been accelerated through its relationship with National Instruments, whose LabVIEW graphical design software takes advantage of the parallel-processing capabilities of field-programmable gate arrays. The Maple engine can handle LabVIEW system files, adding simulation capability through MapleSim.
It’s not necessarily the high-end R&D applications that give cutting-edge computing technology the push into new territory. Gaming, for example, was the impetus behind cell-based computer-chip architecture that was quickly co-opted by the research and high-end engineering community to execute massive, complicated problems. The world’s first peta-scale machine, the IBM Roadrunner, in fact, is a hybrid of cell-based processors and more traditional CPUs. That sort of diversification, says Kan, will continue to happen.
It’s instructive to look at the progress of software as compared to hardware. Hardware, says Kan, has generally followed Moore’s Law of development in that a geometric progression has been seen in relation to cost. “It turns out that algorithmic development has also followed a geometric progression. That’s the double benefit of hardware and software. I don’t think we’ll ever say we made the Holy Grail of solvers, but as new platforms come out they will benefit from improvements in the software,” says Kan.
Another shift is the widespread adoption of the concept of the application program interface, or API, with which people can customize the modeling program to their own needs.
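What that customization looks like in practice is a scripted study rather than repeated clicking. The sketch below is purely hypothetical: the Model class is a stub standing in for whatever scripting interface a given package exposes, and every name in it is illustrative rather than a real API.

```python
# Purely hypothetical illustration of API-driven customization; the Model
# class is a stand-in stub, not any vendor's actual scripting interface.
class Model:
    """Stub representing a modeling package's scripting API."""
    def __init__(self, path):
        self.path = path
        self.params = {}

    def set_parameter(self, name, value):
        self.params[name] = value

    def solve(self):
        # A real API would run the solver; here we fake a result so the
        # sweep below runs end to end.
        return 100.0 / self.params["wall_thickness"]

def thickness_sweep(path, thicknesses):
    """Run the same analysis repeatedly while varying one design parameter."""
    results = []
    for t in thicknesses:
        model = Model(path)
        model.set_parameter("wall_thickness", t)
        results.append((t, model.solve()))
    return results

if __name__ == "__main__":
    for t, stress in thickness_sweep("bracket.model", [1.0, 1.5, 2.0]):
        print(f"thickness {t} mm -> stress proxy {stress:.1f}")
```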
There are still other ways to speed solutions for engineers. For example, COMSOL has recently launched a no-fee networking site: COMSOL Community. As one would expect, the Web site offers standard online tools, including a bulletin board-style forum, blog, and technical papers submitted by users.
Because COMSOL’s software relies on materials models, the concept of an online “model exchange” is a sensible way to help engineers avoid duplicating work by retrieving models that work for them. Though it is by no means the only software company making such efforts to reach out to its users, the value in keeping designers from duplicating work or wasting time is apparent.
More importantly, COMSOL is also releasing its next-generation Multiphysics 4.0 product later this year, which will feature a new user interface that makes it easy for users at all levels of physics modeling expertise to build and run simulations.
Many scientific software companies now offer conferences and online resources, but direct support is hard to replace, says Sieminski. NEi’s Weinberg, he says, “worked as an engineer in aerospace and from this experience developed a keen sensitivity to the need for technical support that was both professional and timely.”
At Maple, Bernardin says the company’s efforts to help engineers by extending its expertise beyond the mathematical solver have paid off.
“I think we have done well so far. Simple things aren’t always easy to do. You don’t want to overwhelm the user with complexity up front, but you also don’t want customers to run into a wall when they want to do something complex,” he says.
As a result, consolidation has been the trend, and Kan believes this trend has been “massive” in the last three or four years. “But I don’t think there will wind up being one tool out there. The question is whether CAD and analysis will stay separate or meld together,” says Kan.
RESOURCES
COMSOL, Burlington, Mass., 781-273-3322, www.comsol.com
Maple, Waterloo, Ont., Canada, 519-747-2373, www.maplesoft.com
NEi Nastran, Westminster, Calif., 714-899-1220, www.nenastran.com
SIMULIA, Providence, R.I., 401-276-4400, www.simulia.com
Published in R&D Magazine, Vol. 51, No. 4, August 2009, pp. 8-12.