Analysts working on finite element models can spend a great deal of time obsessing over their meshes. If they use too many elements, a model may take a long time to run. But if they don’t use enough elements, solution accuracy may suffer.

Balance is essential. Your mesh must be refined enough to provide an accurate solution without being so large that it takes too long to run.

**Assessing a mesh**

Finite element preprocessors have come a long way over the years—to the point where users with minimal training can create meshes that appear good enough based on their element density and distribution. But how can one really know if a mesh is good enough for an analysis?

Quite simply, a mesh is good enough when it produces results with an acceptable level of accuracy, assuming all other inputs to the model are accurate. Mesh density is a significant metric used to control accuracy (element type and shape also affect accuracy). Assuming no singularities are present, a high-density mesh will produce results with high accuracy. However, if a mesh is too dense, the overall number of elements and resulting degrees of freedom will be high, requiring a large amount of computer memory and long run times. This can be a problem for linear analyses, and it is magnified in the multiple-iteration runs typical of nonlinear and transient analyses.

One of the ways to evaluate the quality of a mesh (and a model overall) is to compare results to test data or to theoretical values. Unfortunately, test data and theoretical results often aren’t available, so other methods are needed. These include mesh refinement and evaluating results discontinuities.

The most basic and accurate way to evaluate mesh quality is to refine the mesh until a critical result, such as the maximum stress in a specific location, converges: meaning that it doesn’t change significantly as the mesh is refined.

Figure 1 shows an example for a 2-D bracket model. In this case, the bracket is constrained at its top end and a shear load is applied to the edge on the lower right. This generates a peak stress in the fillet, as shown.

The figure shows that the peak stress in the fillet increases as the mesh density increases. Ultimately, increasing the mesh density further produces only minor increases in peak stress. In this case, an increase from 1,134 elements per unit area to 4,483 elements per unit area yields only a 1.5% increase in stress, suggesting the mesh is likely good enough at about 1,134 elements per unit area.
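This kind of convergence study can be automated with a simple stopping criterion. The sketch below (a hypothetical helper, with stress values assumed purely for illustration) checks whether the relative change in a critical result between successive mesh refinements falls below a tolerance:

```python
def is_converged(coarse_result: float, fine_result: float, tol: float = 0.02) -> bool:
    """Return True if refinement changed the critical result (e.g. peak fillet
    stress) by less than `tol`, relative to the finer-mesh value."""
    return abs(fine_result - coarse_result) / abs(fine_result) < tol

# Assumed peak stresses from two successive meshes: a 1.5% change is below
# a 2% tolerance, so further refinement is likely unnecessary.
print(is_converged(100.0, 101.5))  # True
```

In practice the loop would remesh, re-solve, and re-check until this test passes, which is exactly why the method becomes expensive for complex models.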

The problem with this method is it requires multiple remeshing and re-solving operations, which is fine for simple models, but can be time-consuming for complex models.

**Evaluating discontinuity**

Another option is to evaluate the magnitude of the discontinuity in critical results between adjacent elements. In most cases, the finite element method computes stresses directly at interior locations of the element (Gauss points) and extrapolates them to the nodes on the element boundaries.
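As a concrete illustration of that extrapolation step, consider a 1-D element with a two-point Gauss rule: stresses are known at the Gauss points (natural coordinates ξ = ∓1/√3) and extrapolated linearly to the end nodes (ξ = ∓1). The function below is a minimal sketch of that idea, not any particular solver's implementation:

```python
import math

def extrapolate_to_nodes(sigma_g1: float, sigma_g2: float) -> tuple:
    """Linearly extrapolate stresses from the two Gauss points of a 1-D
    element (at xi = -1/sqrt(3) and +1/sqrt(3)) to its end nodes (xi = -1, +1)."""
    s3 = math.sqrt(3.0)
    sigma_n1 = sigma_g1 * (1 + s3) / 2 + sigma_g2 * (1 - s3) / 2
    sigma_n2 = sigma_g2 * (1 + s3) / 2 + sigma_g1 * (1 - s3) / 2
    return sigma_n1, sigma_n2

# A uniform stress field extrapolates unchanged; a linearly varying field
# is recovered exactly at the nodes.
print(extrapolate_to_nodes(5.0, 5.0))  # (5.0, 5.0)
```

Because each element performs this extrapolation independently, two elements sharing a node generally report different stresses there, which is the source of the discontinuity discussed next.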

While it’s common to view these results as average values, the reality is each element calculates different results at shared nodes, leaving a discontinuity, as illustrated in Figure 2. The degree of discontinuity decreases with improving mesh quality, so this metric can be used to help determine if a finer mesh is needed. In many cases, a finer mesh can also improve element shapes.

Figure 3 shows unaveraged stresses in the fillet region of the bracket used in the earlier example. Stresses for both a relatively coarse mesh (left) and a relatively fine mesh (right) are shown. The percentage values listed in the figure indicate the relative stress discontinuities between adjacent elements at the surface of the fillet where the stresses are highest.

These values were calculated by taking the differences in the unaveraged stresses at the shared surface nodes and dividing them by the corresponding nodal averaged stresses. The finer mesh shown on the right exhibits much lower discontinuity values, indicating that this mesh is considerably more accurate than the coarser mesh on the left. The percentage difference also indicates the degree of potential error in the solution. While different finite element codes may provide a variety of other error measures, they are generally all based on these results discontinuities, so this is really the most fundamental means of estimating mesh quality outside of iterating on mesh density as described previously.
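Following the article's description, that metric can be sketched as a short function: take the spread in unaveraged elemental stresses at a shared node and divide by the nodal average. The stress values below are assumed purely for illustration:

```python
def relative_discontinuity(elemental_stresses: list) -> float:
    """Relative stress discontinuity at a node: the jump between the
    unaveraged elemental values divided by the nodal-averaged value."""
    avg = sum(elemental_stresses) / len(elemental_stresses)
    return (max(elemental_stresses) - min(elemental_stresses)) / avg

# Two elements report 95 and 105 at the same surface node:
# a jump of 10 over an average of 100 gives a 10% discontinuity.
print(relative_discontinuity([95.0, 105.0]))  # 0.1
```

A refined mesh would drive these percentages down at the critical nodes, mirroring the improvement seen between the coarse and fine meshes in Figure 3.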

It should be noted that large relative stress discontinuities in many regions of a model are not necessarily a cause for concern. In fact, it is quite common and perfectly acceptable to have high stress discontinuities in regions of a mesh as long as they are far from critical locations where accurate stresses are required. The “non-critical” regions can then contain a coarse mesh which helps to reduce the total element count in the model.

It’s important that the analyst determine if a high degree of accuracy is required in a given region and, if it is, to evaluate the quality of the mesh in that region. Mesh quality is critical to overall model accuracy and can ultimately mean the difference between predicting that a design will or will not fail.
