A new era dawns in finite element analysis: skeptics ask the experts, ‘How do you know?’
By Jack Thornton
Finite element analysis—the simulation and testing that computers have made possible—has migrated over the years from the rarified world of postdoctoral physics into everyday product development. Today FEA is deeply embedded in design processes and nearly ubiquitous in engineering analysis.
Capabilities and applications have multiplied, and at the same time have presented some significant challenges to those who rely on computer-assisted engineering. It is a challenge to find the best way to simulate an object’s design and the forces acting on it, amid time and budget limitations. Meanwhile, the developers of FEA software and computer hardware continually leapfrog each other introducing advanced products into the market.
According to its practitioners, FEA is useful wherever the risk of material failure or engineering error has serious consequences—any of the legal, regulatory and bottom-line ramifications of product failure. They also point out that credibility lies at the heart of every simulation effort, FEA or otherwise.
This analysis shows the displacements at the natural resonant vibration frequency of key structural components of a piezoelectric energy harvester.
Understanding FEA means a journey into stresses, strains, fatigue, materials science, and the myriad of mathematical ways their interrelationships are represented, including kinematics.
We asked the leading FEA software developers to put us in touch with engineering analysts dedicated to FEA. Many of these analysts have earned doctorates in engineering and some hold Professional Engineer licenses.
In a series of interviews, these independent consultants discussed key aspects of building FEA models to overcome engineering challenges, along with some of the issues that arise in their practice. One such issue is the unrealistic expectations of non-specialists in the field.
“It is unfortunate but too many engineering managers do not understand the vast complexity of the physical world that FEA addresses,” said Ted Diehl of Bodie Technology Inc. in Unionville, Pa. Diehl is a user of Abaqus from the Simulia unit of Dassault Systèmes. “Getting the right physics in the model is step one, and unfortunately not all analysts spend enough time and attention on this. Worst case, they compute models with inappropriate or overly simplified assumptions”— that is, with some of the physics missing.
“Engineering problems often come to us with an expectation of getting just a number from an analysis, or just verifying a number they already have,” Diehl said. “Unfortunately, a number is just a point and can’t tell you very much. Usually one or more mathematical curves are needed so that any single number is evaluated within some context. That’s part of the service we provide.
“Especially in nonlinear or transient analyses, looking at just a single number without context can greatly misrepresent what the analysis is telling you. We offer that understanding. It is essential to make sure that an initially blinkered approach doesn’t lead to greatly underestimating what FEA can do.”
Forces to Reckon With
By computerizing engineering analysis, FEA is designed to save money and time. Strictly speaking, it is a digital way to test designs against predictable forces. Nearly all FEA solutions determine whether a design will fail and, if so, when and how the material will deform, snap, or collapse. Some analyses needn’t go that far, experts point out. They simply verify that a given design, including its materials, components, and connections, will withstand all reasonably foreseeable forces.
“The increased need for FEA is being driven by budget and time constraints,” said Jeff Crompton, principal of AltaSim Technologies LLC in Columbus, Ohio. “Any mechanism that allows developers to reduce the cost and time for product or process development is being seized upon by engineering managers.”
At the same time, he noted, “FEA software developers have made it easier to use their software via user-friendly graphical user interfaces. The problem is that these GUIs allow inexperienced engineers to use FEA without knowing if they formulated the problem correctly. The result can, at best, be misleading but may also be absolutely wrong.”
Model validation is essential, Crompton said, “if you are to have confidence in the analysis. Extensive use is made of FEA to develop new technology.” Yet without good validation, “it is impossible to explore new avenues with any degree of confidence.” AltaSim uses Comsol Multiphysics from Comsol Inc. as well as Abaqus software.
FEA is a uniquely powerful tool for prototyping, reducing the traditional build-test-break cycle from months or even years of trial and error to weeks of digital calculations and validations. FEA does not eliminate the need for prototypes, but it can shorten the process. Often only one or two prototypes need to be built and tested before anything new goes into production.
Digital prototyping also allows designers to quickly dig into more design options. A dozen or more functional variants of most new designs are commonly examined with FEA models. Before FEA came into widespread use, there was rarely time or budget for more than one or two variants.
This fatigue-stress discovery simulation of a miniature pistol includes material plasticity and dynamic surface-to-surface contacts.
As a central part of engineering analysis, FEA also helps ensure against risky under-design and costly over-design. Many critical products are brought to market with FEA—surgical stents and oilfield connections, to name just two. They highlight the challenges that FEA experts tell us they study every day—extremely small dimensions in stents and extremely high pressures in connections.
“FEA solves engineering problems by breaking down the complexity of mostly geometry-related details (size, shape, material and forces) into manageable idealized pieces that look like Lego blocks,” said Terry Bender of Applycon in Hamel, Minn. “Then the theoretical or ideal aspects of engineering can be applied.
“The shortcomings in FEA usually arise in making the mental jumps between theory and the FEA model, between reality and virtual reality,” Bender added. He uses Algor software from Autodesk Inc.
According to Bender and several other experts, most new FEA users are design engineers. Though highly trained and well educated, many do not have a specialist’s familiarity with the finer points of nonlinear mechanics, fracture, fatigue, creep, yield, and phase transformations. These physical phenomena are central to FEA.
Completed analyses and post-processed results make for dramatic and colorful screen images, but those images are really only a graphical user interface for tens of millions and even hundreds of millions of calculations.
This means engineering managers and decision makers looking over modelers’ shoulders can go from uninformed to misinformed without realizing it. Incomplete or inaccurate models and analyses could mislead managers and decision makers unintentionally.
Instead of performing costly and time-consuming physical testing of an aluminum automobile wheel, impact tests per the SAE-J175 specification were conducted using highly nonlinear analysis.
Any model, whether built inside a computer, carved in a physical medium (wood, wax, metal, stone, etc.), or laid out on paper, is an abstraction from reality. “As abstractions, FEA answers questions about risk and what-ifs,” said Steve Cosgrove, vice president of technology and support at Acusim Software in Clifton Park, N.Y.
“If the fine points of the physics are ignored or are represented inaccurately, the model has no credibility, no matter how compelling its graphics may appear,” he said. He added that “physics-free simulations are not useless. They can be fine for reconstructing an event or demonstrating a complex process. But absent the correct physics, they are just dramatizations. That’s all they can ever be.”
Further, as FEA software becomes more powerful, it becomes more complicated. In turn, complexity always challenges the in-depth understanding that is fundamental to establishing credibility.
The Perennial Challenge
The biggest challenge in FEA is validation: carefully chosen and closely monitored physical tests that confirm whether physical reality and virtual reality line up. A consensus among FEA analysts is that validation ensures that there are no hidden disconnects between the model and the physical testing, that correct physical properties are used, and that properties are analyzed accurately based on correct principles of physics.
“It is a universal truth: the more dire the consequences of inaccurate FEA results, the more time, effort, and money should be spent validating that model,” said Carl Howarth, of Carl Howarth Associates in Bloomfield, Mich. “Simulations have become so incredibly comprehensive that it is no longer possible to establish confidence in these models by using simple engineering judgment or hand calculations.”
According to Howarth, “Most of my analyses are structural, which is arguably the easiest to validate. Half of my customers now insist on some type of model validation for an existing design prior to predicting the performance of designs in a competing material or process.”
This is relatively new, he said. It is driven by tight budgets and short timelines, of course, but also, as he put it, “by widespread prior experience with analyses that resulted in inaccurate predictions of stress or deflections.”
This quasi-static, large-deformation analysis of a generic coronary stent shows the maximum principal strain after simulated balloon expansion inside the coronary artery.
In metal replacement—converting metal components to engineered-plastic parts—“validation can be done by testing an existing component,” Howarth said. “The proposed analysis is then compared to the results of basic physical testing. If overall strength and deflections are predicted well in the test case, then credibility is established.” Howarth mainly uses FEMAP pre- and post-processing from Siemens PLM Software and the NEi Nastran solver from NEi Software Inc.
“Despite the complexity of FEA models, it will always remain necessary to provide some basis for believing the results that these models provide,” Howarth said. “If we, as engineers and scientists, begin to blindly accept results because validation is too difficult, then we have become scientific soothsayers whose predictions can only be proven or refuted over time.”
Validation is critical to earning the U.S. Food and Drug Administration’s regulatory approval for implantable devices, said Kenneth Perry, principal of EchoBio LLC in Bainbridge Island, Wash. Perry develops heart pumps and valves, devices implanted in the brain, and stents.
“Medical device designers strive to meet FDA and surgeons’ stringent demands,” Perry said. “A typical FDA requirement is to demonstrate 95 percent reliability of a device with a 99 percent confidence level. That usually means 400 million test cycles, and sometimes more, focused on fatigue life for the worst-case-loading condition. To complicate matters, it is not always clear which loading mode or combination of loading modes is going to be most critical to the safety or performance of the device.”
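The sample-size side of such a reliability demonstration can be sketched with the classic success-run theorem (zero failures allowed). This is a hedged illustration, not the FDA's actual procedure; the function name and the second example's numbers are our own, and the 400-million figure Perry cites refers to load cycles applied to each device, while the theorem below gives the number of devices that must all survive:

```python
import math

def min_samples_success_run(reliability, confidence):
    """Smallest number of units that must all pass a test to
    demonstrate `reliability` at the given `confidence` level
    (classic success-run theorem, zero failures allowed)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# 95 percent reliability demonstrated with 99 percent confidence:
n = min_samples_success_run(0.95, 0.99)
print(n)  # 90 units, each surviving the full fatigue test
```

Each of those 90 units would then be cycled for the full worst-case fatigue test, which is how the aggregate cycle counts grow so large.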
Validation challenges increase with the use of materials like nitinol, a nickel-titanium alloy with extraordinary shape-recovery properties, but too little good physical data. “To get a stent in place may require fully recoverable elastic deformation of 8 percent,” Perry said. That is far beyond what any other metal used inside the body can withstand.
“For validation, we do lots of testing on devices, but even more on coupon samples to establish material-limit data,” Perry said. “We then compare model predictions of fatigue life with experimental results for different loading conditions. MRI and CT medical imaging provide excellent data.” Perry does his work with Abaqus.
Validation has its limits. According to Gene Mannella, vice president of GB Tubulars LLC, Houston, “My challenge is that some things cannot be modeled. For example, electroplating applied for lubricity and sealability cannot be modeled, yet all box threads have some sort of surface finish.” GB Tubulars distributes oilfield casing from leading producers and designs the barrel-shaped connections that thread them together for use in oil and gas wells.
The two-dimensional axi-symmetric analysis of a full set of mating threads shows the compression and deflection as a length of oilfield casing (purple) and a connection or coupling (teal) are screwed together.
Oil and gas operations are characterized by “extreme service demands miles underground, a wide range of load combinations, a large array of materials and lubricants, and tough work environments at drill sites,” Mannella said. “Combined with variables in machining threads, such as thread dimension tolerances, and surface finishes, all this makes analyzing these seemingly simple components quite complex.”
According to Mannella, “We assume that connections, casing, and tubing are perfectly round, have uniform thickness, and that materials are homogeneous and isotropic. None of this exists in the real world. At this level of FEA, it is essential to have a clear and thorough understanding of how the parts actually interact.” For most of his failure analysis Mannella uses ANSYS Structural.
“I have seen designers build beautiful three-dimensional models to investigate tolerance stack-ups to assure parts will actually fit together,” he continued. “Even with the most precise modeling, machined parts that should fit still don’t, because of factors that cannot be modeled such as ovality and eccentricity. These occur in even the most precise machining operations.”
Despite the challenges, Mannella is an optimist. “Even in this imperfect world, better parts are made, more parts fit, and fewer mistakes are made—all of which increases productivity and profitability.”
‘E’ Is for ‘Element’
Models are built up from finite, discrete elements—anywhere from a few thousand to hundreds of millions. Analysts note they occasionally build models with just a handful of elements, but those are rare and usually more to demonstrate a concept than to solve a real-world problem.
There are hundreds of different elements, and together they cover most, perhaps all, possible calculations of mechanical properties found in the real world. They include lines, shells, 2-D planar solids, planes (stress or strain), shear panels and membranes, 2-D axi-symmetrics, 3-D surfaces and solids, plates, beams, triangles, wedges, tetrahedrons, and bricks (hexahedrons), mass and general stiffness matrices, plus many contact elements. There are even one-dimensional elements—springs for tension, for example.
This close-up view of a complex bolted-joint model shows nonlinear analysis, including friction, for some 100 bolts and washers.
Any given element has many flavors differentiated by nodes and degrees of freedom. Most simple geometric elements have up to three degrees of freedom per node—displacement, rotation, or both. Complex elements have as many as 27 nodes on corners, midpoints of sides, and even midpoints of surfaces. A node is any point on an element that transfers loads to or from an adjacent element or the model’s “outside world.”
In addition to the geometric shapes, hundreds of other elements have been created for specific problems—for pressure or bending, for example. Other elements represent plastics, rubber, ceramics, concrete, masonry, aerospace composites, and other materials.
Some of these elements are “idealizations” that embody a particular engineering concept. Some idealized elements have more than 100 nodes and as many as 11 degrees of freedom per node, analysts pointed out.
Meshing converts design geometry into FEA elements, those digital Lego blocks. Analysts and modelers have to balance the number, variety, and complexity of elements against time to solve. FEA is “computationally intense,” a CPU and disk-space hog, and the biggest variable is the mesh. In FEA, mesh is the network of elements linked at their nodes and with loads properly applied; it is there that the calculations are done.
The more dense the mesh, the more realistic and more accurate the solution will be. But even on the speediest hardware, very large FEA models may require days of nonstop number crunching. FEA calculates effects of loading at each node of every element with simultaneous differential equations, anywhere from thousands to millions of them.
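The assemble-and-solve pattern behind those calculations can be shown on the smallest possible case. The sketch below is a toy one-dimensional model with made-up stiffness and load values, hand-solved by Cramer's rule; a production FEA code applies the same pattern to millions of equations:

```python
def assemble_global_stiffness(stiffnesses):
    """Assemble the global stiffness matrix for a 1-D chain of
    spring/bar elements, each with element matrix k*[[1,-1],[-1,1]]."""
    n = len(stiffnesses) + 1  # number of nodes
    K = [[0.0] * n for _ in range(n)]
    for e, k in enumerate(stiffnesses):
        K[e][e] += k
        K[e][e + 1] -= k
        K[e + 1][e] -= k
        K[e + 1][e + 1] += k
    return K

# Two elements in series, node 0 fixed, load F on node 2 (illustrative values).
k1, k2, F = 2000.0, 1000.0, 10.0
K = assemble_global_stiffness([k1, k2])

# Apply the boundary condition u0 = 0, leaving the reduced 2x2 system
# [[k1+k2, -k2], [-k2, k2]] @ [u1, u2] = [0, F]; solve by Cramer's rule.
det = (k1 + k2) * k2 - k2 * k2            # determinant = k1*k2
u1 = (0.0 * k2 - (-k2) * F) / det         # = F/k1
u2 = ((k1 + k2) * F - (-k2) * 0.0) / det  # = F/k1 + F/k2
print(u1, u2)  # 0.005 0.015 -- springs in series, as a hand calculation confirms
```

The displacements match the textbook springs-in-series result, which is exactly the kind of hand check analysts use to sanity-test a small model before trusting a large one.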
Meshing is partly automated in many FEA packages, which saves a great deal of time in model building.
Boundaries and boundary conditions are exterior surfaces of objects that are modeled and the forces acting on them. Boundary conditions often get too little attention, according to Jim Richmond, president of Mechanical Design Engineering Consultants in Newbury Park, Calif. Part of ensuring the integrity of the model requires “comparing boundary reaction forces to the input forces and verifying them to be equal and opposite,” he said.
“Rigid boundary conditions are a common error in FEA modeling,” Richmond said. “In reality, rigid boundaries do not exist. And flexible boundaries should always be investigated for their impact on the analysis.” Richmond mostly uses Femap and NEi-Nastran.
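Richmond's reaction check can be demonstrated on the smallest possible model. This single-spring sketch (with illustrative values of our own choosing) verifies that the reaction recovered at the constrained node is equal and opposite to the input force:

```python
# One spring element, node 0 fixed, axial load F applied at node 1.
k, F = 5000.0, 25.0
K = [[k, -k], [-k, k]]   # element stiffness matrix, also the global matrix here
u = [0.0, F / k]         # displacement solution: u0 is fixed, u1 = F/k

# Reaction at the constrained node = row 0 of K times the displacement vector.
R0 = K[0][0] * u[0] + K[0][1] * u[1]
print(R0 + F)  # 0.0 -- boundary reaction balances the applied load exactly
```

If this sum is not (numerically) zero in a real model, some load has leaked out through a modeling mistake: a forgotten constraint, a mis-applied pressure, or an ill-conditioned solve.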
As in any other engineering analysis, FEA model builders often have to deal with unknowns such as exact loads on the exterior surfaces of modeled objects. The solution is straightforward: “Make sure that worst-case and best-case models are built,” said Bender at Applycon. “Then the analyst always errs on the side of safety.”
Sensitivity is the model’s responsiveness to changing conditions. “Sensitivity analysis is a good way to check loads, boundary conditions, and other parameters,” said George Laird, principal mechanical engineer at Predictive Engineering Inc. in Portland, Ore. He offered two examples:
If boundary constraints are slightly modified, does the model “jump” to a new state or adjust in small ways?
If a load is moved slightly, does the load path change?
“Sensitivity analysis represents the modeler’s internal documentation that due diligence has been done,” Laird said. “Every major modeling project should have a section on sensitivity as part of the validation of modeling assumptions.” Laird uses Femap, NX Nastran, and NX Simulation, all from Siemens, as well as LS-DYNA from Livermore Software Technology Corp.
Coefficients of friction “are tricky and rarely predictable,” Laird continued. “If friction is important in the problem, the structure should be modeled with both a frictionless interface (freely sliding surfaces) and with the interface frictionally locked. Engineering reality will be somewhere in between.” In other words, friction between moving parts in an FEA model is an approximation unless the parts’ surfaces are locked or slide freely. Crompton from AltaSim added that “defining values for coefficients of friction is more an art than a science.”
“It is the job of the analysts to determine if they can live with that,” Laird noted. “If the modeler uses sensitivity analysis to change certain variables and see how they affect the results, then friction is a star player in any assembly modeled with sliding and contacting.”
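A sensitivity pass of the kind Laird describes can be sketched as a simple parameter sweep. Everything in this example is an illustrative assumption: the two-spring surrogate model, the stiffness values, and the ±5 percent perturbation of the boundary support:

```python
def tip_displacement(k_support, k_member, F):
    """Tip displacement of two springs in series: a boundary support
    spring and a structural member spring, loaded by F at the tip."""
    return F / k_support + F / k_member

# Nominal case, then perturb the boundary stiffness by +/- 5 percent.
base = tip_displacement(1.0e4, 5.0e3, 100.0)
lo = tip_displacement(1.0e4 * 0.95, 5.0e3, 100.0)   # softer support
hi = tip_displacement(1.0e4 * 1.05, 5.0e3, 100.0)   # stiffer support
for d in (lo, base, hi):
    print(round(d, 6))
# The response varies smoothly and modestly with the perturbation,
# so this model is not "jumping" to a new state.
```

The same sweep run over a friction coefficient between its frictionless and locked bounds would bracket the engineering reality Laird describes.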
A 300-inch diameter processing vessel for nuclear waste was simulated as part of ASME design certification. The analysis, which shows the effect of seismic forces on the waste containment vessels, was done in support of the cleanup at the U.S. Department of Energy complex at Hanford, Wash.
Coupled analyses are a major FEA innovation for analyzing complex processes or phenomena such as fluid-structure interactions. “The flutter of an airplane wing or the oscillation of a structure during high winds are good examples, as are heat flux problems,” said Darrell W. Pepper, director of the Nevada Center for Advanced Computational Methods at the University of Nevada, Las Vegas. (Pepper’s unit is part of the National Supercomputing Center for Energy and the Environment, also at UNLV.)
This is multiphysics, and it addresses the fact that “most real world problems are multidisciplinary, meaning that multiple interactions occur simultaneously,” Pepper explained. A professor of mechanical engineering at the university, he uses Comsol.
Pepper said that, in the past, “we would freeze certain aspects of a problem, usually by locking in boundary conditions, and then run one aspect of the problem as a transient case with fixed geometries.” The real world does not, of course, work in such a one-thing-at-a-time fashion.
Coupled analyses and multiphysics present many leading-edge analytical challenges. These include multiple and varying time frames, steep gradients in the ways that meshes are refined, potentially incompatible boundary interfaces, and simulating transient and turbulent fluid flows, as well as incorporating additional FEA codes.
Acoustic analysis (top) shows how a loudspeaker enclosure and its placement in a room affect sound. The thermal stress in a turbine blade (middle) and the absolute displacements in an automotive constant-velocity joint (bottom) can also be simulated.
Multiphysics represents a dramatic advance in FEA. “The ability to analyze interdependent phenomena lets us address problems previously considered intractable,” said Crompton from AltaSim. “The complexity of the physics that can be addressed in a simulation has significantly expanded FEA’s potential. Multiphysics analyses of ‘real world’ problems are vital to the continued implementation of new technology.”
Multiphysics is an especially heavy user of computer resources. At UNLV, for example, Pepper runs Comsol on three powerful Dell Computer Corp. PCs. The Intel Corp. Pentium D, Core 2 Quad, and quad-core Core i7 processors have clock speeds of 2.5 to 2.8 gigahertz with 4 to 12 gigabytes of random access memory. The Core i7 and Core 2 machines use 64-bit Windows 7 and Windows Vista operating systems, respectively, from Microsoft Corp. The Pentium D machine uses 32-bit Windows XP.
Linear and Non
Nonlinear FEA includes elastic and plastic transformations; tension and compression; buckling; fixed and sliding contacts; fatigue; creep; large deflections and deformations; large strain; hyperelasticity, viscoelasticity, and viscoplasticity; and many others.
Most complicated engineering analyses use nonlinear FEA for these challenging problems. The forces become sufficiently great to transform the stiffness of the structure from a constant to a variable. The structure will deform, twist, buckle, or break, often suddenly, and that behavior is nonlinear.
Nonlinear FEA requires iterative calculations with sets of differential equations. They gobble computing resources but allow for more accurate and realistic physics to be simulated. Linear FEA needs no cumbersome iterations but is limited to small displacements, strains, and rotations, and to boundary conditions that are known and unchanging. The material responses are small enough to be linear.
In the real world, each of these assumptions is problematic. Getting them right depends heavily on the analysts, so their skills should also be validated and verified.
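The iterative calculation at the heart of a nonlinear solve can be illustrated on a single degree of freedom. This Newton-Raphson sketch, with made-up spring constants, shows the tangent stiffness changing with displacement, which is precisely what turns stiffness from a constant into a variable:

```python
def solve_hardening_spring(k, c, F, tol=1e-10, max_iter=50):
    """Newton-Raphson solution of k*u + c*u**3 = F, a one-DOF stand-in
    for the iterative equilibrium loop of a nonlinear FEA solver."""
    u = F / k  # linear solution as the starting guess
    for _ in range(max_iter):
        residual = k * u + c * u**3 - F   # out-of-balance force
        if abs(residual) < tol:
            break
        tangent = k + 3.0 * c * u**2      # tangent stiffness: a variable, not a constant
        u -= residual / tangent
    return u

u = solve_hardening_spring(k=1000.0, c=4.0e6, F=50.0)
# The hardening spring deflects less than the linear answer F/k = 0.05,
# and the converged solution satisfies equilibrium.
print(u < 0.05, abs(1000.0 * u + 4.0e6 * u**3 - 50.0) < 1e-8)  # True True
```

A linear solver would stop at the first guess; the nonlinear loop keeps driving the out-of-balance force toward zero, which is where the extra computing cost comes from.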
“Compared to the huge penalties of failure, which can quickly run into millions of dollars, spending $10,000 or $50,000 on FEA is almost always cost-effective,” said Bender at Applycon. In an imperfect world, “there are no exact answers, just bad ones, good ones, and better ones. Engineering is the art of approximation,” he said, which means for ensuring credibility, “There is no substitute for experience.”
Jack Thornton is the principal of Mindfeed Marcomm in Santa Fe, N.M., and a frequent contributor to Mechanical Engineering.
For .pdf materials relevant to this course, follow this link: https://1drv.ms/f/s!AgbbD-KyVrKGhdJG9j0-t9TtWdJvpw
Diary and notes on Postgraduate Systems Engineering.
SSG 805 Test:
Time: 10:00 to 12:00 Monday, June 4, 2018
Location: to be determined by Mrs. Folorunsho; look for her in the Department Office.
In this class, we will introduce the computations involved in graphics transformations by discussing a tensor called the vector cross. The required vector-analysis background is reviewed as a refresher for those who may have forgotten it. The summation convention is used, and a symbolic algebra tool such as Mathematica will make life a lot easier for you. Please download these two notes in the event that we have no power on Thursday.
The above discussion will be needed to go through the worked examples presented here:
We continue with some of the unfinished business of last week. It is important to get the definitions of the Kronecker delta and the Levi-Civita symbol correct before we move further. Today's lecture will take it from there to tensor algebra.
Note that this week's assignment is due on Monday at 10:00 a.m.