Secant Accelerated Newton Algorithm
Original Post - 18 Aug 2024 - Michael H. Scott
I do not consider myself an expert with numerical methods. I know just enough to be dangerous, and root-finding algorithms are one of the subjects where I pose a threat.
OpenSees uses root-finding algorithms like Newton-Raphson and Modified Newton to solve for the nodal response that satisfies equilibrium at every analysis time step. No one algorithm works for all analyses, and there’s always a tradeoff between convergence rate and computational cost. Under ideal conditions, Newton-Raphson is fast but expensive, while Modified Newton is slow but cheap. When conditions are less than ideal, either algorithm may fail to converge.
Accelerated Newton algorithms attempt to improve the slow convergence rate of Modified Newton without much added computational expense while also dealing with non-ideal conditions like an inconsistent tangent, e.g., due to lazy coding in Concrete23.
Many accelerated Newton algorithms have been developed, but only a couple have been implemented in OpenSees. While the Krylov-subspace accelerated Newton algorithm of Miller and Carlson is fairly well known among OpenSees users, the secant-based accelerated Newton algorithm of Crisfield receives little, if any, use.
Crisfield describes the secant-based acceleration as a “memoryless BFGS method,” and there are one-term, two-term, and three-term variants of the algorithm. Unlike BFGS, however, the secant-based acceleration does not update the stiffness matrix. Optional “cut-outs” can indicate whether a regular Modified Newton iteration is a better choice than accelerating the current iteration.
I won’t go into the details of the secant-based acceleration algorithm; instead, I will show the algorithm’s performance on a simple test problem. The following example does not use cut-outs, i.e., acceleration happens on every iteration. Also, the example only uses the one-term and two-term forms of the secant-based acceleration. I suspect the three-term variant has a bug, one that I haven’t bothered to track down.
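For context, each of the algorithms compared below can be selected with a one-line command. Here is a minimal OpenSeesPy sketch (the Tcl commands are analogous); I’m only showing the bare commands, so check the command documentation for optional arguments, e.g., which tangent to iterate on or how the variants of the acceleration are exposed.

```python
import openseespy.opensees as ops

# Choose one of the following before running the analysis.
# Each call replaces whatever algorithm is currently set.
ops.algorithm('Newton')          # full Newton-Raphson: tangent updated every iteration
ops.algorithm('ModifiedNewton')  # tangent formed once per step, then reused
ops.algorithm('KrylovNewton')    # Krylov-subspace accelerated Modified Newton
ops.algorithm('SecantNewton')    # Crisfield's secant-based accelerated Modified Newton
```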
The two DOF model shown below has three springs with bilinear force-deformation relationships. The nodal loads are applied in one step, leading to nodal displacements of U1=2 and U2=5. The convergence criterion is that the norm of the residual force vector decreases by four orders of magnitude.
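The spring arrangement and bilinear properties aren’t spelled out here, so take the following OpenSeesPy sketch as a stand-in rather than the actual model: hypothetical Steel01 springs in a ground–U1–U2–ground chain, a single load step, and a relative unbalance test that requires a four-order-of-magnitude drop in the residual norm. The build_model() and run() helpers are my own names, reused in later sketches.

```python
import openseespy.opensees as ops

def build_model():
    # Hypothetical layout: spring 1 (ground-U1), spring 2 (U1-U2), spring 3 (U2-ground).
    # Steel01 gives a bilinear force-deformation response; the numbers are placeholders,
    # not the properties behind the results reported in this post.
    ops.wipe()
    ops.model('basic', '-ndm', 1, '-ndf', 1)

    ops.node(1, 0.0)
    ops.node(2, 0.0)  # U1
    ops.node(3, 0.0)  # U2
    ops.node(4, 0.0)
    ops.fix(1, 1)
    ops.fix(4, 1)

    ops.uniaxialMaterial('Steel01', 1, 15.0, 10.0, 0.05)  # Fy, E, hardening ratio
    ops.uniaxialMaterial('Steel01', 2, 10.0,  5.0, 0.05)
    ops.uniaxialMaterial('Steel01', 3, 20.0,  8.0, 0.05)

    ops.element('zeroLength', 1, 1, 2, '-mat', 1, '-dir', 1)
    ops.element('zeroLength', 2, 2, 3, '-mat', 2, '-dir', 1)
    ops.element('zeroLength', 3, 3, 4, '-mat', 3, '-dir', 1)

    # Nodal loads, applied in full in a single step
    ops.timeSeries('Linear', 1)
    ops.pattern('Plain', 1, 1)
    ops.load(2, 30.0)
    ops.load(3, 40.0)

def run(alg):
    # Residual norm must drop four orders of magnitude relative to the first iteration
    ops.constraints('Plain')
    ops.numberer('Plain')
    ops.system('ProfileSPD')
    ops.test('RelativeNormUnbalance', 1.0e-4, 300)
    ops.algorithm(*alg)
    ops.integrator('LoadControl', 1.0)
    ops.analysis('Static')
    ok = ops.analyze(1)          # one load step
    return ok, ops.testIter()    # convergence flag and iteration count
```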
Consistent Tangent
Using a consistent tangent, which is easy to obtain for this model, the Newton-Raphson algorithm converges in three iterations with the progression of nodal displacements and residual forces shown below.
The Modified Newton algorithm converges in 140 iterations, slowly marching toward the solution.
The one-term secant-based accelerated Newton algorithm converges in five iterations, following more or less the same path as the Newton-Raphson and Modified Newton algorithms.
Going to the two-term secant-based acceleration requires four iterations for this analysis.
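To tabulate a consistent-tangent comparison yourself, you could loop over the algorithms using the build_model() and run() helpers from the earlier sketch. With the placeholder spring properties the iteration counts won’t match the ones quoted above, but the relative trends should be similar.

```python
# Assumes build_model() and run() from the earlier sketch are defined.
for alg in [('Newton',), ('ModifiedNewton',),
            ('KrylovNewton',), ('SecantNewton',)]:
    build_model()
    ok, niter = run(alg)
    status = 'converged' if ok == 0 else 'FAILED'
    print(f'{alg[0]:16s} {status} after {niter} iterations')
```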
Inconsistent Tangent
Adding artificial stiffness between springs 2 and 3 to mimic an inconsistent tangent, as was done in Scott and Fenves (2010) for this same example, the Newton-Raphson algorithm fails to converge. The iterations flip-flop between two states near the solution.
The Modified Newton algorithm is able to find equilibrium after 143 iterations, taking a slightly skewed course compared to its travels with the consistent tangent.
Like the Krylov-subspace accelerated Newton algorithm, the secant-based accelerated algorithm is able to find equilibrium fairly quickly despite the inconsistent tangent. With the one-term acceleration, the search for equilibrium takes ten iterations.
The two-term secant-based acceleration also takes ten iterations, following a slightly different path from the one-term acceleration shown above. Also, more of those ten iterations are near the equilibrium solution.
Summary
As with everything in nonlinear finite element analysis, there is no silver bullet. You cannot declare winners and losers from the results of one simple test problem, especially when the competitors are root-finding algorithms. But give the secant-based acceleration a try, then add the algorithm to the list of options in your smart analyze function.
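If you do, a smart analyze function can be as simple as cycling through a list of algorithms until one of them gets the step to converge. Here is a rough sketch under the same OpenSeesPy assumptions as the earlier snippets; the ordering and the choice of fallbacks are just one option among many.

```python
import openseespy.opensees as ops

def smart_analyze(nsteps):
    """Attempt each load step with a sequence of fallback algorithms.
    A sketch only; tune the list and ordering to your own problems."""
    fallbacks = ['Newton', 'KrylovNewton', 'SecantNewton', 'ModifiedNewton']
    for step in range(nsteps):
        for alg in fallbacks:
            ops.algorithm(alg)
            if ops.analyze(1) == 0:
                break  # step converged, move on to the next step
        else:
            print(f'Step {step + 1} failed with every algorithm')
            return -1
    return 0
```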