A global optimization algorithm, αBB, for twice-differentiable NLPs is presented. It operates within a branch-and-bound framework and requires the construction of a convex lower bounding problem. A technique for generating such a valid convex underestimator for arbitrary twice-differentiable functions is described. αBB has been applied to a variety of problems, and a summary of the results obtained is provided.
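The convex underestimator at the heart of αBB is commonly written as L(x) = f(x) + Σᵢ αᵢ(xᵢᴸ − xᵢ)(xᵢᵁ − xᵢ), which is convex on the box [xᴸ, xᵁ] whenever each αᵢ is at least max(0, −½ λ_min(∇²f(x))) over the box. The Python sketch below illustrates this construction; the function names (alpha_bb_underestimator, alpha_from_hessian_samples) and the sampling-based estimate of α are illustrative assumptions, not the paper's method: the published approach derives rigorous α values from bounds on the Hessian eigenvalues rather than from sampling.

```python
import numpy as np

def alpha_bb_underestimator(f, x_lo, x_hi, alpha):
    """Build the aBB underestimator L(x) = f(x) + sum_i alpha_i (x_lo_i - x_i)(x_hi_i - x_i).

    The added term is nonpositive on the box, so L(x) <= f(x) there, and it
    contributes 2*alpha_i to the i-th diagonal of the Hessian, convexifying L
    for sufficiently large alpha.
    """
    x_lo, x_hi = np.asarray(x_lo, float), np.asarray(x_hi, float)

    def L(x):
        x = np.asarray(x, float)
        return f(x) + np.sum(alpha * (x_lo - x) * (x_hi - x))

    return L

def alpha_from_hessian_samples(hess, x_lo, x_hi, n_samples=1000, seed=None):
    """Heuristic alpha >= max(0, -0.5 * lambda_min(H(x))), estimated by sampling.

    NOTE: sampling is a sketch-level stand-in; a rigorous alpha (as in the
    paper) must bound the minimum Hessian eigenvalue over the *entire* box.
    """
    rng = np.random.default_rng(seed)
    worst = 0.0
    for _ in range(n_samples):
        x = rng.uniform(x_lo, x_hi)
        lam_min = np.linalg.eigvalsh(hess(x))[0]  # eigenvalues in ascending order
        worst = max(worst, -0.5 * lam_min)
    return worst

# Example (hypothetical): underestimate f(x) = sin(x) on [0, 2*pi].
f = lambda x: np.sin(x[0])
hess = lambda x: np.array([[-np.sin(x[0])]])
x_lo, x_hi = np.array([0.0]), np.array([2 * np.pi])
alpha = alpha_from_hessian_samples(hess, x_lo, x_hi, seed=0)  # ~0.5, since lambda_min >= -1
L = alpha_bb_underestimator(f, x_lo, x_hi, alpha)             # convex, with L(x) <= f(x) on the box
```

Minimizing L over the box yields the lower bound used at each node of the branch-and-bound tree; as the box is subdivided, the underestimator tightens and the bound converges toward the true global minimum.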
Original language: English (US)
Journal: Computers and Chemical Engineering
State: Published - Jan 1 1996
All Science Journal Classification (ASJC) codes
- Chemical Engineering (all)
- Computer Science Applications