Abstract
A global optimization algorithm, αBB, for twice-differentiable NLPs is presented. It operates within a branch-and-bound framework and requires the construction of a convex lower bounding problem. A technique for generating such a valid convex underestimator for arbitrary twice-differentiable functions is described. αBB has been applied to a variety of problems, and a summary of the results obtained is provided.
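The abstract does not spell out the underestimator, but the standard αBB construction adds a quadratic perturbation α(x^L − x)(x^U − x) to each variable, with α large enough to offset the most negative Hessian eigenvalue over the box. A minimal one-dimensional sketch (the example function, bounds, and α value are illustrative, not taken from the paper):

```python
import math

def alpha_bb_underestimator(f, alpha, x_lo, x_hi):
    """Return the alphaBB-style convex underestimator
        L(x) = f(x) + alpha * (x_lo - x) * (x_hi - x).
    The added term is nonpositive on [x_lo, x_hi] (so L <= f there)
    and vanishes at the bounds; L is convex whenever
    alpha >= max(0, -min f''(x) / 2) over the interval."""
    def L(x):
        return f(x) + alpha * (x_lo - x) * (x_hi - x)
    return L

# Example: f(x) = sin(x) on [0, 2*pi].
# f''(x) = -sin(x) >= -1, so alpha = 0.5 suffices to convexify.
f = math.sin
x_lo, x_hi = 0.0, 2.0 * math.pi
L = alpha_bb_underestimator(f, 0.5, x_lo, x_hi)

# Sanity check: L never exceeds f on the box and matches f at the bounds.
for i in range(101):
    x = x_lo + (x_hi - x_lo) * i / 100
    assert L(x) <= f(x) + 1e-12
```

In the branch-and-bound framework, minimizing L over each subregion yields the convex lower bounding problem; as the box shrinks, the perturbation term tightens toward f.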
Original language | English (US) |
---|---|
Pages (from-to) | S419-S424 |
Journal | Computers and Chemical Engineering |
Volume | 20 |
Issue number | SUPPL.1 |
DOIs | |
State | Published - 1996 |
All Science Journal Classification (ASJC) codes
- General Chemical Engineering
- Computer Science Applications