
Conjugate Gradient Algorithms in Nonconvex Optimization 2009 Edition
Contributor(s): Pytlak, Radoslaw (Author)
ISBN: 3540856331     ISBN-13: 9783540856337
Publisher: Springer
OUR PRICE:   $161.49  
Product Type: Hardcover - Other Formats
Published: December 2008
Temporarily out of stock - Will ship within 2 to 5 weeks
Annotation: Conjugate direction methods were proposed in the early 1950s. With the development of powerful computers, attempts were made to lay foundations for the mathematical aspects of computations that could take advantage of these machines. This monograph gives an overview of the standard conjugate gradient algorithms, but it goes beyond a treatment of these techniques and can therefore be regarded as an extension of the methods originally proposed. The book pays particular attention to preconditioned versions of the method, and since limited memory quasi-Newton algorithms are preconditioned conjugate gradient algorithms when applied to quadratics, these variable metric techniques are also discussed.
Additional Information
BISAC Categories:
- Mathematics | Linear & Nonlinear Programming
- Technology & Engineering | Industrial Engineering
- Business & Economics | Operations Research
Dewey: 519.76
Series: Nonconvex Optimization and Its Applications
Physical Information: 1.2" H x 6.5" W x 9.4" (1.95 lbs) 478 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
Conjugate direction methods were proposed in the early 1950s. When high speed digital computing machines were developed, attempts were made to lay the foundations for the mathematical aspects of computations which could take advantage of the efficiency of digital computers. The National Bureau of Standards sponsored the Institute for Numerical Analysis, which was established at the University of California in Los Angeles. A seminar held there on numerical methods for linear equations was attended by Magnus Hestenes, Eduard Stiefel and Cornelius Lanczos. This led to the first communication between Lanczos and Hestenes (researchers of the NBS) and Stiefel (of the ETH in Zurich) on the conjugate direction algorithm. The method is attributed to Hestenes and Stiefel, who published their joint paper in 1952 [101], in which they presented both the method of conjugate gradient and the conjugate direction methods, including conjugate Gram-Schmidt processes. A closely related algorithm was proposed by Lanczos [114], who worked on algorithms for determining eigenvalues of a matrix. His iterative algorithm yields the similarity transformation of a matrix into tridiagonal form, from which eigenvalues can be well approximated. The three-term recurrence relation of the Lanczos procedure can be obtained by eliminating a vector from the conjugate direction algorithm scheme. Initially the conjugate gradient algorithm was called the Hestenes-Stiefel-Lanczos method [86].
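For readers unfamiliar with the method the book builds on, the following is a minimal sketch (our own illustration, not taken from the book) of the Hestenes-Stiefel conjugate gradient iteration for a symmetric positive definite linear system A x = b, the setting in which the method was originally proposed; the function and variable names here are assumptions made for the example.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Illustrative conjugate gradient solver for A x = b,
    with A symmetric positive definite (not the book's code)."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x            # residual = negative gradient of the quadratic
    p = r.copy()             # first search direction
    rs_old = r @ r
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # next A-conjugate direction
        rs_old = rs_new
    return x

# Example: solve a small SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))
```

In exact arithmetic the iteration terminates in at most n steps, since successive search directions are conjugate with respect to A; the preconditioned and nonconvex extensions discussed in the book generalize this basic scheme.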