Package org.hipparchus.optim

Generally, optimizers are algorithms that minimize or maximize a scalar function, called the objective function.
For some scalar objective functions the gradient can be computed (analytically or numerically). Algorithms that use this knowledge are defined in the org.hipparchus.optim.nonlinear.scalar.gradient package. The algorithms that do not need this additional information are located in the org.hipparchus.optim.nonlinear.scalar.noderiv package.
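As an illustration, here is a minimal sketch, not taken from the package documentation, that minimizes the same simple quadratic with a derivative-free optimizer from the noderiv package and with a gradient-based optimizer from the gradient package. The objective function, tolerances and update formula used here are illustrative assumptions, not recommended settings.

    import org.hipparchus.analysis.MultivariateFunction;
    import org.hipparchus.analysis.MultivariateVectorFunction;
    import org.hipparchus.optim.InitialGuess;
    import org.hipparchus.optim.MaxEval;
    import org.hipparchus.optim.PointValuePair;
    import org.hipparchus.optim.SimpleValueChecker;
    import org.hipparchus.optim.nonlinear.scalar.GoalType;
    import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunction;
    import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunctionGradient;
    import org.hipparchus.optim.nonlinear.scalar.gradient.NonLinearConjugateGradientOptimizer;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

    public class ScalarOptimizationSketch {
        public static void main(String[] args) {
            // Illustrative objective: f(x, y) = (x - 1)^2 + (y - 2)^2, minimum at (1, 2).
            MultivariateFunction f = p -> {
                double dx = p[0] - 1;
                double dy = p[1] - 2;
                return dx * dx + dy * dy;
            };

            // Derivative-free: Nelder-Mead simplex from the ...noderiv package.
            SimplexOptimizer simplex = new SimplexOptimizer(1e-10, 1e-12);
            PointValuePair noDerivOptimum = simplex.optimize(
                    new MaxEval(1000),
                    new ObjectiveFunction(f),
                    GoalType.MINIMIZE,
                    new InitialGuess(new double[] { 0, 0 }),
                    new NelderMeadSimplex(2));

            // Gradient-based: conjugate gradient from the ...gradient package,
            // using the analytical gradient (2(x - 1), 2(y - 2)).
            MultivariateVectorFunction gradient = p -> new double[] {
                    2 * (p[0] - 1), 2 * (p[1] - 2) };
            NonLinearConjugateGradientOptimizer cg =
                    new NonLinearConjugateGradientOptimizer(
                            NonLinearConjugateGradientOptimizer.Formula.POLAK_RIBIERE,
                            new SimpleValueChecker(1e-10, 1e-12));
            PointValuePair gradientOptimum = cg.optimize(
                    new MaxEval(1000),
                    new ObjectiveFunction(f),
                    new ObjectiveFunctionGradient(gradient),
                    GoalType.MINIMIZE,
                    new InitialGuess(new double[] { 0, 0 }));

            System.out.println("no-deriv : " + noDerivOptimum.getValue());
            System.out.println("gradient : " + gradientOptimum.getValue());
        }
    }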

Some problems are solved more efficiently by algorithms that, instead of an objective function, need access to all the observations. Such methods are implemented in the fitting module.

This package provides common functionality for the optimization algorithms. The abstract classes (BaseOptimizer and BaseMultivariateOptimizer) contain boilerplate code for storing the evaluation and iteration counters and a user-defined convergence checker.
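For instance, a user-defined ConvergenceChecker can be passed to a concrete optimizer, and the counters maintained by the base classes can be queried after the run. This is a sketch with an illustrative stopping criterion, not one prescribed by the package.

    import org.hipparchus.analysis.MultivariateFunction;
    import org.hipparchus.optim.ConvergenceChecker;
    import org.hipparchus.optim.InitialGuess;
    import org.hipparchus.optim.MaxEval;
    import org.hipparchus.optim.PointValuePair;
    import org.hipparchus.optim.nonlinear.scalar.GoalType;
    import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunction;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

    public class ConvergenceCheckerSketch {
        public static void main(String[] args) {
            MultivariateFunction f = p -> p[0] * p[0] + p[1] * p[1];

            // User-defined convergence checker: declare convergence when the
            // objective value changes by less than 1e-12 between iterations.
            ConvergenceChecker<PointValuePair> checker =
                    (iteration, previous, current) ->
                            Math.abs(previous.getValue() - current.getValue()) < 1e-12;

            SimplexOptimizer optimizer = new SimplexOptimizer(checker);
            optimizer.optimize(
                    new MaxEval(1000), // evaluation budget enforced by the base class
                    new ObjectiveFunction(f),
                    GoalType.MINIMIZE,
                    new InitialGuess(new double[] { 3, -2 }),
                    new NelderMeadSimplex(2));

            // Counters maintained by the abstract base classes.
            System.out.println("evaluations: " + optimizer.getEvaluations());
            System.out.println("iterations : " + optimizer.getIterations());
        }
    }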

For each of the optimizer types, there is a special implementation that wraps an optimizer instance and provides a "multi-start" feature: it calls the underlying optimizer several times with different starting points and returns the best optimum found, or all optima if so desired. This can help avoid being trapped in a local extremum.
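A possible use is sketched below, assuming MultiStartMultivariateOptimizer is the scalar wrapper meant here; the multimodal objective, the number of starts and the random start-point generator are illustrative choices.

    import java.util.Random;

    import org.hipparchus.analysis.MultivariateFunction;
    import org.hipparchus.optim.InitialGuess;
    import org.hipparchus.optim.MaxEval;
    import org.hipparchus.optim.PointValuePair;
    import org.hipparchus.optim.nonlinear.scalar.GoalType;
    import org.hipparchus.optim.nonlinear.scalar.MultiStartMultivariateOptimizer;
    import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunction;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
    import org.hipparchus.optim.nonlinear.scalar.noderiv.SimplexOptimizer;
    import org.hipparchus.random.RandomVectorGenerator;

    public class MultiStartSketch {
        public static void main(String[] args) {
            // Multimodal objective with two local minima near (-1, 0) and (+1, 0).
            MultivariateFunction f = p -> {
                double x = p[0];
                double y = p[1];
                return Math.pow(x * x - 1, 2) + y * y + 0.1 * x;
            };

            // Generator producing random start points in [-3, 3] x [-3, 3].
            Random rng = new Random(42L);
            RandomVectorGenerator starts = () -> new double[] {
                    -3 + 6 * rng.nextDouble(), -3 + 6 * rng.nextDouble() };

            // Wrap a derivative-free optimizer with 20 random restarts.
            SimplexOptimizer underlying = new SimplexOptimizer(1e-10, 1e-12);
            MultiStartMultivariateOptimizer optimizer =
                    new MultiStartMultivariateOptimizer(underlying, 20, starts);

            PointValuePair best = optimizer.optimize(
                    new MaxEval(20000), // total budget shared by all restarts
                    new ObjectiveFunction(f),
                    GoalType.MINIMIZE,
                    new InitialGuess(new double[] { 2, 2 }),
                    new NelderMeadSimplex(2));

            System.out.println("best value: " + best.getValue());
            // getOptima() returns all the optima found, best first.
            for (PointValuePair optimum : optimizer.getOptima()) {
                System.out.println("  local optimum value: " + optimum.getValue());
            }
        }
    }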
