| Interface | Description | 
|---|---|
| ConvergenceChecker<P> | This interface specifies how to check if an optimization algorithm has converged. |
| OptimizationData | Marker interface. | 
| OptimizationProblem<P> | Common settings for all optimization problems. | 

| Class | Description |
|---|---|
| AbstractConvergenceChecker<P> | Base class for all convergence checker implementations. | 
| AbstractOptimizationProblem<P> | Base class for implementing optimization problems. | 
| BaseMultiStartMultivariateOptimizer<P> | Base class for multi-start optimizers of a multivariate function. |
| BaseMultivariateOptimizer<P> | Base class for implementing optimizers for multivariate functions. | 
| BaseOptimizer<P> | Base class for implementing optimizers. | 
| InitialGuess | Starting point (first guess) of the optimization procedure. | 
| MaxEval | Maximum number of evaluations of the function to be optimized. | 
| MaxIter | Maximum number of iterations performed by an (iterative) algorithm. | 
| PointValuePair | This class holds a point and the value of an objective function at that point. |
| PointVectorValuePair | This class holds a point and the vectorial value of an objective function at that point. |
| SimpleBounds | Simple optimization constraints: lower and upper bounds. | 
| SimplePointChecker<P extends Pair<double[],? extends Object>> | Simple implementation of the ConvergenceChecker interface using only point coordinates. |
| SimpleValueChecker | Simple implementation of the ConvergenceChecker interface using only objective function values. |
| SimpleVectorValueChecker | Simple implementation of the ConvergenceChecker interface using only objective function values. |

| Enum | Description |
|---|---|
| LocalizedOptimFormats | Enumeration for localized message formats used in exception messages. |
Generally, optimizers are algorithms that will either minimize or maximize a scalar function, called the objective function.
  
For some scalar objective functions the gradient can be computed (analytically or numerically). Algorithms that use this knowledge are defined in the org.hipparchus.optim.nonlinear.scalar.gradient package. The algorithms that do not need this additional information are located in the org.hipparchus.optim.nonlinear.scalar.noderiv package.
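A minimal sketch of how these pieces fit together, assuming the derivative-free SimplexOptimizer and NelderMeadSimplex from the noderiv package together with ObjectiveFunction and GoalType from org.hipparchus.optim.nonlinear.scalar; the paraboloid objective, the convergence thresholds and the class name are illustrative choices, not part of the library:

```java
import java.util.Arrays;

import org.hipparchus.analysis.MultivariateFunction;
import org.hipparchus.optim.InitialGuess;
import org.hipparchus.optim.MaxEval;
import org.hipparchus.optim.PointValuePair;
import org.hipparchus.optim.nonlinear.scalar.GoalType;
import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunction;
import org.hipparchus.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.hipparchus.optim.nonlinear.scalar.noderiv.SimplexOptimizer;

public class NoDerivExample {
    public static void main(String[] args) {
        // Objective function: a simple paraboloid with its minimum at (3, -1).
        MultivariateFunction paraboloid =
            point -> (point[0] - 3) * (point[0] - 3) + (point[1] + 1) * (point[1] + 1);

        // Derivative-free optimizer from the ...noderiv package.
        SimplexOptimizer optimizer = new SimplexOptimizer(1e-10, 1e-30);

        // The common inputs (MaxEval, InitialGuess, ...) are OptimizationData
        // implementations defined in this package.
        PointValuePair optimum = optimizer.optimize(
            new MaxEval(1000),
            new ObjectiveFunction(paraboloid),
            GoalType.MINIMIZE,
            new InitialGuess(new double[] { 0, 0 }),
            new NelderMeadSimplex(2));

        System.out.println("minimum at " + Arrays.toString(optimum.getPoint())
                           + ", value = " + optimum.getValue());
    }
}
```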
 
Some problems are solved more efficiently by algorithms that, instead of an objective function, need access to all the observations. Such methods are implemented in the fitting module.
This package provides common functionality for the optimization algorithms. Abstract classes (BaseOptimizer and BaseMultivariateOptimizer) contain boilerplate code for storing evaluation and iteration counters and a user-defined convergence checker.
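For illustration, here is a hypothetical checker that implements ConvergenceChecker<PointValuePair> directly, declaring convergence once the objective value changes by less than a fixed absolute tolerance between iterations (the class name and the tolerance are made up for this sketch):

```java
import org.hipparchus.optim.ConvergenceChecker;
import org.hipparchus.optim.PointValuePair;
import org.hipparchus.util.FastMath;

/** Illustrative checker based only on the change of the objective value. */
public class AbsoluteValueChecker implements ConvergenceChecker<PointValuePair> {

    private final double tolerance;

    public AbsoluteValueChecker(double tolerance) {
        this.tolerance = tolerance;
    }

    @Override
    public boolean converged(int iteration, PointValuePair previous, PointValuePair current) {
        // Converged when the objective value has stopped moving by more than the tolerance.
        return FastMath.abs(current.getValue() - previous.getValue()) < tolerance;
    }
}
```

Such an instance can then be handed to any optimizer constructor that accepts a ConvergenceChecker<PointValuePair>.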
 
For each of the optimizer types, there is a special implementation that wraps an optimizer instance and provides a "multi-start" feature: it calls the underlying optimizer several times with different starting points and returns the best optimum found, or all optima if so desired. This could be useful to avoid being trapped in a local extremum.
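A sketch of that multi-start pattern, assuming the MultiStartMultivariateOptimizer wrapper from org.hipparchus.optim.nonlinear.scalar and the RandomVectorGenerator interface from the random module; the multi-modal objective, the search box for the random starting points and the restart count are arbitrary choices for the example:

```java
import java.util.Random;

import org.hipparchus.analysis.MultivariateFunction;
import org.hipparchus.optim.InitialGuess;
import org.hipparchus.optim.MaxEval;
import org.hipparchus.optim.PointValuePair;
import org.hipparchus.optim.nonlinear.scalar.GoalType;
import org.hipparchus.optim.nonlinear.scalar.MultiStartMultivariateOptimizer;
import org.hipparchus.optim.nonlinear.scalar.ObjectiveFunction;
import org.hipparchus.optim.nonlinear.scalar.noderiv.NelderMeadSimplex;
import org.hipparchus.optim.nonlinear.scalar.noderiv.SimplexOptimizer;
import org.hipparchus.random.RandomVectorGenerator;
import org.hipparchus.util.FastMath;

public class MultiStartExample {
    public static void main(String[] args) {
        // A multi-modal objective with several local minima.
        MultivariateFunction f =
            p -> FastMath.sin(3 * p[0]) + FastMath.sin(3 * p[1]) + 0.1 * (p[0] * p[0] + p[1] * p[1]);

        // Generator of random starting points inside [-5, 5] x [-5, 5].
        Random rng = new Random(42L);
        RandomVectorGenerator starts =
            () -> new double[] { -5 + 10 * rng.nextDouble(), -5 + 10 * rng.nextDouble() };

        // Wrap a plain optimizer; it is restarted 20 times from random points.
        MultiStartMultivariateOptimizer optimizer =
            new MultiStartMultivariateOptimizer(new SimplexOptimizer(1e-10, 1e-30), 20, starts);

        PointValuePair best = optimizer.optimize(new MaxEval(20000),
                                                 new ObjectiveFunction(f),
                                                 GoalType.MINIMIZE,
                                                 new InitialGuess(new double[] { 0, 0 }),
                                                 new NelderMeadSimplex(2));

        System.out.println("best value: " + best.getValue());
        System.out.println("optima found: " + optimizer.getOptima().length);
    }
}
```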