OptParams
OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.
Configurations:
- useStochastic=false, useL1=false: LBFGS with L2 regularization
- useStochastic=false, useL1=true: OWLQN with L1 regularization
- useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
- useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
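A short sketch of selecting each routine via these flags, leaving the remaining parameters at their defaults. The import path `breeze.optimize.FirstOrderMinimizer.OptParams` is an assumption about where the class lives:

```scala
import breeze.optimize.FirstOrderMinimizer.OptParams

// Each combination of useStochastic and useL1 selects a different optimizer.
val lbfgsL2 = OptParams(useStochastic = false, useL1 = false) // LBFGS + L2
val owlqnL1 = OptParams(useStochastic = false, useL1 = true)  // OWLQN + L1
val sgdL2   = OptParams(useStochastic = true,  useL1 = false) // AdaptiveGradientDescent + L2
val sgdL1   = OptParams(useStochastic = true,  useL1 = true)  // AdaptiveGradientDescent + L1
```

Because OptParams is a case class, these instances can also be built by copying an existing configuration, e.g. `lbfgsL2.copy(useL1 = true)`.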
Value parameters:
- alpha
  learning rate; applies only to SGD.
- batchSize
  size of the batches to use when useStochastic is true and a BatchDiffFunction is given.
- maxIterations
  maximum number of iterations to run.
- regularization
  regularization constant to use.
- tolerance
  convergence tolerance, based on both the average improvement and the norm of the gradient.
- useL1
  if true, use L1 regularization; otherwise, use L2.
- useStochastic
  if false, use LBFGS or OWLQN. If true, use a variant of Stochastic Gradient Descent.
Value members
Concrete methods
def iterations[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State]
def iterations[T](f: StochasticDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State[T, _, _]]
def iterations[T, K](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, K, Double]): Iterator[State]
def minimize[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): T
def minimize[T](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, _, Double]): T
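A hedged sketch of calling minimize on a simple quadratic DiffFunction over DenseVector; the import paths and the availability of named parameters like maxIterations are assumptions, not confirmed by the listing above:

```scala
import breeze.linalg.DenseVector
import breeze.optimize.DiffFunction
import breeze.optimize.FirstOrderMinimizer.OptParams

// f(x) = ||x - 3||^2, with gradient 2 * (x - 3); minimum at x = (3, 3, 3).
val f = new DiffFunction[DenseVector[Double]] {
  def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
    val diff = x - 3.0
    (diff dot diff, diff * 2.0)
  }
}

val params = OptParams(maxIterations = 100, useStochastic = false, useL1 = false)
val xMin = params.minimize(f, DenseVector.zeros[Double](3))
// xMin should converge near DenseVector(3.0, 3.0, 3.0)
```

The iterations variants return an Iterator over optimizer states instead of the final point, which is useful for inspecting convergence step by step before committing to a result.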