Package | Description
---|---
org.bytedeco.caffe |
Modifier and Type | Class and Description
---|---
class | FloatAdaDeltaSolver
class | FloatAdaGradSolver
class | FloatAdamSolver: an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments.
class | FloatNesterovSolver
class | FloatRMSPropSolver
class | FloatSGDSolver: optimizes the parameters of a Net using stochastic gradient descent (SGD) with momentum.
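FloatAdamSolver's description mentions "adaptive estimates of lower-order moments": Adam keeps exponentially decaying estimates of the gradient's first moment (mean) and second moment (uncentered variance), bias-corrects them, and scales each step accordingly. The following is a minimal self-contained sketch of that update rule on a toy one-dimensional objective; the class and variable names are illustrative, not the Caffe API.

```java
// Sketch of the Adam update rule (not the actual FloatAdamSolver code):
// minimize f(w) = w^2 with moment estimates m (first) and v (second).
public class AdamSketch {
    public static void main(String[] args) {
        double lr = 0.1, beta1 = 0.9, beta2 = 0.999, eps = 1e-8;
        double w = 5.0;          // single parameter to optimize
        double m = 0.0, v = 0.0; // moment estimates, start at zero
        for (int t = 1; t <= 200; t++) {
            double g = 2.0 * w;                   // gradient of w^2
            m = beta1 * m + (1 - beta1) * g;      // first-moment estimate
            v = beta2 * v + (1 - beta2) * g * g;  // second-moment estimate
            double mHat = m / (1 - Math.pow(beta1, t)); // bias correction
            double vHat = v / (1 - Math.pow(beta2, t));
            w -= lr * mHat / (Math.sqrt(vHat) + eps);   // Adam step
        }
        // After 200 steps the parameter has converged near the minimum
        System.out.println(Math.abs(w) < 0.5);
    }
}
```

Note how the step size is roughly `lr` regardless of the raw gradient magnitude, since `mHat / sqrt(vHat)` normalizes the update; that is the practical effect of the lower-order moment estimates.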
Modifier and Type | Method and Description
---|---
FloatSolver | FloatSolverRegisterer.Creator_SolverParameter.call(SolverParameter arg0)
FloatSolver | FloatSolverRegistry.Creator.call(SolverParameter arg0)
static FloatSolver | FloatSolverRegistry.CreateSolver(SolverParameter param)
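The methods above expose a creator/registry pattern: a `Creator` turns a `SolverParameter` into a concrete `FloatSolver`, and `FloatSolverRegistry.CreateSolver` dispatches on the parameter's solver type. The sketch below shows that pattern with simplified stand-in types; the `SolverParameter` record, `register` helper, and `createSolver` method here are hypothetical, not the org.bytedeco.caffe classes.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Simplified registry sketch: maps a solver-type string to a creator
// function, mirroring how FloatSolverRegistry.CreateSolver resolves a
// SolverParameter to a solver instance. All types are stand-ins.
public class RegistrySketch {
    record SolverParameter(String type) {}
    interface FloatSolver { String name(); }

    static final Map<String, Function<SolverParameter, FloatSolver>> registry =
            new HashMap<>();

    // Analogous to registering a Creator for a solver type
    static void register(String type, Function<SolverParameter, FloatSolver> creator) {
        registry.put(type, creator);
    }

    // Analogous to FloatSolverRegistry.CreateSolver(SolverParameter param)
    static FloatSolver createSolver(SolverParameter param) {
        Function<SolverParameter, FloatSolver> creator = registry.get(param.type());
        if (creator == null) {
            throw new IllegalArgumentException("Unknown solver type: " + param.type());
        }
        return creator.apply(param);
    }

    public static void main(String[] args) {
        register("SGD", p -> () -> "SGD");
        register("Adam", p -> () -> "Adam");
        System.out.println(createSolver(new SolverParameter("Adam")).name());
    }
}
```

The static `CreateSolver` entry point means callers never name a concrete solver class; adding a new solver only requires registering another creator.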
Copyright © 2022. All rights reserved.