Optimizer
class mlbase.gradient_optimizer.GradientOptimizer(lr)
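The mlbase source is not shown on this page, but a `GradientOptimizer` taking only a learning rate `lr` is conventionally plain gradient descent. A minimal sketch of that update rule, assuming the textbook formulation (the function name and shape are illustrative, not mlbase's actual API):

```python
import numpy as np

def sgd_step(w, grad, lr):
    """Vanilla gradient descent: w <- w - lr * grad."""
    return w - lr * grad

# One step on a toy parameter vector.
w = np.array([1.0, -2.0])
g = np.array([0.5, 0.5])
w = sgd_step(w, g, lr=0.1)
```

Every optimizer below refines this basic rule by rescaling or accumulating the gradient before applying it.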
class mlbase.gradient_optimizer.RMSprop(lr=0.01, rho=0.9, epsilon=1e-06)
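RMSprop conventionally keeps an exponential moving average of squared gradients, with `rho` as the decay rate and `epsilon` guarding the division. A sketch of the standard rule using this page's default hyperparameters (an assumption about mlbase's internals, not its actual code):

```python
import numpy as np

def rmsprop_step(w, grad, acc, lr=0.01, rho=0.9, epsilon=1e-6):
    """One RMSprop update; `acc` is the running mean of squared gradients."""
    acc = rho * acc + (1 - rho) * grad**2          # decay old history, add new g^2
    w = w - lr * grad / np.sqrt(acc + epsilon)     # per-coordinate rescaled step
    return w, acc
```

Callers thread `acc` (initialized to zeros) through successive steps, one slot per parameter.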
class mlbase.gradient_optimizer.Adam(lr=0.01, beta1=0.9, beta2=0.999, epsilon=1e-07)
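Adam tracks both a first-moment estimate (decayed by `beta1`) and a second-moment estimate (decayed by `beta2`), with bias correction for the zero initialization. A sketch of the standard algorithm under this page's defaults, assuming mlbase follows the usual Kingma–Ba formulation:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, epsilon=1e-7):
    """One Adam update at timestep t (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (mean of squared gradients)
    m_hat = m / (1 - beta1**t)               # bias correction for zero init
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + epsilon)
    return w, m, v
```

State (`m`, `v`, both zero-initialized) and the step counter `t` persist across calls.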
class mlbase.gradient_optimizer.Momentum(lr=0.01, mu=0.5)
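Classical momentum accumulates a velocity vector, decayed by `mu`, so consistent gradient directions build up speed. A sketch of the usual rule (assumed, not taken from the mlbase source):

```python
import numpy as np

def momentum_step(w, grad, v, lr=0.01, mu=0.5):
    """One momentum update; `v` is the velocity carried between steps."""
    v = mu * v - lr * grad   # decay previous velocity, add current step
    return w + v, v
```

With `mu=0.5`, half of each step's velocity carries over to the next, roughly doubling the effective step size along a persistent gradient direction.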
class mlbase.gradient_optimizer.Nesterov(lr=0.01, mu=0.5)
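Nesterov momentum differs from classical momentum only in where the gradient is evaluated: at the look-ahead point `w + mu * v` rather than at `w`. A sketch of that variant (a standard formulation, assumed rather than quoted from mlbase; the `grad_fn` callback is illustrative):

```python
import numpy as np

def nesterov_step(w, grad_fn, v, lr=0.01, mu=0.5):
    """One Nesterov update; grad_fn maps parameters to their gradient."""
    g = grad_fn(w + mu * v)   # gradient at the look-ahead position
    v = mu * v - lr * g
    return w + v, v
```

Evaluating the gradient ahead of the current iterate lets the velocity correct itself before overshooting, which typically damps oscillation compared with classical momentum.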
class mlbase.gradient_optimizer.Adagrad(lr=0.01, epsilon=1e-07)
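Adagrad accumulates the sum of all past squared gradients (no decay, unlike RMSprop), so frequently-updated coordinates get progressively smaller steps. A sketch of the standard rule, assuming `epsilon` is added outside the square root (placement varies between implementations):

```python
import numpy as np

def adagrad_step(w, grad, acc, lr=0.01, epsilon=1e-7):
    """One Adagrad update; `acc` is the running sum of squared gradients."""
    acc = acc + grad**2
    w = w - lr * grad / (np.sqrt(acc) + epsilon)
    return w, acc
```

Because `acc` only grows, the effective learning rate decays monotonically; this is the main practical difference from RMSprop's exponentially decayed accumulator.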