Optimizers

Optimizers provide a way to update the weights held in a Merlin.Var. An optimizer is constructed with its hyperparameters and then called on a variable to apply one update step:

using Merlin

x = zerograd(rand(Float32,5,4))  # parameter Var with a zero-initialized gradient
opt = SGD(0.001)                 # SGD with learning rate 0.001
opt(x)                           # update x.data in place using x.grad
println(x.grad)                  # inspect the gradient after the update
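
In a full training loop, the same optimizer is applied to every parameter after each backward pass. A minimal sketch, assuming gradient! returns the parameter Vars of the loss (model, data, and loss are placeholder names, not part of the Merlin API):

using Merlin

opt = SGD(0.001, momentum=0.9)
for epoch = 1:10
    l = loss(model, data)  # forward pass; loss/model/data are placeholders
    params = gradient!(l)  # backward pass; assumed to return the parameter Vars
    foreach(opt, params)   # one optimizer step per parameter
end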
Merlin.AdaGrad - Type
AdaGrad

AdaGrad Optimizer.
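
AdaGrad adapts the step size per coordinate, dividing the learning rate by the square root of the accumulated squared gradients. A plain-Julia sketch of this update rule (illustrative only; the buffer h and the names rate and eps are assumptions, not Merlin's internals):

# AdaGrad update rule, illustrative sketch (not Merlin's implementation).
function adagrad_update!(x, g, h; rate=0.01, eps=1e-8)
    h .+= g .^ 2                          # accumulate squared gradients
    x .-= rate .* g ./ (sqrt.(h) .+ eps)  # per-coordinate scaled step
    x
end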

References

  • Duchi, Hazan, and Singer, "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization", JMLR 2011.

Merlin.Adam - Type
Adam

Adam Optimizer.
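
Adam keeps exponentially decaying averages of the gradient and its square, corrects their initialization bias, and scales the step per coordinate. A sketch of the update rule from the paper (hyperparameter names follow the paper and are not necessarily Merlin's constructor arguments):

# Adam update rule (step t >= 1), illustrative sketch.
function adam_update!(x, g, m, v, t; rate=0.001, beta1=0.9, beta2=0.999, eps=1e-8)
    m .= beta1 .* m .+ (1 - beta1) .* g       # first-moment (mean) estimate
    v .= beta2 .* v .+ (1 - beta2) .* g .^ 2  # second-moment estimate
    mhat = m ./ (1 - beta1^t)                 # bias correction
    vhat = v ./ (1 - beta2^t)
    x .-= rate .* mhat ./ (sqrt.(vhat) .+ eps)
    x
end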

References

  • Kingma and Ba, "Adam: A Method for Stochastic Optimization", ICLR 2015.

Merlin.SGD - Type
SGD

Stochastic Gradient Descent Optimizer.

Arguments

  • rate: learning rate

  • [momentum=0.0]: momentum coefficient

  • [nesterov=false]: whether to use Nesterov acceleration (see the sketch below)
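
A plain-Julia sketch of the update these arguments control (illustrative, not Merlin's code; v is an assumed per-parameter velocity buffer):

# SGD with optional momentum and Nesterov acceleration, illustrative sketch.
function sgd_update!(x, g, v; rate=0.001, momentum=0.0, nesterov=false)
    v .= momentum .* v .- rate .* g        # velocity update
    if nesterov
        x .+= momentum .* v .- rate .* g   # look-ahead step
    else
        x .+= v                            # standard (heavy-ball) step
    end
    x
end

With momentum=0.0 this reduces to the vanilla update x .-= rate .* g.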
