Optimizer

The name given to the algorithm that updates the weights of a model. Note that the optimizer does not compute the gradients; computing gradients is the job of backpropagation, so the optimizer is not part of backpropagation.

In its most basic form, an optimizer looks like this (a single SGD step; model and the learning rate lr are assumed to be defined):

with torch.no_grad():                 # the update itself must not be tracked by autograd
    for p in model.parameters():
        p -= p.grad * lr              # step each weight against its gradient
    model.zero_grad()                 # clear the gradients before the next backward pass
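
For comparison, a minimal sketch of the same step done through PyTorch's built-in torch.optim.SGD; the toy linear model, random data, and learning rate below are illustrative assumptions, not part of any real training setup.

import torch

# Toy setup for illustration: a linear model, random data, and a learning rate.
model = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()        # backpropagation computes the gradients and stores them in p.grad
optimizer.step()       # the optimizer only applies them, i.e. p -= p.grad * lr
optimizer.zero_grad()  # same role as model.zero_grad() in the loop above

Here backward() fills p.grad, step() performs the subtraction from the manual loop above, and zero_grad() resets the gradients, which is exactly the division of labour described at the top: the optimizer applies gradients but never computes them.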