PyTorch: find the minimum of a custom function with an optimiser (Adam)

March 2022

I caught myself thinking that most of the tutorials on PyTorch are about neural networks, even though it is really a quite general optimisation framework. There is a tutorial about how to use autograd, but using autograd directly is not the same as using an already written, high-quality optimiser like Adam, Adagrad, etc.
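For comparison, here is a minimal sketch (mine, not from any official tutorial) of what minimising x^2 + 1 looks like with bare autograd and a hand-written gradient-descent update; the learning rate 0.1 and the step count are arbitrary choices:

import torch

# plain autograd: compute the gradient and update x by hand
x = torch.tensor([10.0], requires_grad=True)
lr = 0.1  # hand-picked step size

for _ in range(100):
    y = x ** 2 + 1
    y.backward()              # populates x.grad
    with torch.no_grad():
        x -= lr * x.grad      # manual gradient-descent step
    x.grad.zero_()            # reset the gradient for the next iteration

print(x.item())  # should end up close to 0, the minimiser of x^2 + 1

With an optimiser from torch.optim you hand exactly this bookkeeping (and something smarter than plain gradient descent) over to the library.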

So I decided to start with a minimal example: find the minimum of x^2 + 1. Oddly enough, I did not find many tutorials and got stuck on that simple problem. Conor Mc wrote an article, but it uses a custom class based on nn.Module. There was also an article by Bijay Kumar, yet it used an nn.Linear layer! 🙂 So, yeah, it took me some time to figure out a working solution, and here it is:

from matplotlib.pyplot import plot, show
from torch import Tensor
from torch.nn import Parameter
from torch.optim import Adam

# the variable we optimise over; Parameter marks it as requiring gradients
X = Parameter(Tensor([10]))

opt = Adam([X], lr=1)
losses = []
for i_step in range(10):
    y = X ** 2 + 1       # the function we want to minimise
    opt.zero_grad()      # clear gradients from the previous step
    y.backward()         # compute dy/dX
    opt.step()           # let Adam update X
    losses.append(y.item())

plot(losses)
show()

And here’s the output.
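As an extra sanity check (my addition, not part of the snippet above), you can also print the final parameter and loss; with enough steps they should approach X ≈ 0 and y ≈ 1, the true minimum of x^2 + 1:

print(X.item(), losses[-1])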

It inspired me to make a few more visualisations of the function and of the loss with different parameters.
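The downloadable code may differ, but here is a rough sketch of the kind of comparison I have in mind: rerun the same loop with a few learning rates and overlay the loss curves (the values 0.1, 0.5, 1.0 and the 50 steps are arbitrary choices):

from matplotlib.pyplot import legend, plot, show
from torch import Tensor
from torch.nn import Parameter
from torch.optim import Adam

# rerun the same minimisation with several learning rates and compare the loss curves
for lr in (0.1, 0.5, 1.0):
    X = Parameter(Tensor([10]))
    opt = Adam([X], lr=lr)
    losses = []
    for _ in range(50):
        y = X ** 2 + 1
        opt.zero_grad()
        y.backward()
        opt.step()
        losses.append(y.item())
    plot(losses, label=f"lr={lr}")

legend()
show()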

The complete code of this visualisation is available for download:
