Calculate a few steps of gradient descent with PyTorch
Approaching a local minimum (or any other local extremum) requires taking several steps along the gradient. We will perform this tensor optimisation in two different ways.

Our example function will be:

f(w) = ∏_{i,j} ln(ln(w_{i,j} + 7))

with the initial tensor w = [[5., 10.], [1., 2.]], and we will perform n = 500 steps.

Optimisation with a fixed step

For the fixed-step variant we take the gradient step alpha (the starting learning rate) to be 0.001.
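The fixed-step loop described above can be sketched as follows. This is a minimal sketch, not the article's original code: the function, the initial tensor, the number of steps and alpha = 0.001 are taken from the text, while the helper name `function` is my own choice.

```python
import torch

# f(w) = product over i, j of ln(ln(w_ij + 7)), as defined in the article
def function(w):
    return torch.prod(torch.log(torch.log(w + 7)))

w = torch.tensor([[5., 10.], [1., 2.]], requires_grad=True)
alpha = 0.001  # fixed learning rate

for _ in range(500):
    f = function(w)
    f.backward()                  # populate w.grad via autograd
    with torch.no_grad():         # update w without tracking the step itself
        w -= alpha * w.grad
    w.grad.zero_()                # gradients accumulate, so reset each cycle

print(w)
```

Note the `torch.no_grad()` context: without it the in-place update would be recorded in the autograd graph, and `w.grad.zero_()` is needed because `backward()` accumulates gradients rather than overwriting them.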
Optimisation with an optimal step

For real tasks it is important to size the gradient-descent steps properly. Rather than fixing the learning rate, we can compute a suitable step size at every iteration. In this task we will perform 500 optimisation steps, recalculating the step size on each cycle; all other parameters are the same.
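The article does not say how the per-iteration step is computed, so as one possible interpretation here is a sketch using a backtracking (Armijo) line search, a standard way to pick a near-optimal step each cycle. The constants `alpha0`, `beta` and `c` are conventional assumptions, not values from the article.

```python
import torch

# Same example function: f(w) = product of ln(ln(w_ij + 7))
def function(w):
    return torch.prod(torch.log(torch.log(w + 7)))

def line_search(w, grad, alpha0=1.0, beta=0.5, c=1e-4):
    """Backtracking line search: shrink alpha until the trial step
    stays in the function's domain and gives a sufficient decrease."""
    with torch.no_grad():
        f0 = function(w)
        g2 = (grad * grad).sum()
        alpha = alpha0
        for _ in range(60):  # cap the shrinking to avoid an endless loop
            fa = function(w - alpha * grad)
            if torch.isfinite(fa) and fa <= f0 - c * alpha * g2:
                break
            alpha *= beta
        return alpha

w = torch.tensor([[5., 10.], [1., 2.]], requires_grad=True)

for _ in range(500):
    f = function(w)
    f.backward()
    with torch.no_grad():
        alpha = line_search(w, w.grad)  # step size recomputed every cycle
        w -= alpha * w.grad
    w.grad.zero_()

print(w)
```

The `torch.isfinite` check matters here: ln(ln(w + 7)) is only defined for w > -6, so an overly large trial step would produce NaN and must be rejected before the Armijo condition is tested.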



© 2020 MyCoding.uk My blog about coding and further learning. This blog was written in pure Perl, with the frontend rendered by TemplateToolkit.