Understanding the gradient descent algorithm is important for anyone who wants to gain deeper knowledge of how machine learning algorithms work.
4th step (partial diff. Of loss function) is done by back propagation right?
It is not done by backpropagation; it is the other way around. Backpropagation computes the partial derivatives of the loss, and gradient descent then uses those derivatives to update the weights.
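To make the division of labor concrete, here is a minimal sketch with a one-weight linear model. The function and variable names are illustrative, not from the original discussion; for a single weight the "backpropagation" part reduces to the analytic derivative of the loss.

```python
def grad(w, x, y):
    # Partial derivative of the squared-error loss (w*x - y)**2 w.r.t. w.
    # This is the quantity backpropagation computes in a real network.
    return 2 * x * (w * x - y)

def gradient_descent(x, y, w=0.0, lr=0.1, steps=50):
    # Gradient descent consumes the derivative to update the weight.
    for _ in range(steps):
        w -= lr * grad(w, x, y)
    return w

w = gradient_descent(x=2.0, y=4.0)
print(round(w, 3))  # converges to 2.0, since 2.0 * 2.0 == 4.0
```

In a deep network the only change is how `grad` is obtained: backpropagation applies the chain rule layer by layer, but the update rule stays the same.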
Please explain hyperparameter vs optimization techniques for all models
There is no direct comparison between hyperparameters and optimization techniques: hyperparameters are settings chosen before training, and optimization techniques are used to tune them. Could you please elaborate on your question a bit more? I didn't quite get it.
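A small sketch of the distinction, assuming the usual setup: the learning rate is a hyperparameter fixed before training, gradient descent is the optimization technique that fits the model parameter, and a simple grid search is one way to tune the hyperparameter itself. All names and values here are illustrative.

```python
def train(lr, x=2.0, y=4.0, steps=20):
    # Fit a one-weight model with gradient descent at a given learning rate,
    # then report the final loss so hyperparameter candidates can be compared.
    w = 0.0
    for _ in range(steps):
        w -= lr * 2 * x * (w * x - y)  # gradient descent update
    return (w * x - y) ** 2            # final squared-error loss

candidate_lrs = [0.001, 0.01, 0.1]     # hyperparameter grid (illustrative)
best_lr = min(candidate_lrs, key=train)
print(best_lr)  # 0.1 reaches the lowest loss on this toy problem
```

So the two concepts sit at different levels: the optimizer adjusts `w` inside `train`, while the grid search adjusts `lr` across training runs.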