
Regularized logistic regression - Gradient calculation

Nik on 26 Jun 2016
Answered: Xinwei LONG on 18 Feb 2020
Hello, I am working on a regularized logistic regression task and am stuck on the partial derivatives. Every gradient component except the first, theta(1), should include the regularization term lambda/m*theta. So I wrote the following code, which works incorrectly:
grad(1) = 1/m*((sigmoid(X(:,1)*theta(1))-y)'*X(:,1));
grad(2:end) = 1/m*((sigmoid(X(:,2:end)*theta(2:end))-y)'*X(:,2:end))' + lambda/m*theta(2:end);
Eventually I arrived at another solution, which works fine:
grad = (1/m*(sigmoid(X*theta)-y)'*X)';
temp = theta;
temp(1) =0;
grad = grad + lambda/m*temp;
Can someone please explain why the first option is incorrect? Thanks a lot!

Answers (2)

Xinwei LONG on 18 Feb 2020
Hi,
I initially wrote the same form of vectorization for the cost function and gradients.
Here are the correct expressions I found:
grad(1)=(1/m)*sum(((sigmoid(X*theta)-y).*X(:,1)),1);
grad(2:end)=(1/m)*sum(((sigmoid(X*theta)-y).*X(:,2:end)),1)'+(lambda/m)*theta(2:end);
Note the difference in the input to the sigmoid function: the gradient for theta_0 and the gradients for the other thetas all require the same input, the full hypothesis sigmoid(X*theta). In your first attempt, sigmoid(X(:,1)*theta(1)) computes a different "hypothesis" from only one feature, which is not the model's prediction.
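To make the difference concrete, here is a minimal sketch with assumed toy data (the matrix X, labels y, theta, and lambda below are made up for illustration). It compares the correct gradient, which uses the full hypothesis sigmoid(X*theta) for every component, against the single-column version from the first attempt:

```matlab
% Toy data (assumed): 5 examples, intercept column plus 2 features
m = 5;
X = [ones(m,1), (1:m)', (5:-1:1)'];
y = [0; 0; 1; 1; 1];
theta = [0.1; 0.2; -0.3];
lambda = 1;
sigmoid = @(z) 1 ./ (1 + exp(-z));

% Correct: one hypothesis vector shared by every gradient component
h = sigmoid(X * theta);              % uses ALL features for each example
grad = (1/m) * X' * (h - y);
grad(2:end) = grad(2:end) + (lambda/m) * theta(2:end);

% Incorrect (the first attempt): sigmoid applied to a single column,
% i.e. a different "prediction" per parameter
bad1 = (1/m) * ((sigmoid(X(:,1)*theta(1)) - y)' * X(:,1));

disp(grad(1) - bad1)   % nonzero: the two formulas disagree
```

Only the regularization term is selective about theta(1); the hypothesis inside the sigmoid is the same for every partial derivative.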

SSV on 23 Jul 2019 (edited 23 Jul 2019)
Hi,
I have the same doubt. Did you find the answer?
BR,
Vignoban
