lrCostFunction.m
60 lines (48 loc) · 1.92 KB
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
% J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
% efficiently vectorized. For example, consider the computation
%
% sigmoid(X * theta)
%
% Each row of the resulting matrix will contain the value of the
% prediction for that example. You can make use of this to vectorize
% the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
% there're many possible vectorized solutions, but one solution
% looks like:
% grad = (unregularized gradient for logistic regression)
% temp = theta;
% temp(1) = 0; % because we don't add anything for j = 0
% grad = grad + YOUR_CODE_HERE (using the temp variable)
%
z = X * theta;
h = sigmoid(z);   % hypothesis: predicted probability for each example
                  % (named h rather than hypot, which shadows a built-in)

% Regularized cost: cross-entropy plus an L2 penalty that skips theta(1)
J = (1/m) * (-(y' * log(h)) - (1 - y)' * log(1 - h));
theta_reg = theta(2:end);   % portable; theta(2:size(theta)(1)) is Octave-only
J = J + (lambda / (2*m)) * sum(theta_reg .^ 2);

% Vectorized gradient, with the bias term theta(1) left unregularized
grad = (1/m) * X' * (h - y);
temp = theta * (lambda / m);
temp(1) = 0;   % because we don't add anything for j = 0
grad = grad + temp;
% =============================================================
grad = grad(:);
end
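
The same vectorized computation can be sketched in NumPy to sanity-check the formulas. This is an illustrative port, not part of the assignment; the function and variable names (`lr_cost_function`, `lam`) are my own, and `sigmoid` is defined inline since the Octave version relies on the course-provided `sigmoid.m`.

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic regression cost and gradient.

    theta : (n,) parameter vector; theta[0] is the unregularized bias
    X     : (m, n) design matrix (first column should be all ones)
    y     : (m,) labels in {0, 1}
    lam   : regularization strength (lambda)
    """
    m = y.shape[0]
    h = sigmoid(X @ theta)                       # predictions, shape (m,)

    # Cross-entropy cost plus L2 penalty that skips the bias term
    J = (1.0 / m) * (-(y @ np.log(h)) - (1 - y) @ np.log(1 - h))
    J += (lam / (2 * m)) * np.sum(theta[1:] ** 2)

    # Vectorized gradient; zero out the penalty on theta[0]
    grad = (1.0 / m) * (X.T @ (h - y))
    temp = theta * (lam / m)
    temp[0] = 0.0
    return J, grad + temp
```

A quick check: with `theta` all zeros, every prediction is 0.5, so the unregularized cost is `-log(0.5) = log(2) ≈ 0.693` regardless of the data, and the regularization term vanishes.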