7 changes: 7 additions & 0 deletions Assets/Scripts/Neural Network/Activation/Activation.cs
@@ -126,6 +126,13 @@ public double Activate(double[] inputs, int index)
return res;
}

// Much like stochastic gradient descent's quick, not-so-perfect steps downhill
// using mini-batches, this derivative is a good-enough approximation: it is
// simpler to calculate and keeps the interface consistent with the other
// single-input activation functions above (though it won't pass gradient checks).
// The complete derivative of the Softmax function requires computing a Jacobian
// matrix whose entries are the partial derivatives of each node's activation
// with respect to every input of the layer.
public double Derivative(double[] inputs, int index)
{
double expSum = 0;
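For reference, the "complete derivative" the comment describes has a simple closed form: the Jacobian entry dS_i/dx_j equals S_i * (delta_ij - S_j). Below is a minimal sketch of that computation; the helper names (`Softmax`, `SoftmaxJacobian`) are hypothetical and not part of this repo's `Activation` interface.

```csharp
// Hypothetical sketch, not part of this PR: full Softmax Jacobian.
public static double[] Softmax(double[] inputs)
{
    // Shift by the max input for numerical stability before exponentiating.
    double max = double.MinValue;
    foreach (double x in inputs) max = System.Math.Max(max, x);

    double[] s = new double[inputs.Length];
    double sum = 0;
    for (int i = 0; i < inputs.Length; i++)
    {
        s[i] = System.Math.Exp(inputs[i] - max);
        sum += s[i];
    }
    for (int i = 0; i < inputs.Length; i++) s[i] /= sum;
    return s;
}

public static double[,] SoftmaxJacobian(double[] inputs)
{
    double[] s = Softmax(inputs);
    double[,] jacobian = new double[s.Length, s.Length];
    for (int i = 0; i < s.Length; i++)
    {
        for (int j = 0; j < s.Length; j++)
        {
            // dS_i/dx_j = S_i * (1 - S_i) on the diagonal, -S_i * S_j off it.
            jacobian[i, j] = s[i] * ((i == j ? 1.0 : 0.0) - s[j]);
        }
    }
    return jacobian;
}
```

The diagonal terms S_i * (1 - S_i) are what a per-index approximation like the one above presumably keeps; the off-diagonal terms -S_i * S_j are what it drops, which is why it won't pass gradient checks.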