function [W1, W2] = BackpropCE(W1, W2, X, D)
% BackpropCE  One pass of backpropagation with the cross-entropy cost function.
%   W1, W2 : hidden- and output-layer weight matrices
%   X      : training inputs, one sample per row
%   D      : desired outputs, one per sample
  alpha = 0.9;                    % learning rate
  N = size(X, 1);                 % number of training samples
  for k = 1:N
    x = X(k, :)';                 % k-th input sample as a column vector
    d = D(k);                     % desired output for this sample

    v1 = W1*x;                    % weighted sum at the hidden layer
    y1 = Sigmoid(v1);             % hidden-layer output
    v  = W2*y1;                   % weighted sum at the output layer
    y  = Sigmoid(v);              % output of the neuron in the output layer

    e     = d - y;
    delta = e;                    % with cross entropy, the output delta is just the error
                                  % (the sigmoid derivative cancels in the gradient)

    e1     = W2'*delta;           % error propagated back to the hidden layer
    delta1 = y1.*(1 - y1).*e1;    % hidden delta: sigmoid derivative times propagated error

    dW1 = alpha*delta1*x';        % hidden-layer weight update
    W1  = W1 + dW1;
    dW2 = alpha*delta*y1';        % output-layer weight update
    W2  = W2 + dW2;
  end
end

function y = Sigmoid(x)
% Logistic sigmoid activation (assumed helper; defined here so the file is self-contained).
  y = 1 ./ (1 + exp(-x));
end
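For readers without MATLAB, here is a minimal NumPy translation of the same update rule. The function and variable names (`backprop_ce`, `sigmoid`, `alpha`) mirror the MATLAB code but are otherwise my own; this is a sketch under the assumption of a single sigmoid output neuron, as in the original.

```python
import numpy as np

def sigmoid(v):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-v))

def backprop_ce(W1, W2, X, D, alpha=0.9):
    """One pass over the training set with the cross-entropy cost.

    W1: (hidden, inputs), W2: (1, hidden), X: (N, inputs), D: (N,)
    Returns the updated weight matrices.
    """
    for k in range(X.shape[0]):
        x = X[k, :].reshape(-1, 1)        # input sample as a column vector
        d = D[k]                          # desired output

        y1 = sigmoid(W1 @ x)              # hidden-layer output
        y = sigmoid(W2 @ y1)              # network output

        delta = d - y                     # CE + sigmoid: output delta equals the error
        delta1 = y1 * (1 - y1) * (W2.T @ delta)   # hidden-layer delta

        W1 += alpha * delta1 @ x.T        # hidden-layer weight update
        W2 += alpha * delta @ y1.T        # output-layer weight update
    return W1, W2
```

Because the cross-entropy cost cancels the sigmoid derivative at the output, the output delta is simply `d - y`, which is what makes this variant converge faster than the sum-of-squares version of the same network.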