## Hot questions: using neural networks in a Jupyter notebook

Question:

I am learning how a perceptron works and attempted to create a function out of it.

I recently watched a YouTube video as an introduction to the topic.

Right now I am trying to mimic his function, and I would like to apply it to a sample dataset:

```python
import numpy as np
import pandas as pd

#        x1   x2   y
data = [
    [3.5, 1.5, 1],
    [2.0, 1.0, 0],
    [4.0, 1.5, 1],
    [3.0, 1.0, 0],
    [3.5, 0.5, 1],
    [2.0, 0.5, 0],
    [5.5, 1.0, 1],
    [1.0, 1.0, 0],
    [4.5, 1.0, 1],
]
data = pd.DataFrame(data, columns=["Length", "Width", "Class"])
```

Sigmoid function:

```python
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
```

Perceptron function:

```python
w1 = np.random.randn()
w2 = np.random.randn()
b = np.random.randn()

def perceptron(x1, x2, w1, w2, b):
    z = (w1 * x1) + (w2 * x2) + b
    return sigmoid(z)
```
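For reference, the forward pass can be run over the whole sample dataset at once, since NumPy broadcasts over pandas columns. This is a sketch that bundles the question's `sigmoid`, dataset, and `perceptron` together so it stands alone:

```python
import numpy as np
import pandas as pd

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Sample dataset and random initial weights, as defined in the question
data = pd.DataFrame(
    [[3.5, 1.5, 1], [2.0, 1.0, 0], [4.0, 1.5, 1]],
    columns=["Length", "Width", "Class"],
)
w1, w2, b = np.random.randn(), np.random.randn(), np.random.randn()

def perceptron(x1, x2, w1, w2, b):
    z = (w1 * x1) + (w2 * x2) + b
    return sigmoid(z)

# Vectorized forward pass: one sigmoid output in (0, 1) per row
preds = perceptron(data["Length"], data["Width"], w1, w2, b)
```

With random (untrained) weights the outputs are arbitrary probabilities; training is what pushes them toward the `Class` column.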

My question is: how can I add the cost function inside the perceptron and loop it n times, based on a parameter, to adjust the weights using the cost function?

```python
def get_cost_slope(b, a):
    """
    b = predicted value
    a = actual value
    """
    sqrerror = (b - a) ** 2
    slope = 2 * (b - a)
    return sqrerror, slope
```

Answer:

You need to create a method that backpropagates through the perceptron and optimizes the weights.

```python
def optimize(a, z, x1, x2, w1, w2, b):
    """
    a = actual value, z = predicted value (the sigmoid output)
    """
    sqrerror = (z - a) ** 2
    cost_deriv = 2 * (z - a)         # derivative of the squared error
    sigmoid_deriv = z * (1 - z)      # derivative of the sigmoid function
    learning_rate = 0.001            # used to scale the gradients

    # Gradient-descent update rule
    w1 -= (cost_deriv * sigmoid_deriv * x1) * learning_rate
    w2 -= (cost_deriv * sigmoid_deriv * x2) * learning_rate
    b -= (cost_deriv * sigmoid_deriv) * learning_rate
    return w1, w2, b
```

(Note the inputs and weights are passed in and the updated weights returned, since they are not in scope inside the function; the prediction is named `z` here to avoid clashing with the bias `b`.)

Since, by the chain rule,

$$\frac{\partial J}{\partial w_1} = \frac{\partial J}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial z} \cdot \frac{\partial z}{\partial w_1} = 2(\hat{y} - a) \cdot \hat{y}(1 - \hat{y}) \cdot x_1$$

where $J$ is the cost function, $\hat{y}$ the sigmoid output, and $z$ the pre-activation.

Question:

I am training a neural network and a part of my code has returned the following error:

```python
def plot_confusion_matrix(truth, predictions, classes, normalize=False,
                          save=False, cmap=plt.cm.Oranges,
                          path="confusion_matrix.png"):
    acc = (np.array(truth) == np.array(predictions))
    size = float(acc.shape[0])  # error raised here
    acc = np.sum(acc.astype("int32")) / size
    (...)
```

```
AttributeError: 'bool' object has no attribute 'shape'
```

The function call:

```python
pred = pred.numpy()
plot_confusion_matrix(truth=labels.numpy(), predictions=pred, save=False,
                      path="logref_confusion_matrix.png",
                      classes=["forward", "left", "right"])
```

Here `truth` holds the labels of Y and `predictions` the array of predictions, both with shape (32, 3). I checked that numpy, ipython, etc. are all up to date and tried some modifications, but without success.

Answer:

The only reason `acc` would be a plain boolean rather than a NumPy array of booleans is that you are passing singular values for `truth` and `predictions`. With an actual 32x3 array, the code you provided raises no error. Look at the rest of your code and make sure you actually pass arrays to `np.array()` instead of singular values.
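A quick way to confirm this is to run the accuracy computation on small made-up one-hot arrays standing in for the real (32, 3) tensors. With genuine arrays, the elementwise `==` produces a boolean array that does have a `.shape`:

```python
import numpy as np

# Small one-hot arrays standing in for the real (32, 3) truth/predictions
truth = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]])
predictions = np.array([[1, 0, 0], [0, 0, 1], [0, 0, 1], [1, 0, 0]])

acc = (truth == predictions)      # elementwise -> boolean array, not a bool
size = float(acc.shape[0])        # .shape exists, so no AttributeError
accuracy = np.sum(acc.astype("int32")) / size
```

If `truth` and `predictions` were instead single scalars, the comparison would not yield an array with a first dimension, which is exactly the situation the traceback points to.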