Neural Networks

Motivations

Today, neural networks are the state-of-the-art technique for many different machine learning problems.

Non-Linear Hypothesis

Computer Vision: a motivating example. With image inputs, every pixel is a feature, so even a small 50×50 grayscale image gives 2,500 features; including all quadratic terms pushes this to roughly 3 million features, making logistic regression with hand-crafted non-linear features impractical.

Neurons and the Brain

The One-Learning-Algorithm Hypothesis: the idea that the brain uses essentially one learning algorithm for many different tasks, suggested by neuro-rewiring experiments in which, for example, brain tissue that normally processes sound learns to see when fed visual input.

Model Representation

Neuron Model

Artificial Neural Network (ANN): a network of simple sigmoid units arranged in layers. Layer 1 is the input layer, the last layer is the output layer, and the layers in between are hidden layers; Θ^(j) denotes the matrix of weights controlling the mapping from layer j to layer j+1.
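Each unit computes a weighted sum of its inputs and passes it through the sigmoid g(z) = 1/(1 + e^(−z)), so a single neuron is just logistic regression. A minimal sketch of one such neuron (the weights and inputs are placeholders, not from these notes):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One sigmoid neuron: h(x) = g(theta^T x), with x0 = 1 as the bias input
theta = np.array([-1.0, 0.5, 2.0])   # placeholder weights
x = np.array([1.0, 3.0, -0.5])       # x0 = 1 (bias), then x1, x2
print(sigmoid(theta @ x))            # the neuron's activation, ~0.378
```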

Example: If layer 1 has 2 input nodes and layer 2 has 4 activation nodes, what is the dimension of Θ^(1)?

Solution:

If a network has s_j units in layer j and s_{j+1} units in layer j+1, then Θ^(j) has dimension s_{j+1} × (s_j + 1). Here s_j = 2 and s_{j+1} = 4, so s_{j+1} × (s_j + 1) = 4 × 3.

Hence, the dimension of Θ^(1) is 4 × 3.
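As a quick sanity check of this rule in code (a NumPy sketch using the layer sizes from the example; the input values are placeholders):

```python
import numpy as np

# Rule: if layer j has s_j units and layer j+1 has s_{j+1} units,
# Theta^(j) has dimension s_{j+1} x (s_j + 1); the +1 is the bias unit.
s1, s2 = 2, 4                        # layer sizes from the example
Theta1 = np.zeros((s2, s1 + 1))
print(Theta1.shape)                  # (4, 3)

x = np.array([0.5, -1.2])            # 2 input features (placeholders)
a1 = np.concatenate(([1.0], x))      # prepend bias unit a0 = 1 -> shape (3,)
z2 = Theta1 @ a1                     # one value per activation node in layer 2
print(z2.shape)                      # (4,)
```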

Forward Propagation: Vectorized Implementation

Concept: for each layer j, compute z^(j+1) = Θ^(j) a^(j) and a^(j+1) = g(z^(j+1)), where a^(1) = x with the bias unit a_0 = 1 added. This replaces the per-neuron computation with one matrix-vector product per layer.

Notes:
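A minimal NumPy sketch of this vectorized forward pass, assuming sigmoid activations; the network shape and weights below are random placeholders:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, Thetas):
    # For each layer: z^(j+1) = Theta^(j) a^(j), a^(j+1) = g(z^(j+1)),
    # where a^(1) = x with the bias unit a0 = 1 prepended.
    a = x
    for Theta in Thetas:
        a = np.concatenate(([1.0], a))  # add bias unit
        a = sigmoid(Theta @ a)          # compute the whole layer at once
    return a                            # output-layer activations h(x)

# Example: 2 inputs -> 4 hidden units -> 1 output (placeholder weights)
rng = np.random.default_rng(0)
Thetas = [rng.standard_normal((4, 3)), rng.standard_normal((1, 5))]
print(forward(np.array([0.5, -1.2]), Thetas))
```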


Basic Applications

Here we will see a detailed example showing how a neural network can compute a complex non-linear function of its input.

Intuitive Examples

Neural Network: Logical AND

Notes:
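A sketch of this network, assuming the weights usually chosen for the AND example, Θ = [−30, 20, 20]; the sigmoid then saturates near 1 only when both inputs are 1:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single neuron computing AND: h(x) = g(-30 + 20*x1 + 20*x2)
theta = np.array([-30.0, 20.0, 20.0])

for x1 in (0, 1):
    for x2 in (0, 1):
        h = sigmoid(theta @ np.array([1.0, x1, x2]))
        print(f"{x1} AND {x2} -> {h:.4f}")  # ~0 except for 1 AND 1 -> ~1
```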

Designing XNOR using Neural Networks:
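XNOR(x1, x2) is 1 exactly when the inputs are equal, and it can be assembled from simpler networks: a hidden layer computing a1 = x1 AND x2 and a2 = (NOT x1) AND (NOT x2), and an output unit computing a1 OR a2. A sketch assuming the standard weights for these building blocks (not shown in the notes above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hidden layer: row 0 computes x1 AND x2, row 1 computes (NOT x1) AND (NOT x2)
Theta1 = np.array([[-30.0,  20.0,  20.0],
                   [ 10.0, -20.0, -20.0]])
# Output layer: a1 OR a2
Theta2 = np.array([[-10.0, 20.0, 20.0]])

for x1 in (0, 1):
    for x2 in (0, 1):
        a = sigmoid(Theta1 @ np.array([1.0, x1, x2]))     # hidden activations
        h = sigmoid(Theta2 @ np.concatenate(([1.0], a)))  # output layer
        print(f"{x1} XNOR {x2} -> {h[0]:.4f}")            # ~1 iff x1 == x2
```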

Multiclass Classification
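With K classes the output layer has K units, each label y is re-coded as a one-hot vector, and the predicted class is the index of the largest output activation. A small sketch of that encoding (the class count and activations below are illustrative):

```python
import numpy as np

K = 4                                # e.g. pedestrian, car, motorcycle, truck
y = 2                                # class label of one training example

y_vec = np.zeros(K)                  # one-hot recoding of y
y_vec[y] = 1.0
print(y_vec)                         # [0. 0. 1. 0.]

h = np.array([0.1, 0.2, 0.9, 0.3])   # hypothetical output-layer activations
print(np.argmax(h))                  # predicted class: 2
```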


Cost Function

Some New Notations: L = total number of layers in the network; s_l = number of units (not counting the bias unit) in layer l; K = number of units in the output layer (the number of classes).
Multi-class Classifier Neural Network
Cost Function for Neural Networks

The cost generalizes the regularized logistic regression cost by summing over all K output units:

J(Θ) = −(1/m) Σ_{i=1..m} Σ_{k=1..K} [ y_k^(i) log((h_Θ(x^(i)))_k) + (1 − y_k^(i)) log(1 − (h_Θ(x^(i)))_k) ] + (λ/2m) Σ_{l=1..L−1} Σ_{i=1..s_l} Σ_{j=1..s_{l+1}} (Θ_{j,i}^(l))²

The regularization term sums the squares of every weight in the network except those multiplying the bias units.

Notes:
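A minimal NumPy sketch of this cost, assuming sigmoid activations, inputs X (m × n), one-hot labels Y (m × K), and a list Thetas of weight matrices of shape s_{l+1} × (s_l + 1); the function name and the data below are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(Thetas, X, Y, lam):
    m = X.shape[0]
    # Vectorized forward pass over all m examples at once
    A = X
    for Theta in Thetas:
        A = np.hstack([np.ones((m, 1)), A])  # bias column
        A = sigmoid(A @ Theta.T)
    H = A                                    # m x K output activations

    # Cross-entropy term, summed over examples and output units
    J = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    # Regularization: every weight except the bias column of each Theta
    J += lam / (2 * m) * sum(np.sum(T[:, 1:] ** 2) for T in Thetas)
    return J

# Tiny usage example with random placeholder data
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))            # 5 examples, 2 features
Y = np.eye(3)[rng.integers(0, 3, 5)]       # one-hot labels for K = 3 classes
Thetas = [rng.standard_normal((4, 3)), rng.standard_normal((3, 5))]
print(nn_cost(Thetas, X, Y, lam=1.0))
```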


Backpropagation Algorithm

In order to use either gradient descent or one of the advanced optimization algorithms, we need to write code that takes the parameters Θ and computes the cost J(Θ) and the partial derivative terms ∂J(Θ)/∂Θ_{ij}^(l).

Computing the partial derivative terms:

Step-1: Forward Propagation - Calculation of activation values

The vectorized implementation of forward propagation lets us compute the activation values for all of the neurons in the network, one layer at a time.

Step-2: Backpropagation - Calculate partial derivatives
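A minimal sketch of backpropagation for a 3-layer network (input, one hidden layer, output), assuming sigmoid activations and the one-hot labels used above: the output error is δ^(3) = a^(3) − y, the hidden error is δ^(2) = (Θ^(2))ᵀ δ^(3) .* a^(2) .* (1 − a^(2)), and these are accumulated into Δ and averaged (plus regularization) to give the gradients. Function and variable names here are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(Theta1, Theta2, X, Y, lam):
    # X: m x n inputs; Y: m x K one-hot labels.
    # Theta1: s2 x (n+1); Theta2: K x (s2+1).
    m = X.shape[0]
    D1, D2 = np.zeros_like(Theta1), np.zeros_like(Theta2)
    for i in range(m):
        # Step-1: forward propagation, keeping every layer's activations
        a1 = np.concatenate(([1.0], X[i]))                 # input + bias
        a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))
        a3 = sigmoid(Theta2 @ a2)                          # output activations
        # Step-2: back-propagate the error terms (deltas)
        d3 = a3 - Y[i]                                     # output-layer delta
        d2 = (Theta2.T @ d3)[1:] * a2[1:] * (1 - a2[1:])   # drop bias entry
        # Accumulate the gradient contributions
        D2 += np.outer(d3, a2)
        D1 += np.outer(d2, a1)
    # Average over examples; regularize all but the bias columns
    G1, G2 = D1 / m, D2 / m
    G1[:, 1:] += (lam / m) * Theta1[:, 1:]
    G2[:, 1:] += (lam / m) * Theta2[:, 1:]
    return G1, G2       # partial derivatives of J w.r.t. Theta1 and Theta2
```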


← Previous: Regularization