Machine Learning Using Neural Networks: Presentation By: C. Vinoth Kumar SSN College of Engineering
The Family Tree of Computational Intelligence
[Diagram: the "family tree" of Computational Intelligence, with branches including Machine Learning, ANN, Dtree, EC, and SA, and roots in Physics.]
Machine Learning
Supervised learning
– learning with a teacher
Unsupervised learning
– learning from patterns
Reinforcement learning
– learning through experience
Biological neural network
[Diagram: a biological neuron, showing the soma, dendrites, axon, axon hillock, and the synapses connecting it to neighbouring neurons.]
Analogy between biological and artificial neural networks
[Diagram: an artificial neural network with an input layer, a middle (hidden) layer, and an output layer.]
Neuron – a simple computing element
[Diagram: inputs x1, x2, . . ., xn with weights w1, w2, . . ., wn feeding a single neuron that produces output Y.]
The neuron computes the weighted sum of the input
signals and compares the result with a threshold value, θ.
If the net input is less than the threshold, the neuron
output is –1; if the net input is greater than or equal to
the threshold, the neuron becomes activated and its
output attains the value +1.
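As a minimal sketch (not part of the slides), this hard-limiter neuron can be written directly from the description above; the weight and threshold values below are illustrative assumptions:

```python
def neuron_output(inputs, weights, theta):
    """Sign-activation neuron: +1 if the weighted sum of inputs
    reaches the threshold theta, otherwise -1."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else -1

# Example: equal weights with threshold 1.5 behave like a logical AND
print(neuron_output([1, 1], [1.0, 1.0], 1.5))  # → 1
print(neuron_output([1, 0], [1.0, 1.0], 1.5))  # → -1
```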
[Plots: activation functions, showing output Y (ranging from –1 to +1) against input X.]
[Diagram: single-layer perceptron. Inputs x1 and x2, with weights w1 and w2, feed a linear combiner (∑); a hard limiter with threshold θ then produces the output Y.]
The aim of the perceptron is to classify inputs
x1, x2, . . ., xn into one of two classes, say
A1 and A2.
[Diagram: linear separability in the perceptron. (a) Two-input perceptron: the line x1w1 + x2w2 − θ = 0 divides the (x1, x2) plane into Class A1 and Class A2. (b) Three-input perceptron: the plane x1w1 + x2w2 + x3w3 − θ = 0 divides the (x1, x2, x3) space.]
How does the perceptron learn its classification tasks?
This is done by making small adjustments to the
weights to reduce the difference between the actual and
desired outputs of the perceptron. The initial weights
are randomly assigned, usually in the range [−0.5, 0.5],
and then updated to obtain outputs consistent with
the training examples.
wi(p + 1) = wi(p) + α ⋅ xi(p) ⋅ e(p)
where p = 1, 2, 3, . . . denotes the iteration,
α is the learning rate, a positive constant less
than unity, and e(p) is the error, the difference
between the desired and actual outputs.
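A minimal sketch of this update rule in code (the AND task, the fixed threshold θ = 0.2, the learning rate α = 0.1, and the random seed are illustrative assumptions, not taken from the slides):

```python
import random

def train_perceptron(data, alpha=0.1, theta=0.2, epochs=100):
    """Perceptron learning: w_i(p+1) = w_i(p) + alpha * x_i(p) * e(p),
    with weights initialised randomly in [-0.5, 0.5]."""
    random.seed(0)  # fixed seed so the run is reproducible
    w = [random.uniform(-0.5, 0.5) for _ in range(len(data[0][0]))]
    for _ in range(epochs):
        errors = 0
        for x, y_desired in data:
            # hard-limiter output: +1 if weighted sum reaches theta, else -1
            y = 1 if sum(xi * wi for xi, wi in zip(x, w)) >= theta else -1
            e = y_desired - y
            if e != 0:
                errors += 1
                w = [wi + alpha * xi * e for wi, xi in zip(w, x)]
        if errors == 0:  # all training examples classified correctly
            break
    return w

# Logical AND with outputs coded as -1 / +1
and_data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w = train_perceptron(and_data)
# the learned weights now classify all four AND patterns correctly
```

Because AND is linearly separable, the weight updates stop after a few passes through the training set; a non-separable task such as XOR would never converge under this rule.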