
Deep Learning - Lecture 4

The document discusses feedforward neural networks and learning rules. It explains the basic concepts of neural networks including layers, weights, and biases. It also covers the perceptron learning rule, delta rule, and their implementation in Python. Additionally, it addresses the limitations of single-layer networks for XOR and how multi-layer networks address this using backpropagation.

SIF910: Deep Learning

Lecture 4: Feedforward Neural Networks (1)

Chandra Kusuma Dewa


2021/2022
Outline for today
● Basic Concepts and Terminology for Neural Networks
● Representing Networks Components With Vectors and Matrices
● The Perceptron Learning Rule
● The Delta Rule
What is a neural network?

● Deep learning is machine learning with deep artificial neural networks
● The goal of today's lecture is to explain how shallow neural networks work
● We call these shallow neural networks simple feedforward neural networks

Image source: https://victorzhou.com/media/nn-series/network.svg


A simple neural network

● Basically, a neural network consists of three layers, namely: a) an input layer, b) a hidden layer, and c) an output layer
● Each layer consists of several neurons
● The network stores weights (w) and biases (b) to generate outputs from the given inputs
● During training, all weights and biases are updated to minimize the loss using a specified learning algorithm
The mechanism inside the neural network

Image source: https://www.jeremyjordan.me/content/images/2018/01/Screen-Shot-2017-11-07-at-12.32.19-PM.png


The Perceptron Learning Rule

To update the weights and biases inside a simple neural network, we can use the perceptron rule, commonly written as:

w_new = w_old + e·p
b_new = b_old + e

where e = (t − o) denotes the difference between the label t and the output of the model o. Moreover, p denotes the input of the neural network.
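As a quick sanity check, a single update step of the perceptron rule can be sketched as follows. This is a minimal illustration; the step activation and the zero initial weights are my own assumptions, not taken from the slide.

```python
import numpy as np

def step(x):
    """Hard-threshold activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

p = np.array([1.0, 0.0])   # input pattern
t = 0                      # target label (AND of 1 and 0)
w = np.array([0.0, 0.0])   # weights (illustrative zero init)
b = 0.0                    # bias

o = step(w @ p + b)        # current output: step(0) = 1
e = t - o                  # error e = (t - o) = -1
w = w + e * p              # perceptron rule: w_new = w_old + e*p
b = b + e                  # b_new = b_old + e
print(w, b)                # the weights move so as to reduce the error
```

After this single step the network already classifies this pattern correctly: step(w @ p + b) = step(-2) = 0.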
The Delta Rule

● This is another way to learn the weights (and bias) of the neural network during the training process, commonly formulated as:

w_new = w_old + α·e·p
b_new = b_old + α·e

● where α is the learning rate and e = (t − o) is the error, as before
Source: https://www.youtube.com/watch?v=7VV_fUe6ziw
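A single delta-rule update can be sketched similarly. Here the output is linear rather than thresholded, and the learning rate alpha = 0.1 and starting values are illustrative assumptions:

```python
import numpy as np

alpha = 0.1                # learning rate (illustrative choice)
p = np.array([1.0, 1.0])   # input pattern
t = 1.0                    # target
w = np.array([0.0, 0.0])   # weights
b = 0.0                    # bias

o = w @ p + b              # linear output (no hard threshold here)
e = t - o                  # error e = (t - o) = 1.0
w = w + alpha * e * p      # delta rule: w_new = w_old + alpha*e*p
b = b + alpha * e          # b_new = b_old + alpha*e
print(w, b)
```

Unlike the perceptron rule, the delta rule moves the weights by a fraction of the error on every step, so repeated passes shrink the error gradually instead of in unit jumps.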
Implementing The Rules

Suppose that we want to train a neural network to mimic the logical AND function:

  X1  X2 | AND
   1   1 |  1
   1   0 |  0
   0   1 |  0
   0   0 |  0

Image source: https://www.thomascountz.com/assets/images/linear-seperability.png


Implementing The Rules in Python (1)
Implementing The Rules in Python (2)
Implementing The Rules in Python (3)
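The Python slides above refer to code that is not reproduced here, so the following is a hedged sketch of what a perceptron-rule implementation for the AND table might look like. The epoch count, zero initialization, and variable names are my own choices, not taken from the slides.

```python
import numpy as np

def step(x):
    """Hard-threshold activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

# The AND truth table from the slide
X = np.array([[1, 1], [1, 0], [0, 1], [0, 0]], dtype=float)
T = np.array([1, 0, 0, 0])

w = np.zeros(2)   # weights
b = 0.0           # bias

# AND is linearly separable, so the perceptron rule converges;
# 10 epochs is more than enough for this tiny problem.
for epoch in range(10):
    for p, t in zip(X, T):
        o = step(w @ p + b)
        e = t - o         # error
        w += e * p        # perceptron rule updates
        b += e

preds = [step(w @ p + b) for p in X]
print(preds)  # → [1, 0, 0, 0], matching the AND column
```

The learned decision boundary w·p + b = 0 is a straight line separating (1, 1) from the other three points, which is exactly why the single-layer model works here.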
What About the XOR Problem?

Can we also use the previous neural network model, with the perceptron learning rule or the delta rule, on the XOR problem below?

  X1  X2 | XOR
   1   1 |  0
   1   0 |  1
   0   1 |  1
   0   0 |  0

Image source: https://miro.medium.com/max/2000/1*aN7_uKSN8iWUktGOKa1Vgg.png


Why Did It Fail? How Do We Fix It?

● The XOR problem is not linearly separable!
● Add hidden layers to the model → use a multilayer perceptron instead
● However, how do we update all the weights (including those in the hidden layer)? → using gradient descent / backpropagation / the delta rule

Image source: https://miro.medium.com/max/2000/1*iB0NCmG2OFUGI-DCokHT2g.png
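To see concretely how a hidden layer fixes XOR, here is a hand-built (not trained) two-layer network using the classic construction XOR = AND(OR(x1, x2), NAND(x1, x2)). All weights below are chosen by hand for illustration; a trained multilayer perceptron would find an equivalent solution via backpropagation.

```python
import numpy as np

def step(x):
    """Elementwise hard-threshold activation."""
    return (np.asarray(x) >= 0).astype(int)

# Hidden layer: neuron 1 computes OR, neuron 2 computes NAND
W1 = np.array([[1.0, 1.0],      # OR:   x1 + x2 - 1   >= 0
               [-1.0, -1.0]])   # NAND: -x1 - x2 + 1.5 >= 0
b1 = np.array([-1.0, 1.5])

# Output layer: AND of the two hidden units
W2 = np.array([1.0, 1.0])       # h1 + h2 - 1.5 >= 0
b2 = -1.5

def xor_net(x):
    h = step(W1 @ x + b1)       # hidden activations
    return int(step(W2 @ h + b2))

results = [xor_net(np.array(x, dtype=float))
           for x in [(1, 1), (1, 0), (0, 1), (0, 0)]]
print(results)  # → [0, 1, 1, 0], matching the XOR column
```

The hidden layer maps the four inputs into a space where the two classes become linearly separable, which the output neuron can then split with a single line.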
