
NEURAL NETWORK
Kartika Fithriasari / Data Mining
INSTITUT TEKNOLOGI SEPULUH NOPEMBER (ITS)
Surabaya - Indonesia
www.its.ac.id
Neural Networks
• Advantages
• prediction accuracy is generally high
• robust, works when training examples contain errors
• output may be discrete, real-valued, or a vector of several discrete or real-valued attributes
• fast evaluation of the learned target function
• Criticism
• long training time
• difficult to understand the learned function (weights)
• not easy to incorporate domain knowledge

Kartika F/Data Mining www.its.ac.id
NEURAL NETWORK

What is a Neural Network
• A neural network is defined as a computing system that consists of a number of simple but highly interconnected elements, or nodes, called 'neurons', which are organized in layers and process information using dynamic state responses to external inputs.

What is a Neural Network
• A neural network belongs to a branch of machine learning (a field of artificial intelligence) called deep learning. Deep learning is one of many families of machine learning algorithms that enable a computer to perform a plethora of tasks, such as:
• Prediction, like stock prediction, house rent prediction, or whether a person is eligible for a job or not.
• Classification, like classifying images of dogs, cats, and hotdogs, or whether an employee is good or bad for the company.

History of Neural Nets
• Warren McCulloch and Walter Pitts created the first model of an NN in 1943.
• In 1958, Frank Rosenblatt created the first ever model that could do pattern recognition.
• The first NNs that could be trained and had many layers were published by Alexey Ivakhnenko and Lapa in 1965.
• In 1975, Paul Werbos came up with back-propagation, which solved the XOR problem and in general made NN learning more efficient.
• In 2011, deep NNs started incorporating convolutional layers with max-pooling layers, whose output was then passed to several fully connected layers followed by an output layer. These are called Convolutional Neural Networks.
Important components of a Neural Network
• Units / Neurons
• Connections / Weights / Parameters
• Biases
• Activation Functions

Activation Functions

Types of Neural Networks
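The interplay of these components can be sketched with a single unit. The inputs, weights, bias, and choice of a sigmoid activation below are illustrative assumptions, not values from the slides:

```python
import math

def neuron(inputs, weights, bias):
    """One unit: the weighted sum of its inputs plus a bias,
    passed through a sigmoid activation function."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

out = neuron(inputs=[0.5, -1.0], weights=[0.8, 0.2], bias=0.1)
# z = 0.4 - 0.2 + 0.1 = 0.3, so out = sigmoid(0.3), roughly 0.574
```

The connections carry the weights, the bias shifts the unit's operating point, and the activation function makes the response nonlinear.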
JARGON
• Although many NN models are similar or identical to well-known
statistical models, the terminology in the NN literature is quite
different from that in statistics
➢variables are called features
➢independent variables are called inputs
➢predicted values are called outputs
➢dependent variables are called targets or training values
➢residuals are called errors
➢estimation is called training, learning, adaptation, or self-organization
➢an estimation criterion is called an error function, cost function, or Lyapunov function
➢observations are called patterns or training pairs
➢parameter estimates are called (synaptic) weights
➢interactions are called higher-order neurons
➢transformations are called functional links
➢regression and discriminant analysis are called supervised learning or
heteroassociation
➢data reduction is called unsupervised learning, encoding, or autoassociation
➢cluster analysis is called competitive learning or adaptive vector quantization
➢interpolation and extrapolation are called generalization
Perceptrons

• Single-layer Neural Networks
• A perceptron will learn to classify any linearly separable set of inputs
Learning in Perceptrons
• Rosenblatt, 1960
• Let y be the correct output, and f(x) the output function of the network.
• The neuron's output, 0 or 1, is determined by whether the weighted sum Σ_j w_j x_j is less than or greater than a threshold value
• Error: E = y − f(x)
• Update weights: w_j ← w_j + α x_j E
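The rule above can be sketched in a few lines. The function name `train_perceptron`, the zero initialization, and the AND data set are illustrative assumptions, not from the slides; the threshold is folded into a trainable bias:

```python
# The bias plays the role of the (negated) threshold: the unit fires (1)
# when the weighted sum plus bias exceeds zero, otherwise it outputs 0.
def train_perceptron(samples, alpha=1, epochs=20):
    """samples: list of (inputs, target) pairs, targets in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0] * n, 0
    for _ in range(epochs):
        for x, y in samples:
            f_x = 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0
            e = y - f_x                                  # E = y - f(x)
            w = [wj + alpha * xj * e for wj, xj in zip(w, x)]  # w_j += alpha*x_j*E
            b += alpha * e
    return w, b

# Logical AND is linearly separable, so the rule converges
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
```

For linearly separable data such as AND, the perceptron convergence theorem guarantees this loop settles on weights that classify every example correctly.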
Example: XOR
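As an illustrative sketch of why XOR is the classic counterexample (this check is an assumption-laden addition, not the slide's content): a brute-force search over a grid of integer weights finds no single linear threshold that reproduces XOR, and the standard algebraic argument extends this to all real weights, motivating multilayer networks.

```python
from itertools import product

# XOR truth table: output is 1 when exactly one input is 1
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def separates(w1, w2, b):
    """Does the linear threshold w1*x1 + w2*x2 + b > 0 reproduce XOR?"""
    return all((w1 * x1 + w2 * x2 + b > 0) == bool(y) for (x1, x2), y in xor)

# Exhaustive search over small integer weights: no separator exists
found = any(separates(w1, w2, b)
            for w1, w2, b in product(range(-10, 11), repeat=3))
```

`found` comes back `False`: satisfying the two "output 1" rows forces w1 + w2 + 2b > 0, which contradicts the two "output 0" rows.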
MULTILAYER PERCEPTRON

Architecture
[Figure: a feed-forward network with input units X1 … Xn, one hidden layer Z1 … Zp, and output units y1 … ym; weighted connections link each layer to the next, and each hidden and output unit also receives a bias via a constant input of 1.]
The BP (Back-Propagation) Learning Algorithm
• Step 0: Initialize the weights and biases; choose a sufficiently small learning rate α
• Step 1: While the stopping condition is not met, do steps 2-9
• Step 2: For each training pair (s, t), do steps 3-8
• Step 3: Present the input; each input is passed on to the next layer
• Step 4: Compute z_in,j = v_0j + Σ_i x_i v_ij and z_j = f(z_in,j)
• Step 5: Compute y_in,k = w_0k + Σ_j z_j w_jk and y_k = f(y_in,k)
• Step 6: δ_k = (t_k − y_k) f′(y_in,k), Δw_jk = α δ_k z_j
• Step 7: δ_in,j = Σ_k δ_k w_jk, δ_j = δ_in,j f′(z_in,j), Δv_ij = α δ_j x_i
• Step 8: w_jk(new) = w_jk(old) + Δw_jk, v_ij(new) = v_ij(old) + Δv_ij
• Step 9: Check the stopping condition
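Steps 0-9 can be sketched in NumPy for one hidden layer. The sigmoid activation, layer sizes, learning rate, random initialization, and fixed epoch count (standing in for step 9's stopping condition) are illustrative assumptions, not the lecture's code; for the sigmoid, f′(x) = f(x)(1 − f(x)), which is used for the δ terms:

```python
import numpy as np

def f(x):  # sigmoid activation; f'(x) = f(x) * (1 - f(x))
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

def train(X, T, n_hidden=4, alpha=0.5, epochs=5000):
    n_in, n_out = X.shape[1], T.shape[1]
    V = rng.normal(0, 0.5, (n_in + 1, n_hidden))   # v_ij, row 0 holds biases v_0j
    W = rng.normal(0, 0.5, (n_hidden + 1, n_out))  # w_jk, row 0 holds biases w_0k
    for _ in range(epochs):
        for x, t in zip(X, T):
            # Steps 3-5: forward pass through hidden and output layers
            z = f(V[0] + x @ V[1:])
            y = f(W[0] + z @ W[1:])
            # Step 6: output-layer deltas, using f'(y_in) = y * (1 - y)
            dk = (t - y) * y * (1 - y)
            # Step 7: hidden-layer deltas, using f'(z_in) = z * (1 - z)
            dj = (dk @ W[1:].T) * z * (1 - z)
            # Step 8: apply the weight (and bias) updates
            W[1:] += alpha * np.outer(z, dk)
            W[0] += alpha * dk
            V[1:] += alpha * np.outer(x, dj)
            V[0] += alpha * dj
    return V, W

# Train on XOR, which a single-layer perceptron cannot represent
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
V, W = train(X, T)
```

Updating per training pair, as here, is the stochastic (pattern-by-pattern) form of steps 2-8; batch variants accumulate the Δ terms before applying them.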
A Step by Step Backpropagation
• The basic structure:
• The goal of backpropagation is to optimize the weights so that the neural network can learn how to correctly map arbitrary inputs to outputs.
Example
• The initial weights, the biases, and training inputs/outputs:

https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
The Forward Pass

• The goal of this step is to calculate the total error for the neural network.
• To begin, let's see what the neural network currently predicts given the weights and biases above and inputs of 0.05 and 0.10. To do this we'll feed those inputs forward through the network.
• For example, the target output for o1 is 0.01 but the neural network outputs 0.75136507, therefore its error is:
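The forward pass can be reproduced numerically. The initial weights and biases below (w1..w8, b1 = 0.35, b2 = 0.60) are taken from the linked walkthrough, since the original figure is not included here:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Initial values quoted from the linked walkthrough
i1, i2 = 0.05, 0.10                       # inputs
t1, t2 = 0.01, 0.99                       # target outputs
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30   # input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55   # hidden -> output weights
b1, b2 = 0.35, 0.60                       # hidden and output biases

# Forward pass: hidden layer, then output layer
h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
o1 = sigmoid(w5 * h1 + w6 * h2 + b2)
o2 = sigmoid(w7 * h1 + w8 * h2 + b2)

# Total squared error, one term per output neuron
E_total = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2
```

Running this gives o1 ≈ 0.75136507 and a total error ≈ 0.298371109, matching the figures quoted in the text.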
The Backwards Pass
Output Layer
• Our goal with backpropagation is to update each of the weights in the network so that they cause the actual output to be closer to the target output, thereby minimizing the error for each output neuron and the network as a whole.
• Hidden Layer
• Finally, we’ve updated all of our weights! When we fed forward the
0.05 and 0.1 inputs originally, the error on the network was
0.298371109. After this first round of backpropagation, the total error
is now down to 0.291027924. It might not seem like much, but after
repeating this process 10,000 times, for example, the error plummets
to 0.0000351085. At this point, when we feed forward 0.05 and 0.1,
the two outputs neurons generate 0.015912196 (vs 0.01 target) and
0.984065734 (vs 0.99 target).
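The repeated forward/backward rounds described above can be sketched as follows. The initial values and the learning rate of 0.5 follow the linked walkthrough, and, as in that example, the biases are left unchanged and the hidden-layer deltas use the output weights from before the update:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

i = [0.05, 0.10]                      # inputs
t = [0.01, 0.99]                      # targets
W_h = [[0.15, 0.20], [0.25, 0.30]]    # input -> hidden weights (w1..w4)
W_o = [[0.40, 0.45], [0.50, 0.55]]    # hidden -> output weights (w5..w8)
b1, b2, lr = 0.35, 0.60, 0.5          # biases stay fixed, per the walkthrough

def step():
    """One forward pass plus one weight update; returns the total
    error measured before the update."""
    global W_h, W_o
    h = [sigmoid(sum(w * x for w, x in zip(row, i)) + b1) for row in W_h]
    o = [sigmoid(sum(w * x for w, x in zip(row, h)) + b2) for row in W_o]
    err = sum(0.5 * (tk - ok) ** 2 for tk, ok in zip(t, o))
    # Output-layer deltas: dE/d(net_o) = -(t - o) * o * (1 - o)
    d_o = [-(tk - ok) * ok * (1 - ok) for tk, ok in zip(t, o)]
    # Hidden-layer deltas use the OLD output weights (updated below)
    d_h = [sum(d_o[k] * W_o[k][j] for k in range(2)) * h[j] * (1 - h[j])
           for j in range(2)]
    W_o = [[W_o[k][j] - lr * d_o[k] * h[j] for j in range(2)] for k in range(2)]
    W_h = [[W_h[j][n] - lr * d_h[j] * i[n] for n in range(2)] for j in range(2)]
    return err

errors = [step() for _ in range(10000)]
```

The first two entries of `errors` reproduce the 0.298371109 and 0.291027924 quoted above, and after 10,000 rounds the error has collapsed toward the ~0.000035 level described in the text.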
April 20, 2020 Data Mining: Concepts and Techniques 48