Artificial Neural Networks

Artificial Neural Networks for Process Control
Puneet Kr Singh
M.Tech (FT), 1st Year
P K Singh, FOE, DEI
http://pksingh.webstarts.com/student_community.html
What is a Neural Network?
• A biologically motivated approach to machine learning.
• Modern digital computers outperform humans in the domain of numeric computation and related symbol manipulation.
• However, humans can effortlessly solve complex perceptual problems, such as recognizing a face in a crowd from a mere glimpse, at a speed and accuracy that dwarf the world's fastest computers.
[Figure: electron micrograph of a real neuron]
ANN as a Brain-Like Computer
An artificial neural network (ANN) is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. This means that:
• Knowledge is acquired by the network through a learning (training) process.
• The strength of the interconnections between neurons is implemented by means of synaptic weights, which are used to store the knowledge.
The learning process is a procedure for adapting the weights with a learning algorithm so as to capture the knowledge. More mathematically, the aim of the learning process is to map a given relation between the inputs and output(s) of the network.
The Brain
The human brain is still not well understood, and indeed its behavior is very complex!
There are about 10 billion neurons in the human cortex and 60 trillion synapses (connections).
The brain is a highly complex, nonlinear, parallel computer (information-processing system).
A Neuron

$$z = w_0 + w_1 x_1 + \dots + w_n x_n$$
$$f(x_1, \dots, x_n) = \varphi(w_0 + w_1 x_1 + \dots + w_n x_n)$$

where $f$ is the function to be learned, $x_1, \dots, x_n$ are the inputs, $\varphi$ is the activation function, and $z$ is the weighted sum.
Artificial Neuron: Classical Activation Functions

Linear activation: $\varphi(z) = z$

Logistic activation: $\varphi(z) = \dfrac{1}{1 + e^{-z}}$

Threshold activation: $\varphi(z) = \operatorname{sign}(z) = \begin{cases} 1, & \text{if } z \ge 0, \\ -1, & \text{if } z < 0. \end{cases}$

Hyperbolic tangent activation: $\varphi(u) = \tanh(u) = \dfrac{1 - e^{-2u}}{1 + e^{-2u}}$
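Taken together, the neuron and its activation functions are easy to state in code. Below is a minimal sketch in Python (mine, not from the slides) implementing the weighted sum $z$ and the four classical activations defined above; function names are my own.

```python
import math

def weighted_sum(w, x):
    """z = w0 + w1*x1 + ... + wn*xn (w[0] is the bias weight w0)."""
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))

def linear(z):
    return z

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def threshold(z):
    """sign(z): +1 if z >= 0, -1 otherwise."""
    return 1 if z >= 0 else -1

def tanh_act(u):
    return (1 - math.exp(-2 * u)) / (1 + math.exp(-2 * u))  # equals math.tanh(u)

def neuron(w, x, phi):
    """f(x1,...,xn) = phi(w0 + w1*x1 + ... + wn*xn)."""
    return phi(weighted_sum(w, x))

# Example: a threshold neuron with weights (w0, w1, w2) = (-1, 3, 3)
print(neuron([-1, 3, 3], [1, 1], threshold))   # z = -1 + 3 + 3 = 5 -> 1
```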
Neural Network
• A neural network learns by adjusting the weights so as to correctly classify the training data and hence, after the testing phase, to classify unknown data.
• A neural network needs a long time for training.
• A neural network has a high tolerance for noisy and incomplete data.
Learning
• The procedure that consists in estimating the parameters of the neurons (setting up the weights) so that the whole network can perform a specific task.
• There are 2 types of learning:
  • Supervised learning
  • Unsupervised learning
• Supervised learning incorporates an external teacher, so that each output unit is told what its desired response to input signals ought to be.
• Unsupervised learning uses no external teacher and is based only upon local information. It is also referred to as self-organization, in the sense that the network self-organizes the data presented to it and detects their emergent collective properties.
Threshold Neuron (Perceptron)
• The output of a threshold neuron is binary, while its inputs may be either binary or continuous.
• If the inputs are binary, a threshold neuron implements a Boolean function.
• The Boolean alphabet {1, -1} is usually used in neural network theory instead of {0, 1}.
• The correspondence with the classical Boolean alphabet {0, 1} is established by $y = 1 - 2x$: for $x \in \{0, 1\}$, $y \in \{1, -1\}$, so $0 \mapsto 1$ and $1 \mapsto -1$.
Threshold Boolean Functions: Geometrical Interpretation

"OR" (disjunction) is an example of a threshold (linearly separable) Boolean function: the "-1s" are separated from the "1" by a line.

x1   x2  | x1 ∨ x2
 1    1  |   1
 1   -1  |  -1
-1    1  |  -1
-1   -1  |  -1

XOR is an example of a non-threshold (not linearly separable) Boolean function: it is impossible to separate the "1s" from the "-1s" by any single line.

x1   x2  | x1 ⊕ x2
 1    1  |   1
 1   -1  |  -1
-1    1  |  -1
-1   -1  |   1

[Figures: the four points (±1, ±1) plotted in the plane for each function]
Threshold Neuron: Learning
• A main property of a neuron, and of a neural network, is the ability to learn from the environment and to improve performance through learning.
• A neuron (a neural network) learns about its environment through an iterative process of adjustments applied to its synaptic weights.
• Ideally, a network (a single neuron) becomes more knowledgeable about its environment after each iteration of the learning process.
Threshold Neuron: Learning
• Let T be the desired output of a neuron (of a network) for a certain input vector, and
• Y be the actual output of the neuron.
• If T = Y, there is nothing to learn.
• If T ≠ Y, then the neuron has to learn, in order to ensure that, after adjustment of the weights, its actual output coincides with the desired output.
Error-Correction Learning
• If T ≠ Y, then $\delta = T - Y$ is the error.
• The goal of learning is to adjust the weights in such a way that for the new actual output we have $\tilde{Y} = Y + \delta = T$: the updated actual output must coincide with the desired output.
• The error-correction learning rule determines how the weights must be adjusted to ensure that the updated actual output coincides with the desired output. With $W = (w_0, w_1, \dots, w_n)$ and $X = (1, x_1, \dots, x_n)$:

$$\tilde{w}_i = w_i + \alpha \delta x_i, \quad i = 0, 1, \dots, n$$

• α is the learning rate (it should be equal to 1 for the threshold neuron, when the function to be learned is Boolean).
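As an illustration (my code, not from the slides), here is a minimal sketch of this error-correction rule for a single threshold neuron with α = 1, trained on the linearly separable OR function from the geometric slide above (recall that in the {1, -1} alphabet used here, 1 encodes Boolean 0 and -1 encodes Boolean 1). The same loop never reaches zero errors on the XOR samples, consistent with XOR not being linearly separable.

```python
def sign(z):
    return 1 if z >= 0 else -1

def train_threshold_neuron(samples, alpha=1, epochs=20):
    """Error-correction learning: w_i <- w_i + alpha * (T - Y) * x_i,
    where x_0 = 1 is the constant input paired with the bias weight w_0."""
    w = [0, 0, 0]                       # (w0, w1, w2), arbitrary start
    for _ in range(epochs):
        errors = 0
        for x, target in samples:
            xs = [1] + list(x)          # prepend x0 = 1
            y = sign(sum(wi * xi for wi, xi in zip(w, xs)))
            delta = target - y          # the error T - Y
            if delta != 0:
                errors += 1
                w = [wi + alpha * delta * xi for wi, xi in zip(w, xs)]
        if errors == 0:                 # every sample classified correctly
            break
    return w

# OR (disjunction) in the {1, -1} alphabet, where 1 encodes "false":
or_samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
print(train_threshold_neuron(or_samples))   # -> a separating vector, e.g. [-2, 2, 2]
```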
A Simplest Network

[Figure: inputs x1 and x2 feed Neuron 1 and Neuron 2; the outputs of Neurons 1 and 2 feed Neuron 3]
Solving the XOR Problem Using the Simplest Network

XOR is realized as a composition of three threshold functions:
$$x_1 \oplus x_2 = \bar{x}_1 x_2 \vee x_1 \bar{x}_2 = f_3\big(f_1(x_1, x_2),\, f_2(x_1, x_2)\big)$$

[Figure: the network with weight vectors $\tilde{W}_1 = (1, -3, 3)$ for Neuron 1, $\tilde{W}_2 = (3, 3, -1)$ for Neuron 2, and $\tilde{W}_3 = (-1, 3, 3)$ for Neuron 3]
Solving the XOR Problem Using the Simplest Network

Neuron 1: $\tilde{W} = (1, -3, 3)$; Neuron 2: $\tilde{W} = (3, 3, -1)$; Neuron 3: $\tilde{W} = (-1, 3, 3)$. Neuron 3 takes the sign outputs of Neurons 1 and 2 as its inputs.

#    x1   x2 | Neuron 1: z, sign(z) | Neuron 2: z, sign(z) | Neuron 3: z, sign(z) | XOR = x1 ⊕ x2
1)    1    1 |   1,  1              |   5,  1              |   5,  1              |  1
2)    1   -1 |  -5, -1              |   7,  1              |  -1, -1              | -1
3)   -1    1 |   7,  1              |  -1, -1              |  -1, -1              | -1
4)   -1   -1 |   1,  1              |   1,  1              |   5,  1              |  1
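The table can be checked mechanically. A short sketch (my code, assuming sign(0) = 1 as before) that wires the three threshold neurons with the weight vectors above and reproduces all four rows:

```python
def sign(z):
    return 1 if z >= 0 else -1

def neuron(w, x1, x2):
    z = w[0] + w[1] * x1 + w[2] * x2
    return z, sign(z)

W1, W2, W3 = (1, -3, 3), (3, 3, -1), (-1, 3, 3)

for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    z1, y1 = neuron(W1, x1, x2)   # neuron 1
    z2, y2 = neuron(W2, x1, x2)   # neuron 2
    z3, y3 = neuron(W3, y1, y2)   # neuron 3 takes the outputs of 1 and 2
    print(x1, x2, '->', z1, y1, z2, y2, z3, y3)
```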
Neural Networks
• Components – biological plausibility
  • Neurone / node
  • Synapse / weight
• Feed-forward networks
  • Unidirectional flow of information
  • Good at extracting patterns, generalisation and prediction
  • Distributed representation of data
  • Parallel processing of data
  • Training: backpropagation
  • Not exact models, but good at demonstrating principles
• Recurrent networks
  • Multidirectional flow of information
  • Memory / sense of time
  • Complex temporal dynamics (e.g. CPGs)
  • Various training methods (Hebbian, evolution)
  • Often better biological models than FFNs
BACKPROPAGATION
• Backpropagation learns by iteratively processing a set of training data (samples).
• For each sample, the weights are modified to minimize the error between the network's classification and the actual classification.
Steps in the Backpropagation Algorithm
• STEP ONE: Initialize the weights and biases.
  • The weights in the network are initialized to random numbers from the interval [-1, 1].
  • Each unit has a bias associated with it.
  • The biases are similarly initialized to random numbers from the interval [-1, 1].
• STEP TWO: Feed the training sample.
Steps in the Backpropagation Algorithm (cont.)
• STEP THREE: Propagate the inputs forward; we compute the net input and output of each unit in the hidden and output layers.
• STEP FOUR: Backpropagate the error.
• STEP FIVE: Update the weights and biases to reflect the propagated errors.
• STEP SIX: Check the terminating conditions.
Backpropagation Formulas

[Figure: feed-forward network with an input vector x entering the input nodes, a hidden layer, and output nodes producing the output vector]

Net input to unit j: $I_j = \sum_i w_{ij} O_i + \theta_j$

Output of unit j: $O_j = \dfrac{1}{1 + e^{-I_j}}$

Error at output node k: $Err_k = O_k (1 - O_k)(T_k - O_k)$

Error at hidden node j: $Err_j = O_j (1 - O_j) \sum_k Err_k \, w_{jk}$

Weight update: $w_{ij} = w_{ij} + (l)\, Err_j\, O_i$

Bias update: $\theta_j = \theta_j + (l)\, Err_j$

where $l$ is the learning rate, $O_i$ the output of unit $i$, $T_k$ the target output, and $\theta_j$ the bias of unit $j$.
Example of Backpropagation

Initialize weights: Input = 3, Hidden neurons = 2, Output = 1. Random numbers from -1.0 to 1.0.

Initial input and weights:

x1  x2  x3 | w14  w15  w24  w25  w34  w35  w46  w56
1   0   1  | 0.2  -0.3 0.4  0.1  -0.5 0.2  -0.3 -0.2
Example (cont.)
• A bias is added to the hidden and output nodes.
• The biases are initialized to random values from -1.0 to 1.0.

Bias (random):

θ4    θ5    θ6
-0.4  0.2   0.1
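To make the example concrete, here is a minimal sketch (my reconstruction; the slides give only the initial state) that runs the backpropagation formulas above on this network: forward pass, error backpropagation, and one round of weight and bias updates. The target T = 1 and learning rate l = 0.9 are assumed values for illustration; the slides do not state them.

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

# STEPS 1-2: initial weights, biases, and training sample from the example
# (units 1-3: inputs, units 4-5: hidden, unit 6: output)
x = {1: 1, 2: 0, 3: 1}
w = {(1, 4): 0.2, (1, 5): -0.3, (2, 4): 0.4, (2, 5): 0.1,
     (3, 4): -0.5, (3, 5): 0.2, (4, 6): -0.3, (5, 6): -0.2}
theta = {4: -0.4, 5: 0.2, 6: 0.1}

T = 1    # assumed target class for this sample
l = 0.9  # assumed learning rate

# STEP 3: propagate inputs forward, I_j = sum_i w_ij * O_i + theta_j
O = dict(x)
for j in (4, 5):
    O[j] = logistic(sum(w[i, j] * O[i] for i in (1, 2, 3)) + theta[j])
O[6] = logistic(sum(w[j, 6] * O[j] for j in (4, 5)) + theta[6])

# STEP 4: backpropagate the error
Err = {6: O[6] * (1 - O[6]) * (T - O[6])}            # output node
for j in (4, 5):                                     # hidden nodes
    Err[j] = O[j] * (1 - O[j]) * Err[6] * w[j, 6]

# STEP 5: update weights and biases
for (i, j) in w:
    w[i, j] += l * Err[j] * O[i]
for j in theta:
    theta[j] += l * Err[j]

print({j: round(O[j], 3) for j in (4, 5, 6)})   # outputs O4, O5, O6
```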
Example: Voice Recognition
• Task: learn to discriminate between two different voices saying "Hello".
• Data
  • Sources: Steve Simpson, David Raubenheimer
  • Format: frequency distribution (60 bins)
  • Analogy: cochlea
• Network architecture
  • Feed-forward network
  • 60 inputs (one for each frequency bin)
  • 6 hidden units
  • 2 outputs (0-1 for "Steve", 1-0 for "David")
• Presenting the data
[Figure: example input frequency patterns for Steve and for David]
• Presenting the data (untrained network)
  • Steve: outputs 0.43 and 0.26
  • David: outputs 0.73 and 0.55
• Calculate error
  • Steve: |0.43 - 0| = 0.43, |0.26 - 1| = 0.74
  • David: |0.73 - 1| = 0.27, |0.55 - 0| = 0.55
• Backpropagate error and adjust weights
  • Steve: total error 0.43 + 0.74 = 1.17
  • David: total error 0.27 + 0.55 = 0.82
• Presenting the data (trained network)
  • Steve: outputs 0.01 and 0.99
  • David: outputs 0.99 and 0.01
• Results – Voice Recognition
  • Performance of trained network
    • Discrimination accuracy between known "Hello"s: 100%
    • Discrimination accuracy between new "Hello"s: 100%
Neural Network as Function Approximation
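The original slide shows this only as a figure. As a sketch of the idea (my code, with an assumed architecture and learning rate, not taken from the slides), a small 1-8-1 logistic network trained with the backpropagation rules above can approximate a smooth function such as a scaled sine:

```python
import math, random

random.seed(0)

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

H = 8                                                               # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]  # per unit: (bias, input weight)
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]                  # (bias, hidden weights)

def forward(x):
    h = [logistic(b + wx * x) for b, wx in w1]
    y = logistic(w2[0] + sum(w * hj for w, hj in zip(w2[1:], h)))
    return h, y

target = lambda x: 0.5 * (math.sin(2 * math.pi * x) + 1)  # maps into (0, 1)

l = 0.5  # assumed learning rate
for _ in range(20000):
    x = random.random()
    h, y = forward(x)
    err_out = y * (1 - y) * (target(x) - y)               # output-node error
    for j in range(H):                                    # hidden-node errors (old w2)
        err_h = h[j] * (1 - h[j]) * err_out * w2[j + 1]
        w1[j][0] += l * err_h
        w1[j][1] += l * err_h * x
    w2[0] += l * err_out
    for j in range(H):
        w2[j + 1] += l * err_out * h[j]

for x in (0.0, 0.25, 0.5, 0.75):
    print(x, round(forward(x)[1], 2), round(target(x), 2))  # prediction vs. target
```

With these settings the printed predictions typically track the target closely, though the quality of the fit depends on the random initialization and the number of iterations.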
Stabilizing Controller
• This scheme has been applied to the control of robot-arm trajectory, where a proportional controller with gain was used as the stabilizing feedback controller.
• The total input that enters the plant is the sum of the feedback control signal and the feed-forward control signal, which is calculated from the inverse dynamics model (the neural network).
• That model uses the desired trajectory as its input and the feedback control as its error signal. As the NN training advances, that error input will converge to zero.
• The neural network controller thus learns to take over from the feedback controller. The advantage of this architecture is that we can start with a stable system, even though the neural network has not yet been adequately trained (see the sketch below).
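A toy sketch of this idea (entirely illustrative: the first-order plant model, the gains, and the linear model standing in for the neural network are all my assumptions): a proportional controller stabilizes the plant while a feed-forward model of the inverse dynamics is trained with the feedback signal as its error, so the average feedback effort shrinks as learning advances.

```python
import math

# Assumed toy plant: x[k+1] = 0.9*x[k] + 0.5*u[k]
a, b = 0.9, 0.5
Kp = 2.0                   # proportional gain of the stabilizing controller
eta = 0.05                 # learning rate for the feed-forward model
theta = [0.0, 0.0]         # linear stand-in for the NN: u_ff = theta . (xd_next, xd)

x = 0.0
xd = lambda k: math.sin(0.05 * k)   # desired trajectory

for epoch in range(5):
    total_fb = 0.0
    for k in range(1000):
        phi = (xd(k + 1), xd(k))         # desired trajectory is the model input
        u_ff = theta[0] * phi[0] + theta[1] * phi[1]
        u_fb = Kp * (xd(k) - x)          # stabilizing feedback signal
        x = a * x + b * (u_fb + u_ff)    # plant step with the summed input
        # Feedback-error learning: the feedback signal is the training error
        theta[0] += eta * u_fb * phi[0]
        theta[1] += eta * u_fb * phi[1]
        total_fb += abs(u_fb)
    print(epoch, round(total_fb / 1000, 4), [round(t, 2) for t in theta])
```

As the feed-forward model converges toward the plant's inverse dynamics, the printed average feedback effort per epoch decreases, which is the behaviour the slide describes.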
[Figure: stabilizing controller architecture, with the feedback controller and the neural-network feed-forward controller summing into the plant input]
Image Recognition: Decision Rule and Classifier
• Is it possible to formulate (and formalize!) a decision rule by which we can classify or recognize our objects based on the selected features?
• Can you propose a rule by which we can definitely decide whether it is a tiger or a rabbit?
Image Recognition: Decision Rule and Classifier
• Once we know our decision rule, it is not difficult to develop a classifier that performs classification/recognition using the selected features and the decision rule.
• However, if the decision rule cannot be formulated and formalized, we should use a classifier that can develop the rule through a learning process.
• In most recognition/classification problems, formalizing the decision rule is very complicated, or outright impossible.
• A neural network is a tool that can accumulate knowledge from the learning process.
• After the learning process, a neural network is able to approximate a function that serves as our decision rule.
Why a Neural Network?

$f(x_1, \dots, x_n)$ - the unknown multi-factor decision rule.

The learning process, using a representative learning set, produces:

$(w_0, w_1, \dots, w_n)$ - a set of weighting vectors, the result of the learning process, and

$$\hat{f}(x_1, \dots, x_n) = P(w_0 + w_1 x_1 + \dots + w_n x_n)$$

- a partially defined function, which is an approximation of the decision-rule function.
Mathematical Interpretation of Classification in Decision Making

1. Quantization of the pattern space into p decision classes: a mapping $F: \mathbb{R}^n \to \{m_1, m_2, \dots, m_p\}$ assigns every point of the pattern space $\mathbb{R}^n$ to one of the decision classes $m_1, \dots, m_p$.

[Figure: the pattern space partitioned into regions labelled $m_1, m_2, m_3, \dots, m_p$, with input patterns $x_i$ and responses $y_i$]

2. Mathematical model of quantization: "Learning by Examples". The network is presented with input patterns $x_i = \left(x_1^{(i)}, \dots, x_n^{(i)}\right)$ together with the desired responses $y_i = \left(y_1^{(i)}, \dots, y_n^{(i)}\right)$.
Application of Artificial Neural Networks in a Fault Detection Study of a Batch Esterification Process
• The complexity of most chemical plants tends to create problems for monitoring and supervision systems.
• Prompt fault detection and diagnosis is the best way to handle and tackle this problem.
• Different methods tackle the problem from different angles. One popular method is the artificial neural network, a powerful tool for fault detection systems.
• Here, the production of ethyl acetate by a reaction of acetic acid and ethanol in a batch reactor is considered.
• A neural network covering normal and faulty events is trained on the data collected from the experiment.
• The relationship between normal and faulty events is captured by the trained network topology.
• The ability of neural networks to detect process faults rests on their ability to learn from examples while requiring little knowledge about the system structure.
• CONCLUSION: Fault diagnosis for a pilot-plant batch esterification process is investigated in this work using a feed-forward neural model implemented as a multilayer perceptron. The effects of catalyst concentration and catalyst volume are studied and classified successfully using the neural process model. The results show that the neural network is able to detect and isolate the two fault studies with good pattern classification.
Temperature Control in Fermenters: Application of Neural Nets and Feedback Control in Breweries
• The main objective of on-line quality control in fermentation is to make the production processes as reproducible as possible.
• Since temperature is the main control parameter in the beer fermentation process, it is of primary interest to keep it close to the predefined set point. Here, we report on a model-supported temperature controller for large production-scale beer fermenters.
• The dynamic response of the temperature in the tank to temperature changes in the cooling elements has been modeled by means of a difference equation.
• The heat production within the tank is taken into account by means of a model for the substrate degradation.
• Any optimization requires a model to predict the consequences of actions. Instead of a conventional mathematical model of the fermentation kinetics, an artificial neural network approach has been used.
• The set-point profiles for the temperature control have been dynamically optimized in order to minimize the production cost while meeting the constraints posed by the product-quality requirements.
Applications of Artificial Neural Networks

[Figure: mind map centred on "Artificial Intellect with Neural Networks", with branches: Intelligent Control, Technical Diagnostics, Intelligent Data Analysis and Signal Processing, Advanced Robotics, Machine Vision, Image & Pattern Recognition, Intelligent Security Systems, Intelligent Medicine Devices, Intelligent Expert Systems]
Applications: Classification

Business
• Credit rating and risk assessment
• Insurance risk evaluation
• Fraud detection
• Insider dealing detection
• Marketing analysis
• Signature verification
• Inventory control

Security
• Face recognition
• Speaker verification
• Fingerprint analysis

Engineering
• Machinery defect diagnosis
• Signal processing
• Character recognition
• Process supervision
• Process fault analysis
• Speech recognition
• Machine vision
• Radar signal classification

Medicine
• General diagnosis
• Detection of heart defects

Science
• Recognising genes
• Botanical classification
• Bacteria identification
Applications: Modeling

Business
• Prediction of share and commodity prices
• Prediction of economic indicators

Engineering
• Transducer linearisation
• Colour discrimination
• Robot control and navigation
• Process control
• Aircraft landing control
• Car active-suspension control
• Printed circuit auto-routing
• Integrated circuit layout
• Image compression

Science
• Prediction of the performance of drugs from the molecular structure
• Weather prediction
• Sunspot prediction

Medicine
• Medical imaging and image processing
Applications: Forecasting
• Future sales
• Production requirements
• Market performance
• Economic indicators
• Energy requirements
• Time-based variables
Applications: Novelty Detection
• Fault monitoring
• Performance monitoring
• Fraud detection
• Detecting rare features
• Different cases
Thank you
For any suggestions:
http://pksingh.webstarts.com/student_community.html