Neural Networks



See also: http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html · Neural network · Artificial neural network

An **artificial neural network (ANN)**, usually called a "neural network" (NN), is a mathematical or computational model that tries to simulate the structure and/or functional aspects of biological neural networks.

Traditional AI programs from the 1960s:
1. Logic Theorist (Newell, Shaw & Simon) -- proved many mathematical theorems via algorithms, e.g. by replacing a problem with a series of simpler subproblems.
2. General Problem Solver (Newell & Simon) -- used heuristics (think-aloud protocols) and algorithms to solve problems.
3. Elementary Perceiver and Memorizer (EPAM; Feigenbaum & Simon) -- modeled verbal learning behavior (rote learning of nonsense syllables) using a discrimination net.

Different types of NN are possible:
1. Single-layer or multi-layer architectures (Hopfield, Kohonen).
2. Data processing through the network: feedforward or recurrent.
3. Variations in nodes: the number of nodes; the types of connections among nodes in the network.
4. Learning algorithms: supervised; unsupervised (self-organizing); backpropagation learning (training).
5. Implementation: software or hardware.

Neural network computers differ from traditional computers in that they are not explicitly programmed; instead, they must learn from experience. In this way they can solve problems that serial computers cannot. They are also capable of making judgements and of filling in missing or distorted information.

**What you need to build models:**
* Know enough neuroscience to understand why computer models make certain approximations.
* Understand when approximations are good and when they are bad:
  * know the tools of formal analysis for models;
  * some simple mathematics;
  * access to a simulator, or the ability to program.
* Know enough cognitive science to have some idea of what the system is supposed to do.

**Properties of neural networks:**
* Use massive interconnection of simple computing cells (neurons or processing units).
* Acquire knowledge through learning.
* Modify the synaptic weights of the network in an orderly fashion to attain the desired design objective.

**Neural networks are composed of nodes and connections:**
* Nodes are similar to neurons -- they receive inputs from other sources.
* Excitatory inputs tend to increase a neuron's rate of firing; inhibitory inputs tend to decrease it.
  * The firing rate changes via a real-valued number (**activation**).
  * Input to a node comes from other nodes or from some external source.
* Inputs to a node are usually summed (Σ).
* The net input is passed through an activation function (**f(net)**), often a sigmoid function.
* This produces the node's activation, which is sent on to other nodes.
* Each input line (connection) represents the flow of activity from some other neuron or some external source.
* In many models, connections between different nodes can have different potency (**//connection strength//**).
* Strength is represented by a real-valued number (the connection weight).
* Input from one node to another is multiplied by the connection weight:
  * if the **//connection weight//** is a negative number, the input is inhibitory;
  * if it is a positive number, the input is excitatory.
* Each input (from a different node) is calculated by multiplying the **activation value of the input node by the weight on the connection** (from input node to receiving node).

**Steps in designing a neural network:**
1. Arrange neurons in various layers.
2. Decide the types of connections among neurons in different layers, as well as among neurons within a layer.
3. Decide the way a neuron receives input and produces output.
4. Determine the strengths of connections within the network by allowing the network to learn appropriate values of the connection weights from a training data set.

**Learning -- Backpropagation of Error (Rumelhart, Hinton & Williams, 1986):**
* Also known as the Generalized Delta Rule (δ).
* Begin with a network whose initial weights have been assigned at random:
  * usually drawn from a uniform distribution with a mean of 0.0 and some user-defined upper and lower bounds (e.g. ±1.0).
* The user has a set of training data in the form of input/output pairs.
* The goal of training is to learn a single set of weights such that any input pattern will produce the correct output pattern:
  * this is desirable if the weights allow the network to generalize to novel data not seen during training.

**3 ways developmental models handle change:**
1. Development results from working out predetermined behaviors; change is the triggering of innate knowledge.
2. Change is inductive learning; learning involves copying or internalizing behaviors present in the environment.
3. Change arises through the interaction of maturational factors, under genetic control, and the environment.

**Examples of connectionist models and ANNs:**
1. Speed of reading words.
2. U-shaped learning curve when learning the past tense of verbs.
3. Face recognition from various angles (view invariance).

**Why build models?**
1. Explicitness -- constructing a model of a theory and implementing it as a computer program requires a great level of detail.
2. Prediction -- it is difficult to predict the consequences of a model due to interactions between its different parts; connectionist models are non-linear.
3. Discover and test new experiments and novel situations.
4. Practical reasons: it may be difficult to test a theory in the real world, whereas a model lets you systematically vary parameters through their full range of possible values.
5. Help understand why a behavior might occur: simulations are open to direct inspection → an explanation of the behavior.

**Caution is required with models:**
* Computer models are metaphors of human behavior.
* Must consider both the neural plausibility **__and__** the psychological plausibility of a model.
* The model should learn behaviors in ways similar to humans.
* Give models the same kind of information humans get.
* There is no guarantee that a model is an accurate mirror of the human; aim for close correspondence.
* However, computer models can liberate us from biases and preconceptions.

**Ways to evaluate network performance:**
1. Global error.
2. Individual pattern error.
3. Analyzing weights and internal representations:
  * hierarchical clustering of hidden unit activations;
  * principal component analysis and projection pursuit;
  * activation patterns in conjunction with the actual weights.

**Boolean functions** AND, OR, XOR, and how to implement them in tlearn.
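The backpropagation procedure and the XOR problem can be sketched in plain Python as a minimal stand-in for a simulator such as tlearn. This is an illustrative sketch, not the course's reference implementation: the 2-2-1 architecture, learning rate, epoch count, and random seed are all assumed here, and weights are initialized from a uniform distribution in ±1.0 as described above.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR training data as input/output pairs.
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# 2-2-1 network; initial weights drawn uniformly from [-1.0, +1.0].
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # per hidden node: 2 inputs + bias
w_out = [random.uniform(-1, 1) for _ in range(3)]                      # 2 hidden nodes + bias

def forward(inp):
    # Each node sums its weighted inputs and passes the net input through the sigmoid.
    h = [sigmoid(w[0] * inp[0] + w[1] * inp[1] + w[2]) for w in w_hid]
    o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    return h, o

def total_error():
    # Global error: summed squared error over all training patterns.
    return sum((target - forward(inp)[1]) ** 2 for inp, target in DATA)

err_before = total_error()

LR = 0.5  # learning rate (an illustrative choice)
for _ in range(20000):
    for inp, target in DATA:
        h, o = forward(inp)
        # Generalized delta rule: delta = error * derivative of the sigmoid.
        d_out = (target - o) * o * (1 - o)
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):                      # output weights
            w_out[j] += LR * d_out * h[j]
        w_out[2] += LR * d_out                  # output bias
        for j in range(2):                      # hidden weights and biases
            for k in range(2):
                w_hid[j][k] += LR * d_hid[j] * inp[k]
            w_hid[j][2] += LR * d_hid[j]

err_after = total_error()
print(err_before, err_after)
```

With enough epochs the network usually learns XOR, although backpropagation can occasionally settle in a local minimum; what is guaranteed here is only that training drives the global error down from its initial random-weight value.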
 * __Artificial Neural Networks__
 * **Approaches to studying the brain**
 * **//Neural network//** -- "a machine that is designed to model the way in which the brain performs a particular task or function of interest" (Haykin, 1994, p. 2).
 * **Nodes** -- simple processing units.
 * **Connections** -- input travels along **//connection lines//**.
 * **net_i = Σ_j w_ij a_j** -- net input to node i, where:
   * Σ = sigma (summation over j);
   * i = the receiving node;
   * a_j = the activation on node j sending to node i;
   * w_ij = the weight on the connection between nodes j and i.
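The net-input formula can be checked numerically. The weights and activations below are made-up example values, and the sigmoid is used as the activation function f(net) as described elsewhere in these notes:

```python
import math

# Made-up example values: activations a_j of three sending nodes,
# and the weights w_ij on their connections into receiving node i.
a = [1.0, 0.5, -0.3]   # a_j
w = [0.8, -0.4, 0.6]   # w_ij

# net_i = sum over j of w_ij * a_j
net_i = sum(w_j * a_j for w_j, a_j in zip(w, a))

# f(net): a sigmoid activation function, producing node i's activation.
activation = 1.0 / (1.0 + math.exp(-net_i))

print(net_i)       # 0.8*1.0 + (-0.4)*0.5 + 0.6*(-0.3) = 0.42
print(activation)  # sigmoid(0.42), roughly 0.60
```

Note that the negative weight makes its input inhibitory (it lowers net_i), while the positive weights are excitatory.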
 * **Steps in Designing a Neural Network**
 * **Why Build Models?** -- "… a model is simply a detailed theory."
 * **Caution Is Required With Models**
 * **Ways to evaluate network performance:**
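The first two evaluation measures named in these notes, individual pattern error and global error, can be sketched as follows. The targets and outputs are made-up example values, and squared error is assumed as the error measure:

```python
# Made-up example: network outputs vs. target outputs for four patterns.
targets = [0.0, 1.0, 1.0, 0.0]
outputs = [0.1, 0.8, 0.9, 0.2]

# Individual pattern error: squared error for each input/output pair.
pattern_errors = [(t - o) ** 2 for t, o in zip(targets, outputs)]

# Global error: individual pattern errors summed over the training set.
global_error = sum(pattern_errors)

print(pattern_errors)  # roughly [0.01, 0.04, 0.01, 0.04]
print(global_error)    # roughly 0.10
```

Global error summarizes overall fit, while the per-pattern errors show which inputs the network handles poorly; the remaining measures (clustering of hidden activations, PCA) require inspecting the trained weights themselves.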