Neural Networks
Summary
A neural network is a system composed of many simple processing elements operating in parallel which can acquire, store, and utilize experiential knowledge. The components of a neural network are weights, a threshold or bias, a summing unit, and an activation function. There is a weight associated with each input/output. A neuron usually receives many simultaneous inputs, and each input has its own relative weight, which determines the impact that input has on the processing element's summation function. A threshold is a minimum value that is predefined along with a bias value. The threshold, or transfer, function is generally non-linear; linear or straight-line functions are limited because the output is simply proportional to the input. The first step in a processing element's operation is to compute the weighted sum of all of its inputs (which may be binary). The activation function then determines whether the neural unit fires or not.
Things to Remember
- A neural network is a system composed of many simple processing elements operating in parallel which can acquire, store, and utilize experiential knowledge.
- The components of a neural network are weights, a threshold or bias, a summing unit, and an activation function.
- There is a weight associated with each input/output. A neuron usually receives many simultaneous inputs, and each input has its own relative weight, which determines the impact that input has on the processing element's summation function.
- A threshold is a minimum value that is predefined along with a bias value. The threshold, or transfer, function is generally non-linear.
- The first step in a processing element's operation is to compute the weighted sum of all of its inputs (which may be binary).
- The activation function determines whether the neural unit fires or not.

Neural Networks
A neural network is a system composed of many simple processing elements operating in parallel which can acquire, store, and utilize experiential knowledge. The components of a neural network are presented below:
- Weight: There is a weight associated with each input/output. A neuron usually receives many simultaneous inputs, and each input has its own relative weight, which determines the impact that input has on the processing element's summation function. These weights perform a function similar to the varying synaptic strengths of biological neurons.
- Threshold/Bias: A threshold is a minimum value that is predefined along with a bias value. The threshold, or transfer, function is generally non-linear. Linear or straight-line functions are limited because the output is simply proportional to the input.
- Summing unit: The first step in a processing element's operation is to compute the weighted sum of all of its inputs (which may be binary). Mathematically, the inputs and the corresponding weights are vectors, which can be represented as (x1, x2, ..., xn) and (w1, w2, ..., wn). The total input signal is the dot, or inner, product of these two vectors. That is,
sum = x1w1 + x2w2 + x3w3 + ... + xnwn
- Activation function: It determines whether the neural unit fires or not. A short sketch of how these components fit together is given after this list.
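The following is a minimal sketch in Python of how these components fit together for a single neuron, assuming a hard step activation; the function and variable names are illustrative and not taken from any particular library.

```python
# Minimal single-neuron sketch: weights, summing unit, threshold, activation.

def step_activation(total, threshold):
    """Fire (output 1) only when the weighted sum reaches the threshold."""
    return 1 if total >= threshold else 0

def neuron_output(inputs, weights, threshold):
    # Summing unit: sum = x1w1 + x2w2 + ... + xnwn (dot product of the two vectors)
    total = sum(x * w for x, w in zip(inputs, weights))
    # Activation function: decides whether the neural unit fires.
    return step_activation(total, threshold)

# Example: three inputs with their relative weights.
inputs = [1, 0, 1]
weights = [0.4, 0.7, 0.3]
print(neuron_output(inputs, weights, threshold=0.5))  # 1*0.4 + 0*0.7 + 1*0.3 = 0.7 >= 0.5 -> fires (1)
```

In many formulations the threshold is folded into the sum as a bias term, and a smoother non-linear activation (such as the sigmoid) replaces the hard step, so that the output is not simply proportional to the input.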