
Anna University Exams Regulation 2017

Rejinpaul.com Anna University Semester Exam Important Questions


BM6701 PATTERN RECOGNITION AND NEURAL NETWORKS
PART B & PART C QUESTIONS

1. With a neat example, explain the concept of the k-nearest neighbour classifier.
2. Define discriminant function. Explain the concept of using a decision boundary for two-class problems.
3. How are kernel and window estimators used in the estimation of a probability density function?
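A minimal sketch of the k-nearest-neighbour classifier asked about in Question 1 (pure Python; the toy two-class data set here is invented for illustration):

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points.

    `train` is a list of ((x, y), label) pairs; distance is Euclidean.
    """
    by_dist = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Toy two-class data set (illustrative only)
train = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
         ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]
print(knn_classify(train, (0.5, 0.5), k=3))  # query point near class A
```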
4. Perform hierarchical clustering of the following patterns (1, 3), (2, 4), (1, 0.5), (3, 5), (4, 7) using the
average linkage algorithm.
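Question 4 can be checked numerically; a rough sketch of agglomerative clustering with average linkage (pure Python; stopping at two clusters is an illustrative choice, since the question does not specify a cut level):

```python
import math

def average_linkage(points, target_clusters=2):
    """Agglomerative clustering: repeatedly merge the two clusters whose
    average pairwise point distance is smallest."""
    clusters = [[p] for p in points]

    def avg_dist(a, b):
        return sum(math.dist(p, q) for p in a for q in b) / (len(a) * len(b))

    while len(clusters) > target_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: avg_dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

pts = [(1, 3), (2, 4), (1, 0.5), (3, 5), (4, 7)]
clusters = average_linkage(pts, target_clusters=2)
print(clusters)
```

The outlier (1, 0.5) ends up in its own cluster, which matches the hand computation.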
5. Perform a partitional clustering of the data below using the k-means algorithm. Set K = 2 and use the
first two samples in the list as seed points. Show the values of the centroids and the nearest seed points.

   Sample    X     Y
   1         0.0   0.0
   2         0.5   0.0
   3         0.0   2.0
   4         2.0   2.0
   5         2.5   8.0
   6         6.0   3.0
   7         7.0   3.0
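A minimal k-means sketch for Question 5, seeding with the first two samples as the question specifies (pure Python; convergence is detected by a simple fixed-point check on the centroids):

```python
import math

def kmeans(points, seeds, max_iter=100):
    """Lloyd's algorithm: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its assigned points."""
    centroids = [tuple(s) for s in seeds]
    for _ in range(max_iter):
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda j: math.dist(p, centroids[j]))
            clusters[j].append(p)
        new = [tuple(sum(c) / len(pts_) for c in zip(*pts_)) if pts_
               else centroids[j]
               for j, pts_ in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters

samples = [(0.0, 0.0), (0.5, 0.0), (0.0, 2.0), (2.0, 2.0),
           (2.5, 8.0), (6.0, 3.0), (7.0, 3.0)]
centroids, clusters = kmeans(samples, seeds=samples[:2])
print(centroids)
```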
6. Discuss the complete linkage algorithm with an example.
7. Elaborate the k-means algorithm with a suitable example.
8. Explain the method of estimating a probability density function using a histogram.
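For Question 8, the histogram estimate divides bin counts by n times the bin width so the result integrates to 1. A minimal sketch (the sample data is invented for illustration):

```python
def histogram_density(data, bins=5):
    """Return (bin_edges, densities) with density = count / (n * width)."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in data:
        j = min(int((x - lo) / width), bins - 1)  # clamp the max value into the last bin
        counts[j] += 1
    edges = [lo + j * width for j in range(bins + 1)]
    densities = [c / (len(data) * width) for c in counts]
    return edges, densities

data = [1.0, 1.2, 1.9, 2.4, 2.5, 3.1, 3.3, 4.0, 4.8, 5.0]
edges, dens = histogram_density(data, bins=4)
print(dens)
```

Multiplying each density by the bin width and summing gives 1, which is the defining property of the estimate.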

9. Explain the ADALINE network with a neat sketch.


10. a) Explain the difference between an artificial neural network and a biological neural network.
b) Discuss the various types of neural network architectures.
11. Explain the algorithm for the BAM neural network and give its activation function.
12. (i) Use the Hebb rule to find the weight matrix to store the following (binary) input-output pattern pairs:

x(1) = (1 0 1)   y(1) = (1 0)
x(2) = (0 1 0)   y(2) = (0 1)
(ii) Using the binary step function (with threshold 0) as the activation function for both layers, test the
response of the network in both directions on each of the binary training patterns. In each case, when
presenting the input pattern to one layer, the initial activation of the other layer is set to zero.
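Question 12 can be verified directly. A sketch using the common convention of converting the binary patterns to bipolar form before applying the Hebb rule (that conversion is an assumption, since the question states the patterns in binary):

```python
def bipolar(v):
    """Map binary {0, 1} components to bipolar {-1, +1}."""
    return [2 * x - 1 for x in v]

def hebb_weights(pairs):
    """W[i][j] = sum over pattern pairs of s_i * t_j, in bipolar form."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = [[0] * m for _ in range(n)]
    for x, y in pairs:
        s, t = bipolar(x), bipolar(y)
        for i in range(n):
            for j in range(m):
                W[i][j] += s[i] * t[j]
    return W

def forward(x, W):
    """Binary step (threshold 0) on the net input, fed with the binary input."""
    net = [sum(x[i] * W[i][j] for i in range(len(x))) for j in range(len(W[0]))]
    return [1 if v > 0 else 0 for v in net]

pairs = [([1, 0, 1], [1, 0]), ([0, 1, 0], [0, 1])]
W = hebb_weights(pairs)
print(W)                      # stored weight matrix
print(forward([1, 0, 1], W))  # recall in the forward direction
```

Both training inputs recall their stored outputs, confirming the hand-derived weight matrix.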

13. (i) Explain the architecture of MADALINE.
(ii) Provide suitable applications of MADALINE.
14. Draw the architecture of the back-propagation network. Explain the training algorithm of BPN.
15. Using the Hebb rule, find the weights required to perform the following classification of the input
patterns shown below. Each pattern is shown as a 3 x 3 matrix of squares. The "+" symbol represents
the value "1" and empty squares (shown here as ".") indicate "-1". Consider "I" to belong to the class
(target value 1) and "O" not to belong to the class (target value -1).

+ + +      + + +
. + .      + . +
+ + +      + + +
 "I"        "O"
16. Construct and test a BAM network to associate the letters E and F with simple bipolar input and output
vectors. The target output for E is (-1, 1) and for F is (1, 1). The display matrix is 5 x 3.

* * *      * * *
* . .      * . .
* * *      * * *
* . .      * . .
* * *      * . .
 "E"        "F"
17. Construct a perceptron network to implement the AND function and state its limitations.
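For Question 17, a minimal perceptron trained on the AND function with bipolar inputs and targets (the learning rate and epoch count are illustrative choices):

```python
def train_perceptron(samples, lr=1.0, epochs=10):
    """Perceptron learning rule: update w and b only when the output is wrong."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
            if y != t:
                w = [w[0] + lr * t * x[0], w[1] + lr * t * x[1]]
                b += lr * t
    return w, b

# AND with bipolar inputs and targets
and_samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]
w, b = train_perceptron(and_samples)
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
print([predict(x) for x, _ in and_samples])  # → [1, -1, -1, -1]
```

The limitation the question asks for is that a single perceptron can only realize linearly separable functions, so the same procedure fails on XOR.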
18. Check the auto-associative network for the input vector [1 1 -1]. Form the weight vector with no
self-connection. Test whether the net is able to recognize the pattern with one missing entry:
a. [1 0 -1]
b. [1 1 0]
19. Draw the basic architecture of the discrete Hopfield network and give its training algorithm.
20. Explain the application of the Kohonen SOM using the travelling salesman problem.
21. Find the weight matrix of a BAM network (with bipolar vectors) to map two simple letters (given by 5 x 3
patterns) to the following bipolar codes, and obtain the response of the system with E and H as inputs.
The target for E is (-1, 1) and for H is (1, 1).

22. Draw the architecture and explain the training algorithm of bidirectional associative memory in detail.
23. Give the algorithm for the full CPN. Write the applications of CPN.
24. Give the architecture of the full counter-propagation network and explain how the network is trained.
25. A Kohonen SOM has 2 inputs and 5 cluster units. Find the new weights for the network when the input
vector is (0.3, 0.4). Use a learning rate of 0.3; the weights are given as w1 = (0.2, 0.6, 0.4, 0.4, 0.2) and
w2 = (0.3, 0.5, 0.7, 0.6, 0.8). Also find the new weights if Cj-1 and Cj+1 are allowed to learn when Cj is
the winning node.
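The single update step in Question 25 can be sketched as follows (winner chosen by minimum Euclidean distance; reading w1 and w2 as the first and second input coordinate of each cluster unit is an interpretation of the question's notation):

```python
import math

def som_step(weights, x, lr):
    """One Kohonen update: find the winning unit, then move its weight
    vector toward the input by a fraction `lr`."""
    winner = min(range(len(weights)), key=lambda j: math.dist(weights[j], x))
    weights[winner] = [w + lr * (xi - w) for w, xi in zip(weights[winner], x)]
    return winner, weights

w1 = [0.2, 0.6, 0.4, 0.4, 0.2]   # first input coordinate of each unit
w2 = [0.3, 0.5, 0.7, 0.6, 0.8]   # second input coordinate of each unit
weights = [list(pair) for pair in zip(w1, w2)]
winner, weights = som_step(weights, x=(0.3, 0.4), lr=0.3)
print(winner, weights[winner])
```

Unit 1 (index 0) wins, and its weight vector moves from (0.2, 0.3) to (0.23, 0.33); allowing the neighbours Cj-1 and Cj+1 to learn simply applies the same update line to those indices as well.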
26. A Hopfield network made up of 5 neurons is required to store the following three fundamental
memories: E1 = [+1, +1, +1, +1, +1]T; E2 = [+1, -1, -1, +1, -1]T; E3 = [-1, +1, -1, +1, +1]T. Evaluate the
5 x 5 synaptic weight matrix.
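The weight matrix in Question 26 follows from the outer-product (Hebbian) storage rule with the diagonal zeroed; a sketch:

```python
def hopfield_weights(memories):
    """W = sum of outer products of the stored patterns, diagonal set to 0."""
    n = len(memories[0])
    W = [[0] * n for _ in range(n)]
    for e in memories:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += e[i] * e[j]
    return W

E1 = [1, 1, 1, 1, 1]
E2 = [1, -1, -1, 1, -1]
E3 = [-1, 1, -1, 1, 1]
W = hopfield_weights([E1, E2, E3])
for row in W:
    print(row)
```

The resulting matrix is symmetric with a zero diagonal, as every Hopfield weight matrix must be.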

These questions are expected for the university exams this session; they may or may not be asked in the exams.
Please do not copy or republish these questions. Students, if you find the same questions in other sources,
kindly report to us at [email protected]

