**B.E. (Computer) PATTERN RECOGNITION (2008 Pattern) (Semester – II) (Elective – III)**

Time : 3 Hours] [Max. Marks : 100

Instructions to the candidates :-

**1)** Attempt Q. 1 or Q. 2, Q. 3 or Q. 4, Q. 5 or Q. 6 from Section I.

Attempt Q. 7 or Q. 8, Q. 9 or Q. 10, Q. 11 or Q. 12 from Section II.

**2)** Answers to the two sections should be written in separate answer books.

**3)** Neat diagrams must be drawn wherever necessary.

**4)** Figures to the right indicate full marks.

**5)** Assume suitable data, if necessary.

SECTION – I

Q1) a) Differentiate between supervised learning and unsupervised learning. [8]

b) Explain the concept of feature extraction in a pattern recognition system with examples. [8]

OR

Q2) a) What problems arise from the various activities in the design of a pattern recognition system? [8]

b) Explain the concepts of classification and post-processing in pattern recognition. [8]

Q3) a) Write a short note on Minimum error rate classification. [8]

b) With the help of a suitable diagram, explain classifiers and the functional structure of a general statistical pattern classifier. [10]

OR

Q4) a) Explain the univariate and multivariate normal density functions with examples. [10]

b) What are the challenges in Bayesian decision theory? [8]

Q5) a) Discuss the general principle of Maximum Likelihood Estimation. [8]

b) Write a short note on the general theory of Bayesian parameter estimation. [8]

OR

Q6) a) Write the Expectation Maximization (EM) algorithm. Explain EM for a 2D normal model. [8]

b) Illustrate a Gaussian mixture distribution in one dimension, and also illustrate a mixture of three Gaussians in 2-dimensional space. [8]

SECTION – II

Q7) a) Write the HMM decoding algorithm. With the help of an example, explain state sequence decoding of a hidden Markov model. [8]

b) Explain Principal Component Analysis (PCA) with an analytical treatment. [8]

OR

Q8) a) Consider training an HMM by the forward and backward algorithm for a single sequence of length T, where each symbol could be one of c values. What is the computational complexity of a single revision of all the values a_ij and b_ij? [8]

b) Write a short note on Fisher's Linear Discriminant. [8]

Q9) a) Write a short note on support vector machines. [8]

b) Explain the 2-category and multi-category cases of linear discriminant functions. Also explain the linear decision boundaries for a 4-class problem with the help of a suitable diagram. [10]

OR

Q10) a) Explain the Parzen window approach for density estimation. State and explain examples of a 2-dimensional circularly symmetric normal Parzen window for 3 different values of h. [8]

b) Explain the following scatter criteria with the help of suitable examples. [10]

i) The scatter matrices ii) The trace criterion

iii) The determinant criterion iv) Invariant criteria

Q11) a) When a test pattern is classified by a decision tree, that pattern is subjected to a sequence of queries corresponding to the nodes along a path from the root to a leaf. Prove that this holds for any decision tree. [8]

b) Write the algorithm for K-means clustering with the help of a diagram. Explain how K-means clustering produces a form of stochastic hill climbing in the log-likelihood function. [8]

OR

Q12) a) Write a short note on applications of normal mixtures. [8]

b) Explain the following criterion functions for clustering : [8]

i) The sum-of-squared-error criterion.

ii) The related minimum variance criterion.