JNTU B.Tech II Semester Examinations, NEURAL NETWORKS, Apr/May 2008

(Computer Science & Engineering)

Time: 3 hours Max Marks: 80

Answer any FIVE Questions

All Questions carry equal marks

SET-II

1. Explain the following:

(a) Single layer feed forward networks

(b) Stable forward weight. [8+8]
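As a pointer for part (a): in a single layer feed forward network the source nodes project directly onto an output layer of computing neurons, with no hidden layer and no feedback. A minimal NumPy sketch (the layer sizes and the hard-limiter activation are illustrative assumptions, not from the paper):

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(2, 3))        # 2 output neurons fed by 3 input nodes
    b = np.zeros(2)                    # bias of each output neuron

    def forward(x):
        # one layer of computation: weighted sum plus bias, then threshold
        return np.where(W @ x + b >= 0, 1, 0)

    print(forward(np.array([1.0, -0.5, 0.2])))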


2. Write briefly about the following:

(a) Correlation matrix memory

(b) Linear adaptive filter. [8+8]
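For orientation: part (a)'s correlation matrix memory stores key/memorized vector pairs as a sum of outer products, M = sum_k y_k x_k^T, and the classic instance of part (b) is the LMS-trained linear filter. A minimal sketch of both (all data and the step size are made up for illustration):

    import numpy as np

    # Part (a): store pairs as outer products; recall by multiplication.
    X = np.array([[1., 0., 0.], [0., 1., 0.]])   # key vectors (orthonormal here)
    Y = np.array([[1., 1.], [1., -1.]])          # memorized vectors
    M = sum(np.outer(y, x) for x, y in zip(X, Y))
    print(M @ X[0])                              # recalls Y[0] exactly

    # Part (b): linear adaptive filter with the LMS rule w <- w + mu*e(n)*x(n).
    rng = np.random.default_rng(0)
    w_true, w = np.array([0.5, -0.3]), np.zeros(2)
    for _ in range(500):
        x = rng.normal(size=2)                   # input vector at time n
        e = w_true @ x - w @ x                   # error against desired response
        w += 0.05 * e * x                        # mu = 0.05 (illustrative step size)
    print(w)                                     # converges near [0.5, -0.3]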


3. Explain the following briefly:

(a) Steepest descent method

(b) Newton’s method for optimization. [8+8]
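The two methods being contrasted update the weight vector as w(n+1) = w(n) - eta g(n) for steepest descent and w(n+1) = w(n) - H^{-1} g(n) for Newton's method, where g is the gradient and H the Hessian of the cost. A minimal sketch on an assumed quadratic cost E(w) = 0.5 w^T A w - b^T w, for which Newton's method converges in a single step:

    import numpy as np

    A = np.array([[3.0, 1.0], [1.0, 2.0]])     # Hessian of the quadratic cost
    b = np.array([1.0, 1.0])                   # gradient is g(w) = A w - b

    w = np.zeros(2)
    eta = 0.1                                  # fixed learning-rate parameter
    for _ in range(200):                       # steepest descent: many small steps
        w = w - eta * (A @ w - b)
    print("steepest descent:", w)

    w = np.zeros(2)                            # Newton: w <- w - H^{-1} g
    w = w - np.linalg.solve(A, A @ w - b)      # exact minimum in one step here
    print("Newton's method :", w)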


4. Consider the following optimized multilayer perceptron parameter values:

       Parameter                          Value
       Optimum number of hidden neurons   2
       Optimum learning-rate parameter    0.1
       Optimum momentum constant          0.5

   Using the above data, explain optimal network design. [16]
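One way to make the design concrete: a hedged sketch of a 2-2-1 network trained by back-propagation with exactly the quoted values (2 hidden neurons, learning rate 0.1, momentum 0.5). The XOR task, initialization and epoch count are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(1)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # XOR inputs
    T = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

    W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)    # 2 hidden neurons
    W2 = rng.normal(size=(1, 2)); b2 = np.zeros(1)    # 1 output neuron
    eta, alpha = 0.1, 0.5              # quoted learning rate and momentum
    dW1 = db1 = dW2 = db2 = 0.0        # previous updates (momentum memory)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(20000):
        h = sigmoid(X @ W1.T + b1)     # forward pass, hidden layer
        y = sigmoid(h @ W2.T + b2)     # forward pass, output layer
        d2 = (y - T) * y * (1 - y)     # output-layer local gradients
        d1 = (d2 @ W2) * h * (1 - h)   # hidden-layer local gradients
        dW2 = alpha * dW2 - eta * (d2.T @ h); db2 = alpha * db2 - eta * d2.sum(0)
        dW1 = alpha * dW1 - eta * (d1.T @ X); db1 = alpha * db1 - eta * d1.sum(0)
        W2 += dW2; b2 += db2; W1 += dW1; b1 += db1

    # whether the net reaches the XOR mapping depends on the random start
    print(np.round(y.ravel(), 2))      # ideally close to [0, 1, 1, 0]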


5. Statistical criteria for model selection, such as Rissanen's minimum description length (MDL) criterion and the information-theoretic criterion (AIC) due to Akaike, share a common form of composition:

(model-complexity criterion) = (log-likelihood function) + (model-complexity penalty)

Discuss how the weight-decay and weight-elimination methods used for network pruning fit into this formalism. [16]
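For reference when answering, the standard instances of this composition (with L the maximized likelihood, k the number of free parameters and N the number of training examples) are:

    AIC(k) = -2 ln L + 2k
    MDL(k) = -ln L + (k/2) ln N

The pruning methods add the penalty term directly to the training cost instead; with w0 a preassigned scale parameter,

    E(w) = E_data(w) + lambda * sum_i w_i^2                              (weight decay)
    E(w) = E_data(w) + lambda * sum_i (w_i/w0)^2 / (1 + (w_i/w0)^2)      (weight elimination)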


6. (a) Write about the Willshaw-von der Malsburg model of the self-organized feature map.

(b) Write short notes on parameter specifications for computer simulations of the self-organizing map (SOM) algorithm. [8+8]
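For part (b), textbook simulations (e.g. Haykin's) let both the learning-rate parameter and the neighbourhood width decay exponentially with the iteration number. A sketch of such schedules; the constants below are illustrative assumptions, not values from the paper:

    import numpy as np

    sigma0 = 18.0                   # initial neighbourhood radius (~lattice radius)
    tau1 = 1000.0 / np.log(sigma0)  # time constant for the neighbourhood decay
    eta0, tau2 = 0.1, 1000.0        # initial learning rate and its time constant

    def sigma(n):                   # width of the topological neighbourhood
        return sigma0 * np.exp(-n / tau1)

    def eta(n):                     # learning-rate parameter, floored at 0.01
        return max(eta0 * np.exp(-n / tau2), 0.01)

    for n in (0, 100, 1000):
        print(n, round(sigma(n), 3), round(eta(n), 3))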


7. (a) Discuss stability and convergence in the context of an autonomous nonlinear dynamical system with an equilibrium state.

(b) Draw and explain the block diagram of the related model. [8+8]
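For part (a), the definitions usually expected: the system is described by dx(t)/dt = F(x(t)), and an equilibrium state x* satisfies F(x*) = 0. Then, in the sense of Lyapunov,

    x* is stable     if for every eps > 0 there is a delta > 0 such that
                     ||x(0) - x*|| < delta implies ||x(t) - x*|| < eps for all t > 0
    x* is convergent if x(t) -> x* as t -> infinity whenever x(0) starts
                     sufficiently close to x*

Asymptotic stability means both conditions hold together.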


8. (a) Explain the working of a Hopfield network with a neat sketch of its architecture.

(b) Taking a three-node net, determine the weight matrix to store the states V1V2V3 = 000, 011, 110 and 101 using Hebb's rule. [8+8]
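A hedged sketch for part (b), assuming the common convention of bipolar encoding (0 -> -1, 1 -> +1), summed outer products and a zeroed diagonal:

    import numpy as np

    # the four states to be stored, as rows
    patterns = np.array([[0, 0, 0], [0, 1, 1], [1, 1, 0], [1, 0, 1]], dtype=float)
    V = 2 * patterns - 1                  # bipolar encoding: 0 -> -1, 1 -> +1

    W = sum(np.outer(v, v) for v in V)    # Hebb's rule: W = sum_k v_k v_k^T
    np.fill_diagonal(W, 0)                # Hopfield nets have no self-connections
    print(W)

For these four states the pairwise correlations happen to cancel, so every off-diagonal entry of the printed W is zero under this convention; the scaling (e.g. dividing by the number of patterns) varies between texts.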
