Neural Networks Questions and Answers - Determination of Weights

1. Are feedforward networks used for pattern storage?
a) yes
b) no

Answer: b
Explanation: Feedforward networks are used for pattern mapping, pattern association, and pattern classification, not for pattern storage.

2. If some of the output patterns in a pattern association problem are identical, then the problem shifts to?
a) pattern storage problem
b) pattern classification problem
c) pattern mapping problem
d) none of the mentioned

Answer: b
Explanation: The distinct output patterns can then be viewed as class labels, so the task becomes pattern classification.

3. A network for pattern mapping is expected to perform?
a) pattern storage
b) pattern classification
c) generalization
d) none of the mentioned

Answer: c
Explanation: A network for pattern mapping is expected to generalize, i.e. produce appropriate outputs for inputs not seen during training.

4. In case of autoassociation by feedback nets in a pattern recognition task, what behaviour is expected?
a) accretive
b) interpolative
c) can be either accretive or interpolative
d) none of the mentioned

Answer: b
Explanation: When a noisy pattern is given, the network retrieves a noisy (approximate) version of the stored pattern, which is interpolative behaviour.

5. In case of pattern storage by feedback nets in a pattern recognition task, what behaviour is expected?
a) accretive
b) interpolative
c) can be either accretive or interpolative
d) none of the mentioned

Answer: a
Explanation: Accretive behaviour is exhibited in the case of the pattern storage problem: a noisy input settles onto the nearest stored pattern.
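A minimal Hopfield-style sketch of accretive recall, assuming bipolar patterns stored via Hebbian outer products; the patterns, the single flipped bit, and the update count below are purely illustrative:

```python
import numpy as np

# Two orthogonal bipolar patterns to store (illustrative values).
patterns = np.array([[1, -1,  1, -1,  1, -1,  1, -1],
                     [1,  1, -1, -1,  1,  1, -1, -1]], dtype=float)

n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0.0)               # no self-connections

noisy = patterns[0].copy()
noisy[0] *= -1                         # flip one bit to simulate noise

state = noisy
for _ in range(10):                    # synchronous updates until stable
    state = np.sign(W @ state)

print(state)                           # settles back onto patterns[0]: accretive recall
```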

6. In determination of weights by learning, what kind of learning law should be employed for orthogonal input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Answer: a
Explanation: For orthogonal input vectors, the Hebb learning law is best suited, since the outer-product weights then recall each stored pattern exactly.
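A minimal sketch of Hebbian weight determination for orthogonal inputs; numpy is assumed, and the bipolar input patterns and target outputs are illustrative:

```python
import numpy as np

# Mutually orthogonal input vectors (rows) and their target outputs.
A = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1]], dtype=float)
B = np.array([[ 1, -1],
              [-1,  1],
              [ 1,  1]], dtype=float)

# Hebb's law: accumulate the outer products of each (output, input) pair.
W = sum(np.outer(b, a) for a, b in zip(A, B))   # shape: (outputs, inputs)

# Because the inputs are orthogonal, recall is exact up to a scale factor
# equal to the squared norm of each input vector.
for a, b in zip(A, B):
    print(W @ a / np.dot(a, a), "->", b)
```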

7. In determination of weights by learning, what kind of learning law should be employed for linearly independent input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Answer: b
Explanation: For linearly independent (but not necessarily orthogonal) input vectors, the Widrow (LMS) learning law is best suited.
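A minimal sketch of the Widrow-Hoff (LMS, or delta) rule for linearly independent, non-orthogonal inputs; the data, learning rate, and epoch count are illustrative:

```python
import numpy as np

# Linearly independent (but not orthogonal) inputs and desired outputs.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.4],
              [0.1, 0.6, 1.0]])
B = np.array([1.0, -1.0, 1.0])

W = np.zeros(3)
eta = 0.1                        # illustrative learning rate

# LMS rule: adjust weights by the output error times the input vector.
for epoch in range(200):
    for a, b in zip(A, B):
        error = b - W @ a
        W += eta * error * a

print("learned weights: ", W)
print("recalled outputs:", A @ W)    # approaches B
```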

8. In determination of weights by learning, what kind of learning law should be employed for noisy input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Answer: d
Explanation: For noisy input vectors, none of the listed learning laws is suited.

9. What features can be accomplished using affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Answer: d
Explanation: Affine transformations can be used to perform arbitrary rotation, scaling, and translation.
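A short sketch of combining rotation, scaling, and translation in a single affine map via homogeneous coordinates; the angle, scale factor, and shift values are illustrative:

```python
import numpy as np

theta = np.pi / 4                      # illustrative rotation angle
s = 2.0                                # illustrative scale factor
t = np.array([1.0, -3.0])              # illustrative translation

# 3x3 affine matrix in homogeneous coordinates: rotate, scale, then translate.
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
M = np.eye(3)
M[:2, :2] = s * R
M[:2, 2] = t

p = np.array([1.0, 1.0, 1.0])          # the point (1, 1) in homogeneous form
print(M @ p)                           # rotated, scaled, and shifted point

# A purely linear (2x2) map can rotate and scale; the constant term t is
# what makes the transformation affine and allows translation.
```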

10. Which feature could not be accomplished earlier, without affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Answer: c
Explanation: Rotation and scaling are purely linear transformations, but translation requires the additional bias (affine) term.