Architectures for Complex Pattern and Applications of ANN

This quiz contains multiple-choice questions on associative memory networks, neural network applications, and the concepts of feedforward neural networks.


What tasks cannot be realised or recognised by simple networks?

Handwritten characters

Speech sequences

Image sequences

All of the above

What does a network become if the weight matrix stores multiple associations among several patterns?

Auto-associative memory

Heteroassociative memory

Multi-directional associative memory

Temporal associative memory

What does a network become if the weight matrix stores the given patterns?

Auto-associative memory

Heteroassociative memory

Multi-directional associative memory

Temporal associative memory

What does a network become if the weight matrix stores an association between a pair of patterns?

Auto-associative memory

Heteroassociative memory

Multi-directional associative memory

Temporal associative memory

What does the network become if the weight matrix stores associations between adjacent pairs of patterns?

Auto-associative memory

Heteroassociative memory

Multi-directional associative memory

Temporal associative memory
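As a quick reference for the questions above, here is a minimal sketch (not taken from the quiz) of how the weight matrix is built for auto-associative, hetero-associative, and temporal associative storage, using Hebbian (outer-product) learning on bipolar patterns; the pattern values and dimensions are illustrative assumptions. Multi-directional associative memory extends the hetero-associative idea to more than two sets of patterns.

```python
import numpy as np

# Hypothetical bipolar patterns (illustrative values, not from the quiz)
x1 = np.array([ 1, -1,  1, -1])
x2 = np.array([-1, -1,  1,  1])
y1 = np.array([ 1,  1, -1])   # output patterns associated with x1 and x2
y2 = np.array([-1,  1,  1])

# Auto-associative memory: the weight matrix stores the given patterns themselves
W_auto = np.outer(x1, x1) + np.outer(x2, x2)

# Hetero-associative memory: the weight matrix stores associations between pairs (x_k, y_k)
W_hetero = np.outer(x1, y1) + np.outer(x2, y2)

# Temporal associative memory: the weight matrix stores associations between
# adjacent patterns of a sequence, so recall steps through x1 -> x2 -> ...
W_temporal = np.outer(x1, x2)

# Recall: weighted sum followed by a bipolar threshold
recall = lambda W, x: np.where(x @ W >= 0, 1, -1)

print(recall(W_auto, x1))      # recovers x1 itself
print(recall(W_hetero, x1))    # recovers y1, the pattern associated with x1
print(recall(W_temporal, x1))  # recovers x2, the next pattern in the sequence
```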

What is another name for heteroassociative memory?

Uni-directional memory

Bi-directional memory

Multi-directional associative memory

Temporal associative memory

What are some desirable characteristics of associative memories?

Ability to store a large number of patterns

Fault tolerance

Ability to recall, even when the input pattern is noisy

All of the above

What is the objective of BAM?

To store pattern pairs

To recall pattern pairs

To store a set of pattern pairs that can be recalled by giving either pattern as input

None of the above

Is BAM a special case of MAM?

Yes

No
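To make the BAM questions above concrete, here is a minimal bidirectional associative memory sketch, assuming bipolar pattern pairs stored by outer products; either member of a stored pair can be used as the cue to recall the other. The patterns shown are illustrative assumptions.

```python
import numpy as np

sign = lambda v: np.where(v >= 0, 1, -1)   # bipolar threshold

# Hypothetical pattern pairs (A_k, B_k) to be stored
A = np.array([[ 1, -1,  1, -1],
              [-1, -1,  1,  1]])
B = np.array([[ 1,  1, -1],
              [-1,  1,  1]])

# Outer-product storage of all pairs in a single weight matrix
W = sum(np.outer(a, b) for a, b in zip(A, B))

# Forward recall: give an A-pattern, retrieve the associated B-pattern
print(sign(A[0] @ W))     # -> B[0]
# Backward recall: give a B-pattern, retrieve the associated A-pattern
print(sign(B[1] @ W.T))   # -> A[1]
```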

What is the use of an MLFFNN?

To realize the structure of an MLP

To solve pattern classification problems

To solve pattern mapping problems

To realize an approximation to an MLP

What is the advantage of basis function networks over multilayer feedforward neural networks?

Training of basis function networks is faster than that of an MLFFNN

Training of basis function networks is slower than that of an MLFFNN

Storing in basis function networks is faster than in an MLFFNN

None of the above

Pattern recall takes more time for

MLFFNN

Basis function networks

Equal for both MLFFNN and basis function networks

None of the above

Why is training of basis function networks faster than that of an MLFFNN?

Because they are developed specifically for pattern approximation

Because they are developed specifically for pattern classification

Because they are developed specifically for pattern approximation or classification

None of the above

For what type of networks is training avoided completely?

GRNN

PNN

GRNN and PNN

None of the above
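The last few questions contrast basis function networks (such as RBF, GRNN, and PNN) with an MLFFNN. A minimal GRNN-style sketch, under assumed one-dimensional data, shows why iterative training can be avoided entirely: the training samples themselves act as the centres, and prediction is just a kernel-weighted average of the stored targets. This is also why pattern recall takes longer than for an MLFFNN, since every stored sample is evaluated at prediction time.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN-style prediction: no trained weights, only stored training data."""
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distance to every stored sample
    k = np.exp(-d2 / (2 * sigma ** 2))        # Gaussian kernel activation of each centre
    return np.dot(k, y_train) / np.sum(k)     # kernel-weighted average of stored targets

# Hypothetical one-dimensional regression data (illustrative values)
X_train = np.array([[0.0], [1.0], [2.0], [3.0]])
y_train = np.array([0.0, 1.0, 4.0, 9.0])

print(grnn_predict(X_train, y_train, np.array([1.5])))   # interpolates between 1 and 4
```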

Can data be stored directly in associative memory?

Yes

No

Quiz/Test Summary
Title: Architectures for Complex Pattern and Applications of ANN
Questions: 15
Contributed by: Ivan