
Neural and Tree Learning on Continuous Attributes

(a) In general, feedforward neural networks (multi-layer perceptrons) trained by error back-propagation are:

(i) fast to train, and fast to run on unseen examples
(ii) slow to train, and fast to run on unseen examples
(iii) fast to train, and slow to run on unseen examples
(iv) slow to train, and slow to run on unseen examples

In one sentence, explain your choice of answer.

Suppose you have a decision tree (DT) and a multi-layer perceptron (MLP) that have been trained on data sampled from a two-class target function, with all attributes numeric. You can think of both models as graphs whose edges are labelled with numbers: weights in the MLP and threshold constants for attribute tests in the DT.
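One way to make the setup above concrete is a minimal sketch (function and variable names are illustrative, not part of the question): in a DT, an edge number is a hard threshold applied to a single attribute, while in an MLP an edge number is a weight that scales an attribute inside a smooth combined sum.

```python
import math

def dt_node_test(x, attribute_index, threshold):
    """Decision-tree node: a hard, axis-parallel test on one attribute."""
    return x[attribute_index] <= threshold  # True -> follow the left branch

def mlp_unit(x, weights, bias):
    """MLP unit: all attributes combine in a weighted sum, squashed smoothly."""
    activation = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid output in (0, 1)

x = [0.2, 0.7]
print(dt_node_test(x, 0, 0.5))        # hard yes/no decision -> True
print(mlp_unit(x, [1.5, -2.0], 0.3))  # graded response between 0 and 1
```

The contrast the sketch is meant to surface: a DT threshold partitions the input space with an axis-parallel boundary on one attribute at a time, whereas an MLP weight contributes to an oblique, graded decision surface over all attributes at once.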

(b) Compare and contrast the roles of these numbers in the two models.
(c) Compare and contrast the methods of learning these numbers in the two models.
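As a hedged sketch of the contrast question (c) is after (all names and the purity measure here are simplifying assumptions): a DT threshold is typically found by discrete greedy search over candidate split points, while an MLP weight is adjusted continuously by gradient descent on a differentiable loss.

```python
import math

def best_threshold(values, labels):
    """Pick the midpoint threshold with the fewest misclassifications
    (a crude stand-in for information gain or Gini, used only to illustrate
    that DT learning is a discrete search over candidate cut points)."""
    pairs = sorted(zip(values, labels))
    best_t, best_err = None, float("inf")
    for (v1, _), (v2, _) in zip(pairs, pairs[1:]):
        t = (v1 + v2) / 2.0  # candidate: midpoint between adjacent values
        err = sum((v > t) != (y == 1) for v, y in pairs)
        err = min(err, len(pairs) - err)  # either side may be class 1
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def gradient_step(w, x, y, lr=0.1):
    """One backprop-style update for a single sigmoid unit with squared loss,
    illustrating that MLP weights move by small continuous steps."""
    out = 1.0 / (1.0 + math.exp(-w * x))
    grad = (out - y) * out * (1.0 - out) * x  # dE/dw via the chain rule
    return w - lr * grad

print(best_threshold([1.0, 2.0, 3.0, 4.0], [0, 0, 1, 1]))  # -> 2.5
print(gradient_step(0.0, 1.0, 1.0))  # weight nudged upward toward target
```

The key contrast: the search for a threshold evaluates finitely many candidates and commits to one (greedy, local, per-node), whereas gradient descent updates all weights jointly and incrementally over many passes through the data.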
