Question

Neural and Tree Learning on Continuous Attributes

(a) In general, feedforward neural networks (multi-layer perceptrons) trained by error back-propagation are:

(i) fast to train, and fast to run on unseen examples

(ii) slow to train, and fast to run on unseen examples

(iii) fast to train, and slow to run on unseen examples

(iv) slow to train, and slow to run on unseen examples

In one sentence, explain your choice of answer.

Suppose you have a decision tree (DT) and a multi-layer perceptron (MLP) that have been trained on data sampled from a two-class target function, with all attributes numeric. You can think of both models as graphs whose edges are labelled with numbers: weights in the MLP and threshold constants for attribute tests in the DT.
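To make the "edge numbers" concrete, here is a minimal sketch (not part of the question; all values invented) of how each kind of number is used at prediction time: an MLP weight scales its input and contributes to a sum, while a DT threshold gates a single comparison.

```python
def mlp_unit(inputs, weights, bias):
    """An MLP edge weight scales its input; the unit sums all contributions.

    A step activation is used here purely for simplicity.
    """
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 if s > 0 else 0.0

def dt_test(x_value, threshold):
    """A DT edge threshold gates one comparison on one attribute."""
    return "left" if x_value <= threshold else "right"

x = [0.8, 0.3]
# Every weight influences the unit's output:
print(mlp_unit(x, weights=[1.5, -2.0], bias=0.1))  # -> 1.0
# Only one attribute and one constant matter at a DT node:
print(dt_test(x[0], threshold=0.5))                # -> right
```

Note the contrast the sketch exposes: in the MLP every numeric attribute participates in every unit's computation, whereas each DT node consults exactly one attribute against one constant.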

(b) Compare and contrast the roles of these numbers in the two models.

(c) Compare and contrast the methods of learning these numbers in the two models.
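As a hedged illustration of the two learning styles (a toy sketch with invented data, not a model answer): MLP weights are adjusted incrementally by small error-driven steps over many passes, while a DT threshold is chosen in one greedy pass by scoring candidate split points.

```python
def perceptron_step(w, b, x, y, lr=0.1):
    """One error-driven update: every weight shifts a little.

    This is a single-unit stand-in for a backprop-style gradient step.
    """
    pred = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
    err = y - pred
    return [wi + lr * err * xi for wi, xi in zip(w, x)], b + lr * err

def best_threshold(values, labels):
    """Greedy DT split: try midpoints between sorted attribute values and
    keep the one that minimises misclassifications when each side
    predicts its majority class."""
    pairs = sorted(zip(values, labels))
    best_t, best_errs = None, len(labels) + 1
    for i in range(1, len(pairs)):
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for v, y in pairs if v <= t]
        right = [y for v, y in pairs if v > t]
        errs = min(left.count(0), left.count(1)) + min(right.count(0), right.count(1))
        if errs < best_errs:
            best_t, best_errs = t, errs
    return best_t

print(best_threshold([0.1, 0.2, 0.8, 0.9], [0, 0, 1, 1]))  # -> 0.5
```

The sketch highlights the key difference: the weight update is a small, repeated, gradient-style correction applied to all weights at once, while the threshold is fixed once by an exhaustive local search and never revisited by the basic tree-growing algorithm.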
