Starting with your answer to part (b), use gradient descent to find the optimum w and b


The following training set obeys the rule that the positive examples all have vectors whose components sum to 10 or more, while the sum is less than 10 for the negative examples.

[Figure: the six training examples, three positive and three negative; image not reproduced here.]

(a) Which of these six vectors are the support vectors?

(b) Suggest a vector w and constant b such that the hyperplane defined by w.x + b = 0 is a good separator for the positive and negative examples. Make sure that the scale of w is such that all points are outside the margin; that is, for each training example (x, y), you have y(w.x + b) ≥ +1.
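
As a rough aid for part (b), here is a minimal NumPy sketch of the scale check. The vectors below are illustrative placeholders that follow the stated sum rule, not necessarily the six vectors in the figure, and the starting choice w = (1, 1, 1), b = -9.5 is an assumption; the point is simply that once (w, b) separates the data, it can be rescaled until every example satisfies y(w.x + b) ≥ 1.

```python
import numpy as np

# Illustrative vectors that follow the stated rule (positive components sum to
# 10 or more, negative to less); substitute the six vectors from the figure.
X = np.array([[3, 4, 5], [2, 7, 2], [5, 5, 5],    # positive examples
              [1, 2, 3], [3, 3, 2], [2, 4, 1]],   # negative examples
             dtype=float)
y = np.array([+1, +1, +1, -1, -1, -1], dtype=float)

# Since the classes are split by the sum of the components, a natural separator
# is proportional to w = (1, 1, 1) with a threshold between the two classes.
w = np.ones(3)
b = -9.5                                 # assumed threshold; adjust to the data

# Rescale so every example lies on or outside the margin: y * (w.x + b) >= 1.
margins = y * (X @ w + b)
scale = 1.0 / margins.min()              # requires min margin > 0 (i.e. (w, b) separates)
w, b = scale * w, scale * b
print(w, b, y * (X @ w + b))             # all margins should now be >= 1
```

With the scale fixed this way, the examples whose margin is exactly 1 are the natural candidates for the support vectors asked about in part (a).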

(c) Starting with your answer to part (b), use gradient descent to find the optimum w and b. Note that if you start with a separating hyperplane, and you scale w properly, then the second term of Equation 12.4 will always be 0, which simplifies your work considerably.
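
For part (c), a minimal subgradient-descent sketch follows, assuming the objective referred to as Equation 12.4 is the regularized hinge loss f(w, b) = (1/2)||w||^2 + C * Σ_i max(0, 1 − y_i(w.x_i + b)). The function name, the learning rate eta, the constant C, and the step count are assumptions for illustration, not part of the original problem.

```python
import numpy as np

def svm_subgradient_descent(X, y, w, b, C=1.0, eta=0.01, steps=1000):
    """Minimize 0.5*||w||^2 + C*sum_i max(0, 1 - y_i*(w.x_i + b)) by subgradient descent."""
    for _ in range(steps):
        margins = y * (X @ w + b)
        bad = margins < 1                               # examples inside the margin
        # Subgradient of the hinge term is -y_i * x_i (and -y_i for b) on violated examples,
        # and 0 elsewhere; the regularizer contributes w.
        grad_w = w - C * (y[bad, None] * X[bad]).sum(axis=0)
        grad_b = -C * y[bad].sum()
        w = w - eta * grad_w
        b = b - eta * grad_b
    return w, b

# Example call, starting from the (w, b) chosen in part (b):
# w_opt, b_opt = svm_subgradient_descent(X, y, w, b, C=0.1, eta=0.01, steps=5000)
```

This matches the hint in the exercise: if the starting (w, b) already satisfies y(w.x + b) ≥ 1 for every example, the hinge term is 0 at the start, so only the (1/2)||w||^2 term drives the first descent steps.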
