What is the Bayes decision rule and the Bayes classification error?


Problem 1:

Consider the following set of two-dimensional vectors from three categories:

      ω1              ω2              ω3
  x1     x2       x1     x2       x1     x2
  10      0        5     10        2      8
   0    -10        0      5       -5      2
   5     -2        5      5       10     -4

(a) Plot the decision boundary resulting from the nearest-neighbor rule just for categorizing ω1 and ω2. Find the sample means m1 and m2 and, on the same figure, sketch the decision boundary corresponding to classifying x by assigning it to the category of the nearest sample mean.

(b) Repeat part (a) for categorizing only ω1 and ω3.

(c) Repeat part (a) for categorizing only ω2 and ω3.

(d) Repeat part (a) for a three-category classifier, classifying ω1, ω2 and ω3.

MATLAB code for the plots is needed as well.
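As a starting point, here is a minimal MATLAB sketch for part (a). It assumes the data table reads ω1 = {(10,0), (0,-10), (5,-2)} and ω2 = {(5,10), (0,5), (5,5)} (the scanned table is partly garbled, so verify against your handout), traces both boundaries by brute force over a grid, and relies on implicit expansion (MATLAB R2016b or later). The grid extent and resolution are arbitrary choices.

```matlab
% Problem 1(a): nearest-neighbor vs. nearest-mean boundaries for w1 and w2
w1 = [10 0; 0 -10; 5 -2];   % samples from omega_1 (assumed reading of the table)
w2 = [5 10; 0  5; 5  5];    % samples from omega_2

m1 = mean(w1, 1);           % sample mean of omega_1
m2 = mean(w2, 1);           % sample mean of omega_2

% Dense grid over the plotting region
[X, Y] = meshgrid(linspace(-15, 15, 601), linspace(-15, 15, 601));
P = [X(:) Y(:)];

% Squared distance from each grid point to its nearest sample in each class
d1 = min((P(:,1) - w1(:,1)').^2 + (P(:,2) - w1(:,2)').^2, [], 2);
d2 = min((P(:,1) - w2(:,1)').^2 + (P(:,2) - w2(:,2)').^2, [], 2);
nn = reshape(sign(d2 - d1), size(X));   % +1 where omega_1 is nearer

% Squared distance from each grid point to each sample mean
dm1 = (P(:,1) - m1(1)).^2 + (P(:,2) - m1(2)).^2;
dm2 = (P(:,1) - m2(1)).^2 + (P(:,2) - m2(2)).^2;
nm = reshape(sign(dm2 - dm1), size(X)); % +1 where m1 is nearer

figure; hold on; axis equal; grid on;
contour(X, Y, nn, [0 0], 'b-');         % nearest-neighbor boundary
contour(X, Y, nm, [0 0], 'r--');        % nearest-mean boundary
plot(w1(:,1), w1(:,2), 'bo', w2(:,1), w2(:,2), 'rs');
plot(m1(1), m1(2), 'b*', m2(1), m2(2), 'r*');
legend('NN boundary', 'nearest-mean boundary', '\omega_1', '\omega_2');
```

Parts (b) and (c) reuse the same script with the other class pairs substituted; for the three-category classifier in (d), label each grid point by the argmin over all three classes (and over all three means) instead of comparing two distances.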

Problem 2:

Consider classifiers based on samples with priors P(ω1) = P(ω2) = 0.5 and the distributions

p(x|ω1) = 2x       for 0 ≤ x ≤ 1,
          0        otherwise;

p(x|ω2) = 2 - 2x   for 0 ≤ x ≤ 1,
          0        otherwise.

(a) What is the Bayes decision rule and the Bayes classification error?

(b) Suppose we randomly select a single point from ω1 and a single point from ω2, and create a nearest-neighbor classifier. Suppose too we select a test point from one of the categories (ω1 for definiteness). Integrate to find the expected error rate P1(e).

(c) Repeat with two training samples from each category and a single test point in order to find P2(e).

(d) Generalize to show that, in general,

Pn(e) = 1/3 + 1/((n+1)(n+3)) + 1/(2(n+2)(n+3)).

Confirm this formula makes sense in the n = 1 case.

(e) Compare lim(n→∞) Pn(e) with the Bayes error.
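For orientation, part (a) follows directly from the densities given above; a sketch of the computation (not a substitute for the full write-up):

```latex
\begin{align*}
\text{Decide } \omega_1 &\iff P(\omega_1)\,p(x\mid\omega_1) > P(\omega_2)\,p(x\mid\omega_2)
  \iff 2x > 2-2x \iff x > \tfrac{1}{2},\\[4pt]
P^*(e) &= \tfrac{1}{2}\int_0^{1/2} 2x\,dx \;+\; \tfrac{1}{2}\int_{1/2}^{1} (2-2x)\,dx
       = \tfrac{1}{2}\cdot\tfrac{1}{4} + \tfrac{1}{2}\cdot\tfrac{1}{4} = \tfrac{1}{4}.
\end{align*}
```

This also gives a sanity check for part (e): the formula in (d) tends to 1/3 as n → ∞, which lies between the Bayes error 1/4 and the Cover–Hart nearest-neighbor bound 2P*(1 − P*) = 3/8, as it should.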

Problem 3

Repeat Problem 2 but with

p(x|ω1) = 3/2   for 0 ≤ x ≤ 2/3,
          0     otherwise;

p(x|ω2) = 3/2   for 1/3 ≤ x ≤ 1,
          0     otherwise.
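As a check when repeating part (a) here: the two densities overlap only on [1/3, 2/3], where they are equal, so errors are unavoidable only on the overlap; a sketch:

```latex
\begin{align*}
P^*(e) &= \int_{1/3}^{2/3} \min\!\Bigl(\tfrac{1}{2}\,p(x\mid\omega_1),\; \tfrac{1}{2}\,p(x\mid\omega_2)\Bigr)\,dx
        = \int_{1/3}^{2/3} \tfrac{1}{2}\cdot\tfrac{3}{2}\,dx
        = \tfrac{1}{2}\cdot\tfrac{3}{2}\cdot\tfrac{1}{3} = \tfrac{1}{4}.
\end{align*}
```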
