SML-Homework 1 Solved

Q1. Consider the following decision rule for a two-category one-dimensional problem:

Decide ω1 if x > θ; otherwise decide ω2.

(a)  Show that the probability of error for this rule is given by

$$P(\text{error}) = P(\omega_1)\int_{-\infty}^{\theta} p(x|\omega_1)\,dx \;+\; P(\omega_2)\int_{\theta}^{\infty} p(x|\omega_2)\,dx \tag{1}$$

(b)  By differentiating, show that a necessary condition to minimize P(error) is that θ satisfy p(θ|ω1)P(ω1) = p(θ|ω2)P(ω2).
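A sketch of the differentiation step for part (b), applying the fundamental theorem of calculus to each integral in (1):

```latex
\frac{d}{d\theta} P(\text{error})
  = \frac{d}{d\theta}\left[ P(\omega_1)\int_{-\infty}^{\theta} p(x|\omega_1)\,dx
      + P(\omega_2)\int_{\theta}^{\infty} p(x|\omega_2)\,dx \right]
  = p(\theta|\omega_1)P(\omega_1) - p(\theta|\omega_2)P(\omega_2)
```

Setting this derivative to zero yields the stated necessary condition p(θ|ω1)P(ω1) = p(θ|ω2)P(ω2).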

Q2. Let the conditional densities for a two-category one-dimensional problem be given by the Cauchy distribution

$$p(x|\omega_i) = \frac{1}{\pi b}\,\frac{1}{1 + \left(\dfrac{x - a_i}{b}\right)^2}, \qquad i = 1, 2 \tag{2}$$

Assuming P(ω1) = P(ω2), show that P(ω1|x) = P(ω2|x) if x = (a1 + a2)/2, i.e., the minimum-error decision boundary is the point midway between the peaks of the two distributions, regardless of b.
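A sketch of the chain of equivalences behind this claim: with equal priors, Bayes' rule reduces equality of posteriors to equality of likelihoods, and for the Cauchy densities in (2) that depends only on the squared offsets from the peaks:

```latex
P(\omega_1|x) = P(\omega_2|x)
  \iff p(x|\omega_1) = p(x|\omega_2) \quad (\text{since } P(\omega_1) = P(\omega_2))
  \iff (x - a_1)^2 = (x - a_2)^2
  \iff x = \frac{a_1 + a_2}{2} \quad (a_1 \neq a_2),
```

with the scale parameter b cancelling from both sides, as required.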

Q3. Suppose we have three equi-probable categories in two dimensions with the following underlying distributions:



By explicit calculation of the posterior probabilities, classify the point x =

for minimum probability of error.
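The posterior computation for Q3 can be sketched as follows. Note that the assignment's actual class-conditional densities and test point did not survive extraction, so the means, covariances, and x0 below are hypothetical stand-ins, not the problem's values:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma) at x, in d dimensions."""
    x, mu = np.asarray(x, float), np.asarray(mu, float)
    d = mu.size
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    return np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff) / norm

# Hypothetical stand-ins for the three class-conditional densities
# (the assignment's actual parameters are not available here).
mus = [np.zeros(2), np.array([1.0, 1.0]), np.array([-1.0, 1.0])]
sigmas = [np.eye(2)] * 3
priors = [1.0 / 3] * 3           # equi-probable categories
x0 = np.array([0.3, 0.3])        # placeholder test point

# Bayes' rule: P(omega_i | x) = p(x | omega_i) P(omega_i) / p(x)
likelihoods = np.array([gaussian_pdf(x0, m, s) for m, s in zip(mus, sigmas)])
posteriors = likelihoods * priors / np.sum(likelihoods * priors)
print("posteriors:", posteriors, "-> decide omega", np.argmax(posteriors) + 1)
```

Substituting the problem's actual densities and point into `mus`, `sigmas`, and `x0` gives the required classification: pick the class with the largest posterior.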


Q4. a. Write a procedure to generate random samples according to a normal distribution N(µ,Σ) in d dimensions.
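One standard way to implement part (a) is the Cholesky method: if z ~ N(0, I) and Σ = L Lᵀ, then µ + L z ~ N(µ, Σ). A minimal sketch (function name and interface are my own choices):

```python
import numpy as np

def sample_gaussian(mu, sigma, n, rng=None):
    """Draw n samples from N(mu, sigma) in d dimensions.

    Uses the Cholesky factorization Sigma = L L^T: if z ~ N(0, I),
    then mu + L z ~ N(mu, Sigma).
    """
    rng = np.random.default_rng() if rng is None else rng
    mu = np.asarray(mu, dtype=float)
    L = np.linalg.cholesky(np.asarray(sigma, dtype=float))
    z = rng.standard_normal((n, mu.size))   # n x d standard-normal draws
    return mu + z @ L.T
```

This requires Σ to be positive definite; `numpy.random.Generator.multivariate_normal` offers the same functionality built in.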

b.      Write a procedure to calculate the discriminant function for a given normal distribution with Σ = σ2I and prior probability P(ωi).
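For Σ = σ²I the minimum-error discriminant simplifies, after dropping terms common to all classes, to gᵢ(x) = −‖x − µᵢ‖²/(2σ²) + ln P(ωᵢ). A sketch of part (b) under that simplification:

```python
import numpy as np

def discriminant(x, mu, sigma_sq, prior):
    """g_i(x) = -||x - mu_i||^2 / (2 sigma^2) + ln P(omega_i).

    Discriminant for N(mu_i, sigma^2 I) with prior P(omega_i),
    with class-independent terms dropped; classify x into the
    class whose g_i(x) is largest.
    """
    x = np.asarray(x, dtype=float)
    mu = np.asarray(mu, dtype=float)
    return -np.sum((x - mu) ** 2) / (2.0 * sigma_sq) + np.log(prior)
```

At a point equidistant from two means, the class with the larger prior wins, which is the expected tie-breaking behavior.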

c.      Compare the discriminant function's values for two different distributions in d = 2 dimensions.

Assume the given test sample, with priors P(ω1) = 1/3 and P(ω2) = 2/3.

In a general setting, you would be given several samples from two (or more) classes. Counting each class's relative frequency gives the priors. Treating the samples as d-dimensional vectors, you can estimate the mean and covariance of each class using MLE or other techniques (covered in a later lecture). This information is sufficient for computing the discriminants and thereby classifying a sample into one of the classes.
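The pipeline described above can be sketched end to end: estimate priors from class frequencies, fit (µ, Σ) per class by MLE, then classify by the largest log-posterior. Function names are my own; this is an illustration, not a prescribed implementation:

```python
import numpy as np

def fit_gaussian_classifier(X, y):
    """Estimate priors by class frequency and (mu, Sigma) by MLE per class."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (len(Xc) / len(X),            # prior: relative frequency
                     Xc.mean(axis=0),             # MLE mean
                     np.cov(Xc.T, bias=True))     # MLE covariance (1/N normalizer)
    return params

def classify(x, params):
    """Pick the class with the largest log-posterior (up to a shared constant)."""
    def log_g(prior, mu, sigma):
        diff = x - mu
        _, logdet = np.linalg.slogdet(sigma)
        return (-0.5 * diff @ np.linalg.inv(sigma) @ diff
                - 0.5 * logdet + np.log(prior))
    return max(params, key=lambda c: log_g(*params[c]))
```

On well-separated classes this recovers the obvious labels; with the MLE replaced by other estimators, the same two-step structure (estimate, then discriminate) carries over.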
