
HW2 Solution

Problem 1 (30pt): [Perceptron Algorithm]




We consider the linear discriminant function for a boolean function f(x1, x2), where x1 and x2 can each be either 0 (false) or 1 (true). Think of the boolean function as four points on the 2D plane (x1 being the horizontal axis and x2 the vertical axis). The AND function is defined as f(0, 0) = false, f(1, 0) = false, f(0, 1) = false, and f(1, 1) = true, where true is treated as the positive class and false as the negative class.




(1) [5pt] For the boolean AND function, are the negative class and the positive class linearly separable?



(2) [25pt] Is it possible to apply the perceptron algorithm to obtain a linear decision boundary that correctly classifies both the positive and negative classes? If so, write down the update steps and the obtained linear decision boundary. (Suppose the initial decision boundary is x1 + x2 - 1/2 = 0, and you could start from the top-left point and sweep through the data points in clockwise order. Note that you cannot write down an arbitrary linear boundary without the update steps.)
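For reference, here is a minimal sketch of how such an update loop could look in Python. It assumes an augmented weight vector (w1, w2, b), a learning rate of 1, labels in {+1, -1}, and that a point lying exactly on the boundary counts as misclassified; the initial boundary and the visiting order follow the hint above, but these are just one set of assumptions.

import numpy as np

# Boolean AND data: (x1, x2) points with labels +1 (true) / -1 (false).
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, +1])

# Assumed visiting order: start at the top-left point (0, 1) and sweep
# clockwise: (0,1) -> (1,1) -> (1,0) -> (0,0).
order = [2, 3, 1, 0]

# Initial boundary x1 + x2 - 1/2 = 0, written as w . (x1, x2, 1) = 0.
w = np.array([1.0, 1.0, -0.5])
lr = 1.0  # assumed learning rate

converged = False
while not converged:
    converged = True
    for i in order:
        x_aug = np.append(X[i], 1.0)        # augment with a constant bias input 1
        if y[i] * np.dot(w, x_aug) <= 0:    # misclassified (or on the boundary)
            w += lr * y[i] * x_aug          # perceptron update
            converged = False

print("learned weights (w1, w2, b):", w)

With these assumptions the loop terminates after a couple of sweeps, which is consistent with part (1): the perceptron algorithm converges whenever the classes are linearly separable.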




Problem 2 (70pt): [Principal Component Analysis, Dimension Reduction, Eigenface]




Face modelling is one of the most fundamental problems in modern artificial intelligence and computer vision, and is useful in various applications such as face recognition, identification, etc. However, face images are usually high-dimensional (e.g., even a small 100 x 100 gray-scale face image has dimension 100 x 100 = 10,000); therefore, finding a suitable representation is critically important. In this problem, we apply a linear model, principal component analysis (PCA), to face images to reduce the dimension and obtain eigenface representations.




Dataset: we use a dataset† which contains 177 face images. Each image contains 256 x 256 pixels and is gray-scale (i.e., the value of each pixel is stored as an unsigned integer in [0, 255]; typically, 0 is taken to be black and 255 to be white). You need to split the dataset into train/test sets; e.g., you could use the first 157 images for training and the remaining 20 faces for testing.




(1) [30pt] Write PCA code to compute K = 30 eigenfaces and visualize the top 10 eigenfaces.
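A possible sketch of part (1), assuming the training images have already been stacked into a matrix X_train of shape [Ntrain, D] as described in the hints below; compute_eigenfaces, X_train, and mean_face are placeholder names, and the eigenfaces are obtained via an SVD of the centered data matrix rather than by forming an explicit D x D covariance.

import numpy as np
import matplotlib.pyplot as plt

def compute_eigenfaces(X_train, K=30):
    """X_train: [Ntrain, D] matrix whose rows are reshaped training faces."""
    mean_face = X_train.mean(axis=0)              # mean image, shape [D]
    X_centered = X_train - mean_face              # subtract the mean from each face
    # Rows of Vt are the principal directions, i.e. eigenvectors of the
    # covariance X_centered^T X_centered, ordered by decreasing singular value.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return mean_face, Vt[:K]                      # top-K eigenfaces, shape [K, D]

mean_face, eigenfaces = compute_eigenfaces(X_train, K=30)

# Visualize the top 10 eigenfaces as 256 x 256 images.
fig, axes = plt.subplots(2, 5, figsize=(12, 5))
for i, ax in enumerate(axes.ravel()):
    ax.imshow(eigenfaces[i].reshape(256, 256), cmap="gray")
    ax.set_title("eigenface %d" % (i + 1))
    ax.axis("off")
plt.show()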




(2) [20pt] Use the learned K eigenfaces from (1) to reconstruct the testing images, and record the reconstruction error. The reconstruction error can be defined as ||Ŷ - Y||² / N, where Ŷ is the reconstructed faces using the learned eigenfaces, Y is the testing faces, and N is the total number of testing data. Please show 5 testing images and their reconstructions.
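A possible sketch of the reconstruction step, reusing mean_face and eigenfaces from the sketch in part (1) and assuming the testing images are stacked into X_test of shape [Ntest, D]; reconstruct and reconstruction_error are hypothetical helper names.

import numpy as np
import matplotlib.pyplot as plt

def reconstruct(X, mean_face, eigenfaces):
    """Project faces onto the eigenfaces and map back to pixel space."""
    coeffs = (X - mean_face) @ eigenfaces.T        # [N, K] projection coefficients
    return mean_face + coeffs @ eigenfaces         # [N, D] reconstructed faces

def reconstruction_error(Y, Y_hat):
    """Average squared error ||Y_hat - Y||^2 / N over the N testing faces."""
    return np.sum((Y_hat - Y) ** 2) / Y.shape[0]

X_recon = reconstruct(X_test, mean_face, eigenfaces)
print("testing reconstruction error:", reconstruction_error(X_test, X_recon))

# Show 5 testing images (top row) and their reconstructions (bottom row).
fig, axes = plt.subplots(2, 5, figsize=(12, 5))
for i in range(5):
    axes[0, i].imshow(X_test[i].reshape(256, 256), cmap="gray")
    axes[1, i].imshow(X_recon[i].reshape(256, 256), cmap="gray")
    axes[0, i].axis("off")
    axes[1, i].axis("off")
plt.show()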




(3) [20pt] Try different values of K, e.g., K = 10, 30, 50, 100, 150, ..., and draw a curve showing the corresponding testing reconstruction errors. The x-axis of the curve can be the different K values, and the y-axis can be the testing reconstruction error defined in (2).
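One way to produce the curve, reusing the hypothetical helpers from the earlier sketches (compute_eigenfaces, reconstruct, reconstruction_error):

import matplotlib.pyplot as plt

Ks = [10, 30, 50, 100, 150]
errors = []
for K in Ks:
    mean_face, eigenfaces = compute_eigenfaces(X_train, K)
    X_recon = reconstruct(X_test, mean_face, eigenfaces)
    errors.append(reconstruction_error(X_test, X_recon))

plt.plot(Ks, errors, marker="o")
plt.xlabel("K (number of eigenfaces)")
plt.ylabel("testing reconstruction error")
plt.show()

Since the SVD does not depend on K, it is cheaper to compute the eigenfaces once with the largest K and slice the top rows for each value, rather than rerunning the decomposition inside the loop as in this sketch.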










†The dataset comes from S.C. Zhu's previous Stats 231: Pattern Recognition and Machine Learning course at UCLA.



Some useful hints:




(1) To construct the data matrix for PCA, you can reshape each image into a long vector. For example, an original 2D image of size [h, w] becomes a 1D vector of size h x w. Each row of the data matrix represents one face image, and the data matrix has size [Ntrain, D], where Ntrain is the number of training samples and D = h x w is the dimension of the reshaped image.




(2) To compute the PCA, you need to subtract the mean image from each training image. To get the mean image, sum up all the training images and divide by Ntrain.
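A short sketch of hints (1) and (2), assuming the training faces have already been loaded into a Python list named images containing [h, w] arrays:

import numpy as np

# images: list of [h, w] arrays, one per training face (see hint (3) for loading).
X_train = np.stack([img.reshape(-1).astype(np.float64) for img in images])  # [Ntrain, D]
mean_face = X_train.sum(axis=0) / X_train.shape[0]   # mean image (same as X_train.mean(axis=0))
X_centered = X_train - mean_face                     # mean-subtracted data matrix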




(3) You could use from PIL import Image and Image.open to read the images, and use matplotlib.pyplot to show them. Feel free to use other functions to read and process the images as well.
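For example (the filename here is only a placeholder; adapt it to the actual files in the dataset):

from PIL import Image
import numpy as np
import matplotlib.pyplot as plt

img = Image.open("face_001.jpg")          # placeholder filename, not part of the dataset spec
arr = np.asarray(img, dtype=np.float64)   # [256, 256] array with pixel values in [0, 255]

plt.imshow(arr, cmap="gray")
plt.axis("off")
plt.show()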



(4) For PCA, you could use linalg from numpy for the eigendecomposition. Other eigenvalue decomposition/SVD functions can also be used. However, you cannot directly call built-in PCA functions in any language for this problem.
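If you go the numpy.linalg route, one memory-friendly option (a sketch, not the only acceptable approach) is to diagonalize the small [Ntrain, Ntrain] Gram matrix instead of the full [D, D] covariance and then map the eigenvectors back to pixel space:

import numpy as np

# X_centered: [Ntrain, D] mean-subtracted data matrix from hints (1)-(2).
G = X_centered @ X_centered.T                  # small [Ntrain, Ntrain] Gram matrix
vals, vecs = np.linalg.eigh(G)                 # eigenvalues in ascending order
order = np.argsort(vals)[::-1]                 # re-sort in descending order
vecs = vecs[:, order]

# If G v = lambda v, then X_centered^T v is an eigenvector of the covariance
# X_centered^T X_centered with the same eigenvalue, so map back and normalize.
K = 30
eigenfaces = (X_centered.T @ vecs[:, :K]).T    # [K, D]
eigenfaces /= np.linalg.norm(eigenfaces, axis=1, keepdims=True)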



