
Stevens Institute of Technology

Department of Electrical and Computer Engineering

CpE 646 Pattern Recognition and Classification

Homework 4

Problem 1: Once again we use the data sets “hw3_2_1” and “hw3_2_2” from
Homework 3. The sample vectors in “hw3_2_1” are from class ω1 and sample vectors in
“hw3_2_2” are from class ω2.

Use the k-nearest-neighbor method to estimate the class-conditional density functions p(x|ω1) and p(x|ω2) for every x on the grid -4:0.1:8 by -4:0.1:8; use the "mesh" function in Matlab to plot the results; then classify x = [1, -2]t based on the estimates. Let k = 10.

(Hint: The “sort” function in Matlab can be used to find the closest neighbors.)
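A minimal sketch of the k-NN density estimate is below, assuming the Homework 3 samples have been loaded as 2×N matrices named w1 and w2 (the variable names, and the use of p(x) ≈ k/(nV) with V the area of the disk reaching the k-th neighbor, are assumptions, not part of the assignment text):

```matlab
% k-NN estimate of p(x|w1) on the grid, assuming w1 is a 2xN sample matrix
k = 10;
[X, Y] = meshgrid(-4:0.1:8, -4:0.1:8);
p1 = zeros(size(X));
n1 = size(w1, 2);
for i = 1:numel(X)
    % Euclidean distance from grid point (X(i), Y(i)) to every sample
    d = sqrt((w1(1,:) - X(i)).^2 + (w1(2,:) - Y(i)).^2);
    d = sort(d);                 % per the hint: sort to find nearest neighbors
    R = d(k);                    % radius reaching the k-th nearest neighbor
    p1(i) = k / (n1 * pi * R^2); % p(x) ~ k/(n*V), V = pi*R^2 in 2-D
end
mesh(X, Y, p1);                  % surface plot of the estimated density
```

Repeating the loop with w2 gives p(x|ω2); the point x = [1, -2]t can then be assigned to whichever class gives the larger estimated density (equal priors assumed) at the nearest grid point.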

Problem 2: Now we use the data sets “hw4_2_1” and “hw4_2_2” from file “hw4.mat”.
The sample vectors in “hw4_2_1” are from class ω1 and sample vectors in “hw4_2_2”
are from class ω2. Each of these matrices has a size of 2×100, i.e. 2 rows and 100
columns. Each column is a 2-D observation vector.

2.1 Plot the data set in 2-D using Matlab.

2.2 Assume a mapping function which projects each input vector x = [x1, x2]t to x̂ = [x1, x2, x1x2]t; plot x̂ in 3-D using the Matlab function "plot3".
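The two plots of 2.1 and 2.2 can be sketched as follows, assuming "hw4.mat" is on the Matlab path and contains the 2×100 matrices hw4_2_1 and hw4_2_2 as described above:

```matlab
load('hw4.mat');                 % provides hw4_2_1 and hw4_2_2 (2x100 each)

% 2.1: original 2-D data, one marker style per class
figure;
plot(hw4_2_1(1,:), hw4_2_1(2,:), 'bo'); hold on;
plot(hw4_2_2(1,:), hw4_2_2(2,:), 'r+');
xlabel('x_1'); ylabel('x_2'); legend('\omega_1', '\omega_2');

% 2.2: project each x = [x1; x2] to xhat = [x1; x2; x1*x2] and plot in 3-D
figure;
plot3(hw4_2_1(1,:), hw4_2_1(2,:), hw4_2_1(1,:).*hw4_2_1(2,:), 'bo'); hold on;
plot3(hw4_2_2(1,:), hw4_2_2(2,:), hw4_2_2(1,:).*hw4_2_2(2,:), 'r+');
grid on; xlabel('x_1'); ylabel('x_2'); zlabel('x_1 x_2');
```

The third coordinate x1x2 typically makes the two classes separable by a plane even when they are not linearly separable in 2-D, which is the point of the projection.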
2.3 Now define an augmented vector as y(x) = [1, x1, x2, x1x2]t. Use the Batch Perceptron method (pages 35 and 39, CPE646-9) to find the weight vector a = [a0, a1, a2, a3]t in the generalized linear discriminant function (pages 22 and 23, CPE646-9).
(Hint: let  =1,  =1, initialize a (0) = �y )
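A sketch of the batch perceptron loop, assuming Y1 and Y2 are 4×100 matrices whose columns are the augmented vectors y(x) for classes ω1 and ω2 (the variable names, the stopping test, and the initialization a(0) = Σy are assumptions consistent with the hint and the standard batch perceptron rule):

```matlab
% "Normalize" the samples: negate class-2 columns so a correct weight vector
% satisfies a'*y > 0 for every column of Y.
Y = [Y1, -Y2];
a = sum(Y, 2);                        % initialize a(0) as the sum of samples
eta = 1;                              % learning rate eta(k) = 1 (per the hint)
theta = 1;                            % stopping threshold (per the hint)
while true
    mis = Y(:, (a' * Y) <= 0);        % currently misclassified samples
    if isempty(mis), break; end       % all samples correct: done
    grad = sum(mis, 2);               % sum of misclassified samples
    if norm(eta * grad) < theta, break; end   % update too small: stop
    a = a + eta * grad;               % batch update a(k+1) = a(k) + eta*grad
end
```

The resulting a defines the generalized linear discriminant g(x) = a0 + a1 x1 + a2 x2 + a3 x1x2; samples are assigned to ω1 when g(x) > 0 and to ω2 otherwise.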
