
CS464 Introduction to Machine Learning

Fall 2010
Questions 2 – Decision Tree Learning

Q1) Give decision trees to represent the following boolean functions:

- A ∧ ¬B
- A ∨ [B ∧ C]
- A XOR B (in the tree, A1 means A and A2 means B)
- [A ∧ B] ∨ [C ∧ D]
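The XOR case is the one that needs a full two-level tree, since neither attribute alone separates the classes. A quick Python sketch of that tree (root tests A, each branch then tests B), checked against the truth table:

```python
# Decision tree for A XOR B: root node tests A, each branch then tests B.
def xor_tree(a: bool, b: bool) -> bool:
    if a:                # A = True branch
        return not b     # leaf: True exactly when B = False
    else:                # A = False branch
        return b         # leaf: True exactly when B = True

# The tree reproduces the XOR truth table on all four inputs.
for a in (False, True):
    for b in (False, True):
        assert xor_tree(a, b) == (a != b)
```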
Q2) Consider the following set of training examples:

a) What is the entropy of this collection of training examples with respect to the target function classification?

Entropy = -(3/6)log2(3/6) - (3/6)log2(3/6) = 1

b) What is the information gain of a2 relative to these training examples?

E(a2 = T) = -(2/4)log2(2/4) - (2/4)log2(2/4) = 1
E(a2 = F) = -(1/2)log2(1/2) - (1/2)log2(1/2) = 1

Gain(S, a2) = E(S) - [(4/6)E(a2 = T) + (2/6)E(a2 = F)]
            = 1 - [4/6 + 2/6]
            = 1 - 1 = 0
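These values are easy to verify mechanically. A minimal sketch in Python, using only the class counts stated in the calculation above (3 positive and 3 negative overall; a2 = T covers 2+/2-, a2 = F covers 1+/1-):

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

e_s = entropy([3, 3])  # 3 positive, 3 negative -> 1.0

# a2 = T covers 4 examples (2+, 2-); a2 = F covers 2 examples (1+, 1-).
gain_a2 = e_s - ((4/6) * entropy([2, 2]) + (2/6) * entropy([1, 1]))
print(e_s, round(gain_a2, 10))  # 1.0 0.0
```

A gain of exactly 0 means a2 tells us nothing: each of its branches has the same 50/50 class mix as the full set.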

c) Create the decision tree for these training examples using ID3.

Gain(S, a1) > Gain(S, a2), so ID3 selects a1 as the root node.


Q3) Consider the following set of training examples:

Instance  Classification  Attrb1  Attrb2  Attrb3

1         c1              a       T       a
2         c1              a       T       b
3         c2              b       F       c
4         c1              c       T       d
5         c3              a       F       a
6         c3              b       T       b
7         c2              c       F       c
8         c2              b       T       c
9         c1              a       T       a
10        c1              b       F       b

a) What is the entropy of this collection of training examples with respect to the target function classification?

Entropy = -(5/10)log2(5/10) - (2/10)log2(2/10) - (3/10)log2(3/10) ≈ 1.49

b) What is the information gain of Attrb1 relative to these training examples?

E(Attrb1 = a) = -(3/4)log2(3/4) - (1/4)log2(1/4) = 0.811


E(Attrb1 = b) = -(2/4)log2(2/4) - (1/4)log2(1/4) - (1/4)log2(1/4) = 1.5
E(Attrb1 = c) = -(1/2)log2(1/2) - (1/2)log2(1/2) = 1

Gain(S, Attrb1) = E(S) - (4/10)E(Attrb1 = a) - (4/10)E(Attrb1 = b) - (2/10)E(Attrb1 = c)
               ≈ 0.36
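The same arithmetic can be checked directly from the training table. A short Python sketch (each row tuple holds Attrb1, Attrb2, Attrb3, in that order):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def info_gain(rows, attr, labels):
    """ID3 information gain of splitting on column index `attr`."""
    g = entropy(labels)
    for v in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == v]
        g -= (len(subset) / len(rows)) * entropy(subset)
    return g

# Q3 training set: (Attrb1, Attrb2, Attrb3) and classifications.
rows = [('a','T','a'), ('a','T','b'), ('b','F','c'), ('c','T','d'),
        ('a','F','a'), ('b','T','b'), ('c','F','c'), ('b','T','c'),
        ('a','T','a'), ('b','F','b')]
labels = ['c1','c1','c2','c1','c3','c3','c2','c2','c1','c1']

print(round(entropy(labels), 2))             # 1.49
print(round(info_gain(rows, 0, labels), 2))  # 0.36
```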

c) Create the decision tree for these training examples using ID3.

d) Convert the decision tree into the rules.

If (Attrb3 = a) ∧ (Attrb2 = T) Then Classification = c1
If (Attrb3 = a) ∧ (Attrb2 = F) Then Classification = c3
If (Attrb3 = b) ∧ (Attrb1 = a) Then Classification = c1
If (Attrb3 = b) ∧ (Attrb1 = b) ∧ (Attrb2 = T) Then Classification = c3
If (Attrb3 = b) ∧ (Attrb1 = b) ∧ (Attrb2 = F) Then Classification = c1
If (Attrb3 = c) Then Classification = c2
If (Attrb3 = d) Then Classification = c1
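Transcribing the rule set as a function makes it easy to confirm that it reproduces every row of the Q3 training table:

```python
def classify(attrb1, attrb2, attrb3):
    """The rule set above, transcribed branch by branch.
    Combinations not covered by any rule (e.g. Attrb3 = b with
    Attrb1 = c) return None, since the tree has no leaf for them."""
    if attrb3 == 'a':
        return 'c1' if attrb2 == 'T' else 'c3'
    if attrb3 == 'b':
        if attrb1 == 'a':
            return 'c1'
        if attrb1 == 'b':
            return 'c3' if attrb2 == 'T' else 'c1'
    if attrb3 == 'c':
        return 'c2'
    if attrb3 == 'd':
        return 'c1'

# Check against all 10 training examples (Attrb1, Attrb2, Attrb3, class).
data = [('a','T','a','c1'), ('a','T','b','c1'), ('b','F','c','c2'),
        ('c','T','d','c1'), ('a','F','a','c3'), ('b','T','b','c3'),
        ('c','F','c','c2'), ('b','T','c','c2'), ('a','T','a','c1'),
        ('b','F','b','c1')]
assert all(classify(a1, a2, a3) == c for a1, a2, a3, c in data)
```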
Q4) Consider the following set of training examples:

Instance  Classification  Attrb1  Attrb2

1         T               4       F
2         T               10      T
3         F               20      F
4         F               34      T
5         F               50      F
6         F               70      F
7         T               76      F
8         T               80      T
9         F               90      T
10        F               92      T

a) Create the decision tree for these training examples using ID3. Note that Attrb1 is a continuous-valued attribute.

b) Convert the decision tree into the rules.

If (Attrb1 < 15) Then Classification = True
If (Attrb1 > 15) ∧ (Attrb1 < 73) Then Classification = False
If (Attrb1 > 15) ∧ (Attrb1 > 73) ∧ (Attrb1 < 85) Then Classification = True
If (Attrb1 > 15) ∧ (Attrb1 > 73) ∧ (Attrb1 > 85) Then Classification = False
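The thresholds 15, 73, and 85 come from the standard ID3 treatment of a continuous attribute: sort the examples by the attribute value and take as candidate thresholds the midpoints between adjacent values where the class changes, then pick the candidate with the highest information gain. A sketch of that step on the Q4 data:

```python
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def split_gain(values, labels, t):
    """Information gain of splitting the examples at threshold t."""
    left = [l for v, l in zip(values, labels) if v < t]
    right = [l for v, l in zip(values, labels) if v > t]
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

# Q4 data, already sorted by Attrb1.
values = [4, 10, 20, 34, 50, 70, 76, 80, 90, 92]
labels = ['T', 'T', 'F', 'F', 'F', 'F', 'T', 'T', 'F', 'F']

# Candidate thresholds: midpoints between adjacent values whose class differs.
cands = [(values[i] + values[i + 1]) / 2
         for i in range(len(values) - 1) if labels[i] != labels[i + 1]]
best = max(cands, key=lambda t: split_gain(values, labels, t))
print(cands, best)  # [15.0, 73.0, 85.0] 15.0
```

The split at 15 wins at the root, matching the first rule above; repeating the procedure on the Attrb1 > 15 partition produces the 73 and 85 splits.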
