CS 486/686 Lecture 20: Extending Decision Trees

1 Extending Decision Trees

So far, we have learned decision trees for a binary class variable and discrete, noise-free features. This lecture extends the algorithm to handle:

• real-valued features,
• a non-binary class variable,
• noise and overfitting.

1.1 Non-binary class variable

So far, the class variable is binary (Tennis is Yes or No). What if there are more than two classes? Suppose the class is in {c_1, ..., c_l}. The modified ID3 algorithm:

Algorithm 1 ID3(Features, Examples)
1: If all examples belong to the same class i, return a leaf node with decision i.
2: If no features left, return a leaf node with the majority decision of the examples.
3: If no examples left, return a leaf node with the majority decision of the examples in the parent node.
4: else
5:   choose feature f with the maximum information gain
6:   for each value v of feature f do
7:     add arc with label v
8:     add subtree ID3(F − f, {s ∈ S | f(s) = v})
9:   end for

Calculation of information gain: consider feature A with c_i examples in class i, for i = 1, ..., l, and suppose A takes k values v_1, ..., v_k. For j = 1, ..., k, if A takes value v_j, then there are c_i^j examples in class i. Then

Gain(A) = I( c_1/(c_1 + ... + c_l), ..., c_l/(c_1 + ... + c_l) )
          − Σ_{j=1}^{k} (c_1^j + ... + c_l^j)/(c_1 + ... + c_l) · I( c_1^j/(c_1^j + ... + c_l^j), ..., c_l^j/(c_1^j + ... + c_l^j) )

(CQ) Suppose that we are classifying examples into three classes. Before testing feature X, there are 3 examples in class c_1, 5 examples in class c_2, and 2 examples in class c_3. Feature X has two values a and b. When X = a, there is 1 example in class c_1, 5 examples in class c_2, and 0 examples in class c_3. When X = b, there are 2 examples in class c_1, 0 examples in class c_2, and 2 examples in class c_3. What is the information gain for testing feature X at this node?

I(3/10, 5/10, 2/10) = 1.485
6/10 · I(1/6, 5/6, 0/6) + 4/10 · I(2/4, 0/4, 2/4) = 6/10 · 0.65 + 4/10 · 1 = 0.79
Information gain is 1.485 − 0.79 = 0.695.
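To make the gain formula concrete, here is a minimal Python sketch of the multi-class entropy I and the gain computation; the function names and list-based interface are my own choices, and the clicker-question numbers are used only as a check.

    import math

    def entropy(probs):
        """I(p_1, ..., p_l) in bits; terms with p = 0 contribute 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def gain(class_counts, counts_by_value):
        """Information gain of testing a feature.

        class_counts    -- [c_1, ..., c_l], examples in each class before the test
        counts_by_value -- one list [c_1^j, ..., c_l^j] per feature value v_j
        """
        n = sum(class_counts)
        before = entropy([c / n for c in class_counts])
        after = 0.0
        for cj in counts_by_value:
            nj = sum(cj)
            after += nj / n * entropy([c / nj for c in cj])
        return before - after

    # The clicker question above: three classes, feature X with values a and b.
    print(entropy([3/10, 5/10, 2/10]))              # ~1.485
    print(gain([3, 5, 2], [[1, 5, 0], [2, 0, 2]]))  # ~0.695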

1.2 Real-valued features

Real-world problems often have real-valued features. For example, Jeeves could have recorded the temperature as a real-valued feature.

Day  Outlook   Temp  Humidity  Wind    Tennis?
1    Sunny     29.4  High      Weak    No
2    Sunny     26.6  High      Strong  No
3    Overcast  28.3  High      Weak    Yes
4    Rain      21.1  High      Weak    Yes
5    Rain      20.0  Normal    Weak    Yes
6    Rain      18.3  Normal    Strong  No
7    Overcast  17.7  Normal    Strong  Yes
8    Sunny     22.2  High      Weak    No
9    Sunny     20.6  Normal    Weak    Yes
10   Rain      23.9  Normal    Weak    Yes
11   Sunny     23.9  Normal    Strong  Yes
12   Overcast  22.2  High      Strong  Yes
13   Overcast  27.2  Normal    Weak    Yes
14   Rain      21.7  High      Strong  No
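For the Python sketches that follow, assume the Temp column and the Tennis? label are kept as two parallel lists transcribed from this table (the variable names are mine):

    # Temp and Tennis? for Days 1-14, in day order, transcribed from the table above.
    temps  = [29.4, 26.6, 28.3, 21.1, 20.0, 18.3, 17.7,
              22.2, 20.6, 23.9, 23.9, 22.2, 27.2, 21.7]
    labels = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
              "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]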

How should we deal with real-valued features?

Solution 1: Discretize the values. For example:

• Temp < 20.8 –> Cool
• 20.8 ≤ Temp < 25.0 –> Mild
• 25.0 ≤ Temp –> Hot

The advantage is that this is easy to do. The disadvantage is that we lose valuable information: what if the discretization we pick makes the decision tree much more complicated?

Solution 2: Dynamically choose a split point c for a real-valued feature. Split feature f into f < c and f ≥ c. How should we choose the split point c?

1. Sort the instances according to the real-valued feature.
2. Possible split points are the values that are midway between two values that differ in their classification.
3. Determine the information gain for each possible split point and choose the split point with the largest gain.

In more detail, suppose that the feature changes its value from X to Y as we move through the sorted order. Follow the steps below to determine whether we should consider (X + Y)/2 as a possible split point:

(a) For all the data points where the feature takes the value X, gather all the labels into the set L_X.
(b) For all the data points where the feature takes the value Y, gather all the labels into the set L_Y.
(c) If there exists a label a ∈ L_X and a label b ∈ L_Y such that a ≠ b, then we will consider (X + Y)/2 as a possible split point.
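A small Python sketch of this candidate-generation step (the function name is mine): group the labels by feature value, then propose the midpoint of two adjacent values whenever their label sets are not one and the same label.

    from collections import defaultdict

    def candidate_splits(values, labels):
        """Candidate split points for one real-valued feature, per the procedure above."""
        label_sets = defaultdict(set)
        for v, y in zip(values, labels):
            label_sets[v].add(y)          # L_X for each distinct value X
        distinct = sorted(label_sets)
        splits = []
        for x, y in zip(distinct, distinct[1:]):
            # Some label in L_X differs from some label in L_Y
            # exactly when their union contains more than one label.
            if len(label_sets[x] | label_sets[y]) > 1:
                splits.append((x + y) / 2)
        return splits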

Let us apply this to the Jeeves data set. First, sort the data set according to the values of Temp. The sorted data set is below.

Day  Outlook   Temp  Humidity  Wind    Tennis?
7    Overcast  17.7  Normal    Strong  Yes
6    Rain      18.3  Normal    Strong  No
5    Rain      20.0  Normal    Weak    Yes
9    Sunny     20.6  Normal    Weak    Yes
4    Rain      21.1  High      Weak    Yes
14   Rain      21.7  High      Strong  No
8    Sunny     22.2  High      Weak    No
12   Overcast  22.2  High      Strong  Yes
10   Rain      23.9  Normal    Weak    Yes
11   Sunny     23.9  Normal    Strong  Yes
2    Sunny     26.6  High      Strong  No
13   Overcast  27.2  Normal    Weak    Yes
3    Overcast  28.3  High      Weak    Yes
1    Sunny     29.4  High      Weak    No

Whenever the value of Temp changes in its sorted order, we have a possible split point. For example, here are a few possible split points:

• (17.7 + 18.3)/2 = 18
• (22.2 + 23.9)/2 = 23.05

We could choose to test all such midway values as split points. However, the procedure described above tries to go through fewer split point values by only choosing split points for which the two values differ in their classification. Here are a few examples:

• The classification for 17.7 is Yes, whereas the classification for 18.3 is No. Thus, we will consider (17.7 + 18.3)/2 = 18 as a possible split point.
• The classifications for 20.6 and 21.1 are both Yes. Therefore, we will NOT consider the midway value between these two as a possible split point.
• The classification for 21.7 is No, whereas the classifications for the 2 data points with 22.2 are Yes and No. In this case, we will consider (21.7 + 22.2)/2 = 21.95 as a possible split point (because No for 21.7 is different from Yes for 22.2).
• The classifications for the 2 data points with 22.2 are No and Yes, whereas the classifications for the two data points with 23.9 are both Yes. We will consider (22.2 + 23.9)/2 = 23.05 as a possible split point (because No for 22.2 is different from Yes for 23.9).

Following the procedure described above, you should derive 8 possible split points.
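As a check, running the candidate-generation sketch on the temps/labels lists should reproduce this count (assuming the earlier definitions are in scope):

    splits = candidate_splits(temps, labels)
    print(len(splits))   # 8
    print(splits)        # includes 18.0, 21.95 and 23.05 from the examples above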

Here is an example of calculating the information gain for the split point c = (21.7 + 22.2)/2 = 21.95.

• Temp: +: 3, 4, 5, 7, 9, 10, 11, 12, 13. -: 1, 2, 6, 8, 14.
• Temp < 21.95: +: 4, 5, 7, 9. -: 6, 14.
• Temp ≥ 21.95: +: 3, 10, 11, 12, 13. -: 1, 2, 8.

Gain(Temp < 21.95) = I(9/14, 5/14) − [6/14 · I(4/6, 2/6) + 8/14 · I(5/8, 3/8)] = 0.00134

Repeat this for all real-valued features and all split points.

Additional complication: on any path from the root to a leaf,

• any discrete feature is tested at most once, but
• any real-valued feature can be tested many times.

This means that we will have larger trees, and the trees may be difficult to understand.
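Putting the pieces together, here is a sketch of the gain of a single split point, reusing the entropy/gain helpers and the temps/labels lists from the earlier sketches (the function name is mine); it should reproduce the 0.00134 figure above.

    def split_gain(values, labels, c):
        """Information gain of the binary test (feature < c) versus (feature >= c)."""
        classes = sorted(set(labels))                  # e.g. ["No", "Yes"]
        def counts(ys):
            return [sum(1 for y in ys if y == cls) for cls in classes]
        below = [y for v, y in zip(values, labels) if v < c]
        above = [y for v, y in zip(values, labels) if v >= c]
        return gain(counts(labels), [counts(below), counts(above)])

    print(split_gain(temps, labels, 21.95))   # ~0.00134, matching the calculation above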

1.3 Noise and overfitting

Training examples may be misclassified. For example, suppose that the class of Day 3 is corrupted to No.

Day  Outlook   Temp  Humidity  Wind    Tennis?
1    Sunny     Hot   High      Weak    No
2    Sunny     Hot   High      Strong  No
3    Overcast  Hot   High      Weak    No
4    Rain      Mild  High      Weak    Yes
5    Rain      Cool  Normal    Weak    Yes
6    Rain      Cool  Normal    Strong  No
7    Overcast  Cool  Normal    Strong  Yes
8    Sunny     Mild  High      Weak    No
9    Sunny     Cool  Normal    Weak    Yes
10   Rain      Mild  Normal    Weak    Yes
11   Sunny     Mild  Normal    Strong  Yes
12   Overcast  Mild  High      Strong  Yes
13   Overcast  Hot   Normal    Weak    Yes
14   Rain      Mild  High      Strong  No

To fit the corrupted data, we need to expand the sub-tree under Outlook = Overcast, which previously was a single Yes leaf.

[Figure: two decision trees – the original tree, in which Outlook = Overcast leads directly to a Yes leaf, and the tree learned from the corrupted data, in which the Overcast branch is expanded into further tests on Humidity and Wind.]

Result: the tree becomes more complicated just to handle one outlier –> overfitting.

What are the test errors for these two trees?

• With the subtree: 2 errors – this is overfitting!
• With the subtree replaced by a leaf node with Yes: 0 errors. (Recall that the first tree classifies all examples in the test set perfectly.)
