ECS 271, Machine Learning: Homework #2 (Updated)
Due: 15 April 2004

Please put the last 4 digits of your student ID on your answer sheets along with your name.

Instructor: Prof. Rao Vemuri, rvemuri@ucdavis.edu

 

(1)   (50 points) The following data are given to you. You are asked to do the following:

(a)    Build a decision tree and show the calculations you used to build it.

(b)   Once you have the tree, develop logical rules describing the conditions that result in sunburn.

(c)    Then simplify the rule set, if possible, and show the simplified rules.

Given Data

Independent Attributes / Condition Attributes: Name, Hair, Height, Weight, Lotion
Dependent Attribute / Decision Attribute: Result

Name    Hair     Height    Weight    Lotion   Result
Sarah   blonde   average   light     no       sunburned (positive)
Dana    blonde   tall      average   yes      none (negative)
Alex    brown    short     average   yes      none
Annie   blonde   short     average   no       sunburned
Emily   red      average   heavy     no       sunburned
Pete    brown    tall      heavy     no       none
John    brown    average   heavy     no       none
Katie   blonde   short     light     yes      none
 

Answer: A complete solution to this problem will appear elsewhere on the web site. First I will post the solution up to the building of the decision tree. A week later, I will post the solution for the pruned tree.
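
If you want to check your hand calculations, the following minimal Python sketch computes the entropy and information gain for each attribute, assuming ID3-style attribute selection (the problem does not prescribe a particular tree-building method). The data encoding and function names are illustrative; Name is kept out of the candidate split attributes since it merely identifies each row.

import math
from collections import Counter

# The sunburn data from the table above.
DATA = [
    # (hair,   height,    weight,    lotion, result)
    ("blonde", "average", "light",   "no",  "sunburned"),
    ("blonde", "tall",    "average", "yes", "none"),
    ("brown",  "short",   "average", "yes", "none"),
    ("blonde", "short",   "average", "no",  "sunburned"),
    ("red",    "average", "heavy",   "no",  "sunburned"),
    ("brown",  "tall",    "heavy",   "no",  "none"),
    ("brown",  "average", "heavy",   "no",  "none"),
    ("blonde", "short",   "light",   "yes", "none"),
]
ATTRS = ["hair", "height", "weight", "lotion"]

def entropy(rows):
    # Shannon entropy (in bits) of the result column.
    counts = Counter(row[-1] for row in rows)
    n = len(rows)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_gain(rows, i):
    # Entropy reduction obtained by splitting the rows on attribute i.
    n = len(rows)
    remainder = 0.0
    for v in set(row[i] for row in rows):
        subset = [row for row in rows if row[i] == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(rows) - remainder

for i, name in enumerate(ATTRS):
    print("gain(%s) = %.4f" % (name, information_gain(DATA, i)))

The attribute with the largest gain becomes the root of the tree; the same computation is then repeated on each resulting subset.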

 

(2)   (20 points) Train a two-input, one-output Perceptron to simulate the Boolean AND function. The x0 input is always 1.

Start with three randomly selected weights w0, w1, and w2 in the range (0, 1). Repeatedly adjust the weights using the Perceptron learning rule until you get convergence. You may have to present the inputs several times before you see convergence. You are welcome to write a small piece of code, as it may be tedious to do this manually. Plot the resulting straight line in the x1-x2 plane. Repeat with another randomly selected initial weight set. Plot the resulting separating line. Did you get the same line?
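
Since you are welcome to write code, the following minimal Python sketch trains the Perceptron on the AND patterns. It assumes a step activation that outputs 1 when the net input is non-negative, and a learning rate of 0.1; neither choice is dictated by the problem, and other reasonable choices work as well.

import random

# AND training patterns; x0 is the constant bias input, always 1.
PATTERNS = [
    ((1, 0, 0), 0),
    ((1, 0, 1), 0),
    ((1, 1, 0), 0),
    ((1, 1, 1), 1),
]

def step(net):
    # Threshold activation: output 1 when the net input is non-negative.
    return 1 if net >= 0 else 0

def train(eta=0.1, seed=None):
    rng = random.Random(seed)
    w = [rng.random() for _ in range(3)]  # w0, w1, w2 drawn from (0, 1)
    epochs = 0
    while True:
        epochs += 1
        errors = 0
        for x, target in PATTERNS:
            y = step(sum(wi * xi for wi, xi in zip(w, x)))
            if y != target:
                errors += 1
                # Perceptron learning rule: w <- w + eta * (target - y) * x
                w = [wi + eta * (target - y) * xi for wi, xi in zip(w, x)]
        if errors == 0:  # one full pass with no mistakes: converged
            return w, epochs

w, epochs = train(seed=1)
print("converged after", epochs, "epochs; weights:", w)
# The separating line in the x1-x2 plane is w0 + w1*x1 + w2*x2 = 0,
# i.e. x2 = -(w0 + w1*x1) / w2. Different initial weights generally
# yield a different (but still separating) line.

The line w0 + w1*x1 + w2*x2 = 0 can then be plotted in the x1-x2 plane with any plotting tool, along with the four input points.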