ECS 271: Machine Learning and Discovery (3 units)
Instructor: Vemuri, Rao
TR 12:10-1:30 PM, 146 Robbins (tentative)
Enrollment limit: 25

ECS 271, Machine Learning; Fall 2005
CRN:  ?????

Score card

Instructor: Prof. Rao Vemuri, rvemuri@ucdavis.edu

Lecture Times: 12:10-1:30 PM, Tuesday and Thursday

Office Hours: (in Davis)  By appointment

Office Location: 236 Walker Hall

Lecture Hall: ???

Prerequisites
1. Graduate standing in the College of Engineering or permission of the instructor.
2. A first course in probability and statistics, such as Stat 131A.
3. A background in AI (ECS 170) will make this course easier, but it is not essential. Students who took ECS 170 or an equivalent will have a decided advantage, as some of the topics will be repeated here at a faster pace and in greater depth.
4. As graduate students, you are expected to have good programming skills (C, C++, Java, LISP, or Prolog).
Please talk to me if you have any concerns.

Text Book

Tom M. Mitchell, Machine Learning, McGraw-Hill. ISBN 0-07-042807-7

Grading

40% for a Project, 60% for Homework and Exams (sample exam)

There will be several homework assignments (approx. one set per week), one midterm and one final.

Project: 40% (Due on the last day of classes)

Midterm: 30%

Final: 10% (Take-home: you read and grade two project reports prepared by two other students and turn in your evaluations within 24 hours)

Homework sets: 20%

Your Grades

Course Description

The field of machine learning is concerned with constructing computer programs that automatically improve with experience. Machine learning draws on concepts from many fields, including statistics, artificial intelligence, cognitive theory, computational complexity, and control theory. The goal of this course is to present the key algorithms and theory that form the core of machine learning, with a balanced treatment of theory and practice.

A combination of analytical skills and programming skills is expected of the students wishing to enroll in this class. The project is essentially an implementation of one or two algorithms discussed in the class.
 

TOPIC OUTLINE

The topics for the last two weeks will be decided based on what a majority of the students would like to see covered.
The PostScript files are viewgraphs supplied by the author of the textbook.

1. Introduction, Read Chapter 1, slides (pdf)
Lecture 0 Slides
           Organization of the course
           Term paper, homework, and exams
           Well-posed learning systems
           Deciding to play tennis or not – a model problem
           Perspectives and issues in machine learning

Homework#1

2. Decision Tree Learning, Read Chapter 3, slides (pdf)
Lecture 3 Slides
            Decision Trees – Overview
            Decision Trees – Construction
            Decision Trees – Pruning

Homework#2  Do problems on decision trees: build a decision tree, prune it, and generate rules from it (an information-gain sketch appears after the solution links below).

Solution to HW2 problem, Part 1: Decision tree construction part

Solution to HW2 Problem, Part 2: Decision tree pruning part
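For orientation on Homework#2, here is a minimal sketch (in Java, one of the languages listed under the prerequisites) of the entropy and information-gain calculation that ID3-style tree construction maximizes at each split. The class name and the example counts are illustrative choices, not the homework data.

    // A minimal sketch of entropy and information gain for ID3-style
    // decision-tree construction (Chapter 3). Counts are illustrative.
    public class EntropyDemo {
        // Entropy of a boolean-labeled sample with pos/neg counts.
        static double entropy(int pos, int neg) {
            double total = pos + neg, e = 0.0;
            for (double p : new double[] { pos / total, neg / total }) {
                if (p > 0) e -= p * (Math.log(p) / Math.log(2));
            }
            return e;
        }

        public static void main(String[] args) {
            // 9 positive and 5 negative examples before the split.
            double before = entropy(9, 5);
            // A hypothetical attribute splits them into subsets of
            // size 8 (6+, 2-) and size 6 (3+, 3-).
            double after = (8.0 / 14) * entropy(6, 2) + (6.0 / 14) * entropy(3, 3);
            System.out.println("Information gain = " + (before - after));
        }
    }

The attribute with the largest gain becomes the test at the current node; pruning then removes subtrees that do not improve accuracy on held-out validation data.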

3. Artificial Neural Networks, Chapter 4
Lecture 4 Slides
            Perceptron Learning
            Learning via multi-layer feed-forward networks
            Perceptron Learning and Support Vector Machines

 

Homework#3 Problems on activation functions (pdf file). Posted on 16 April, due on 22 April 2004.

 

Homework#4 Problem on backpropagation and a couple of theory problems. Posted on 19 April, due on 29 April 2004. (A perceptron-rule sketch appears below.)
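As a companion to the perceptron material in this unit, the following is a minimal sketch of the perceptron training rule from Chapter 4, w_i <- w_i + eta (t - o) x_i. The logical-AND data set, learning rate, and epoch count are illustrative choices, not part of any assignment.

    // A minimal sketch of perceptron training on the logical-AND data set.
    public class PerceptronDemo {
        public static void main(String[] args) {
            double[][] x = { {1,0,0}, {1,0,1}, {1,1,0}, {1,1,1} }; // x0 = 1 is the bias input
            int[] t = { -1, -1, -1, 1 };                            // bipolar targets for AND
            double[] w = new double[3];
            double eta = 0.1;
            for (int epoch = 0; epoch < 20; epoch++) {
                for (int n = 0; n < x.length; n++) {
                    double sum = 0;
                    for (int i = 0; i < 3; i++) sum += w[i] * x[n][i];
                    int o = sum > 0 ? 1 : -1;                       // thresholded output
                    // Perceptron training rule: no change when o == t.
                    for (int i = 0; i < 3; i++) w[i] += eta * (t[n] - o) * x[n][i];
                }
            }
            System.out.println("w = " + java.util.Arrays.toString(w));
        }
    }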

 

4. Computational Learning Theory, Chapter 7

Intro to PAC learning

PAC-learnability
            Intro to VC Dimension

Homework#5 Problems on PAC-learning and VC-dimension. Posted on 30 April, due on 6 May 2004.

HW#3. Do a problem on hypothesis spaces and PAC learning.

Solution to HW#3

hw4.htm  Do the perceptron problem, the shattering exercises, and backpropagation (BP).

Solution to HW#4 (shattering question only)
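A reference point for the PAC-learning problems: for a consistent learner over a finite hypothesis space H, the standard sample-complexity bound from the computational learning theory chapter is

    m \ge \frac{1}{\epsilon}\left( \ln|H| + \ln\frac{1}{\delta} \right),

which guarantees that, with probability at least 1 - \delta, every hypothesis consistent with the m training examples has true error at most \epsilon. For infinite hypothesis spaces, the VC dimension of H takes over the role played here by \ln|H|.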

5. Bayesian Learning, Chapter 6
     Bayes Classifiers
     Bayesian Belief Networks

Solution to Midterm Examination, Spring 2004

Homework#6 Problems on Probability and Bayesian nets. Due on 27 May 2004
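Two formulas worth keeping in view for Homework#6 (Chapter 6): the MAP hypothesis and the naive Bayes classifier,

    h_{MAP} = \arg\max_{h \in H} P(D \mid h)\, P(h), \qquad
    v_{NB} = \arg\max_{v_j \in V} P(v_j) \prod_i P(a_i \mid v_j),

where the naive Bayes form assumes the attribute values a_i are conditionally independent given the target value v_j.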

6. Evolutionary Learning, Chapter 9
      Genetic Algorithms – short tutorial
      Genetic Programming – short tutorial

Homework#7 Problems on Probability and Genetic Algorithms
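For the genetic-algorithm part of Homework#7, this minimal sketch shows the two bit-string operators at the core of a simple GA (Chapter 9): single-point crossover and point mutation. Fitness evaluation and selection are omitted, and the example strings and mutation rate are illustrative choices.

    import java.util.Random;

    // Bit-string operators for a simple genetic algorithm (Chapter 9).
    public class GaOperatorsDemo {
        static Random rng = new Random();

        // Single-point crossover: swap the tails of two parent strings.
        static String[] crossover(String p1, String p2) {
            int cut = 1 + rng.nextInt(p1.length() - 1);
            return new String[] {
                p1.substring(0, cut) + p2.substring(cut),
                p2.substring(0, cut) + p1.substring(cut)
            };
        }

        // Point mutation: flip each bit with a small probability.
        static String mutate(String s, double rate) {
            StringBuilder sb = new StringBuilder(s);
            for (int i = 0; i < sb.length(); i++) {
                if (rng.nextDouble() < rate) {
                    sb.setCharAt(i, sb.charAt(i) == '0' ? '1' : '0');
                }
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            String[] children = crossover("1101011", "0010100");
            System.out.println(children[0] + " " + children[1]);
            System.out.println(mutate(children[0], 0.1));
        }
    }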

7. Concept Learning, Read Chapter 2, slides (pdf)
Lecture 1 Slides – An Overview of Learning Problems

Lecture 2 Slides
        Concept learning as search
        Version spaces
        Inductive bias

Worked-out example on version spaces
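To complement the worked-out example, here is a minimal sketch of the FIND-S algorithm from Chapter 2: start from the most specific hypothesis and generalize each attribute just enough to cover every positive example. The attribute values are illustrative, in the style of the enjoy-sport example, not homework data.

    // FIND-S: compute the maximally specific hypothesis consistent
    // with the positive training examples (Chapter 2).
    public class FindSDemo {
        public static void main(String[] args) {
            String[][] positives = {
                { "Sunny", "Warm", "Normal", "Strong" },
                { "Sunny", "Warm", "High",   "Strong" }
            };
            String[] h = null;                          // null = most specific hypothesis
            for (String[] x : positives) {
                if (h == null) { h = x.clone(); continue; }
                for (int i = 0; i < h.length; i++) {
                    if (!h[i].equals(x[i])) h[i] = "?"; // generalize a mismatching attribute
                }
            }
            System.out.println(java.util.Arrays.toString(h));
        }
    }

Candidate-Elimination extends this idea by also maintaining the set of maximally general consistent hypotheses, and the two boundaries together delimit the version space.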



8. Instance-Based Learning, Chapter 8
Week 9

     k-nearest neighbor learning
     Radial basis functions
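A minimal sketch of k-nearest-neighbor classification from Chapter 8: predict the majority label among the k training points closest to the query under Euclidean distance. The 2-D points, labels, and choice of k = 3 are illustrative.

    import java.util.Arrays;
    import java.util.Comparator;

    // k-nearest-neighbor classification with Euclidean distance.
    public class KnnDemo {
        public static void main(String[] args) {
            double[][] train = { {1, 1}, {1, 2}, {2, 1}, {6, 6}, {7, 6}, {6, 7} };
            int[] label = { 0, 0, 0, 1, 1, 1 };
            double[] query = { 2, 2 };
            int k = 3;

            // Sort training indices by distance to the query point.
            Integer[] idx = { 0, 1, 2, 3, 4, 5 };
            Arrays.sort(idx, Comparator.comparingDouble(
                i -> Math.hypot(train[i][0] - query[0], train[i][1] - query[1])));

            // Majority vote over the k nearest neighbors.
            int votes = 0;
            for (int i = 0; i < k; i++) votes += label[idx[i]];
            System.out.println("predicted class: " + (votes * 2 > k ? 1 : 0));
        }
    }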

9. Reinforcement Learning, Chapter 13

Part 1.

Part 2. TD learning
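For Part 2, the basic temporal-difference update, TD(0), moves the value estimate of the current state toward the one-step bootstrapped return:

    V(s) \leftarrow V(s) + \alpha \left[ r + \gamma V(s') - V(s) \right],

where \alpha is the learning rate, \gamma the discount factor, and s' the state reached after receiving reward r. The Q-learning rule has the same shape, with Q(s, a) in place of V(s) and a max over the next action inside the bracket.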

Useful Sites

UCI

CMU

AAAI

JMLR

Face Recognition Home Page

Support Vector Machines:

 

Department of Computer Science

UC Davis

Last updated on 25 March 2003