CS2411 Subsymbolic Artificial Intelligence and Neural Networks
2004-2005

Ask a question
Ask a question or search the FAQ. Click the "email" button on the FAQ page to
submit a question, or browse the questions already asked (there is not much
there as yet).
Announcements
11/01/05: A sheet giving some exam advice is available.
30/09/04: Monday lab and examples class sessions will not run (unless class size increases greatly).
30/09/04: Welcome. The course starts today.
Lab and Example Class Timetable
Course Handouts

First day handout: General information about the course.
Neural Networks I: General aspects of neural networks and simple perceptrons. As gzipped Postscript or as PDF.
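The simple perceptron covered in this handout learns by updating its weights only on misclassified examples. A minimal sketch in Python (the labs use Matlab; all function names here are my own invention, not from the handout):

```python
# Perceptron learning rule sketch: on a mistake, update
#   w <- w + eta * y * x,  b <- b + eta * y   (labels y in {-1, +1})

def train_perceptron(samples, labels, eta=1.0, epochs=20):
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != y:  # update only on misclassified examples
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

On a linearly separable problem such as logical AND, the rule converges to a separating line after a few epochs.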
Generalization and Testing: Elements of machine learning, and how to test a learned system. As gzipped Postscript or as PDF.
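The basic way to test a learned system is to hold back part of the data and measure accuracy only on examples the learner never saw. A minimal Python sketch of that idea (names are my own, not from the handout):

```python
import random

def train_test_split(data, test_fraction=0.25, seed=0):
    """Hold out a random fraction of the data for testing."""
    items = list(data)
    random.Random(seed).shuffle(items)
    n_test = int(len(items) * test_fraction)
    return items[n_test:], items[:n_test]   # (training set, test set)

def accuracy(classifier, test_set):
    """Fraction of held-out examples the learned system gets right."""
    correct = sum(1 for x, label in test_set if classifier(x) == label)
    return correct / len(test_set)
```

Accuracy on the held-out set estimates generalization; accuracy on the training set does not, since the learner may simply have memorized it.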
Neural Networks II: Multilayer perceptrons and gradient descent learning. As gzipped Postscript or as PDF.
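Gradient descent, the training method behind multilayer perceptrons, is easiest to see on a single linear neuron. A one-weight sketch in Python (illustrative only; the handout's notation may differ):

```python
def gradient_descent_fit(xs, ys, eta=0.01, steps=500):
    """Fit the single weight of a linear neuron y = w*x by gradient
    descent on the squared error E(w) = 0.5 * sum((y - w*x)**2).
    The gradient is dE/dw = -sum((y - w*x) * x)."""
    w = 0.0
    for _ in range(steps):
        grad = -sum((y - w * x) * x for x, y in zip(xs, ys))
        w -= eta * grad   # step downhill on the error surface
    return w
```

Backpropagation extends exactly this idea to many weights in many layers, using the chain rule to get each weight's gradient.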
Slides from a lecture on applications of MLPs, as PDF.
Neural Networks III: Improving the performance of neural networks. As gzipped Postscript or as PDF.
Probability I: A review of probability and Bayes Rule. As gzipped Postscript or as PDF.
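Bayes Rule for a binary hypothesis can be checked with a few lines of arithmetic. A small Python sketch (the numbers below are my own illustration, not from the handout):

```python
def posterior(prior, likelihood, likelihood_other):
    """Bayes Rule for a binary hypothesis:
    P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|~H)P(~H)]."""
    evidence = likelihood * prior + likelihood_other * (1.0 - prior)
    return likelihood * prior / evidence

# Illustrative numbers: a rare class (1% prior) and evidence seen
# 90% of the time in the class but 10% of the time otherwise.
p = posterior(prior=0.01, likelihood=0.9, likelihood_other=0.1)
```

Even with strong evidence, the small prior keeps the posterior low (about 0.083 here), which is the classic lesson of Bayes Rule.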
Bayesian Classification: Bayesian classification and probability estimation. As Postscript or as PDF.
Document Classification: Bayesian classification as applied to document classification. Also includes a section on online learning of probability estimates. As Postscript or as PDF.
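A word-count Bayesian classifier of this kind (as in the spam-filter lab) can be sketched briefly. This is a Python illustration under my own naming, using add-one smoothing; counts are updated one document at a time, which gives the online learning of probability estimates mentioned above:

```python
import math
from collections import defaultdict

class NaiveBayesFilter:
    """Word-count Bayesian classifier with add-one smoothing.
    Counts update one document at a time (online learning)."""

    def __init__(self, classes=("spam", "ham")):
        self.word_counts = {c: defaultdict(int) for c in classes}
        self.doc_counts = {c: 0 for c in classes}
        self.total_words = {c: 0 for c in classes}
        self.vocab = set()

    def learn(self, words, label):
        self.doc_counts[label] += 1
        for w in words:
            self.word_counts[label][w] += 1
            self.total_words[label] += 1
            self.vocab.add(w)

    def classify(self, words):
        n_docs = sum(self.doc_counts.values())
        v = len(self.vocab)

        def log_posterior(c):
            score = math.log(self.doc_counts[c] / n_docs)  # log prior
            for w in words:
                # smoothed estimate of P(word | class)
                p = (self.word_counts[c][w] + 1) / (self.total_words[c] + v)
                score += math.log(p)
            return score

        return max(self.doc_counts, key=log_posterior)
```

Working in log probabilities avoids underflow when documents contain many words.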
Decision Theory: Given the probabilities of the classes, what is the best classification to make? Includes discussion of risk and ROC curves. As Postscript or as PDF.
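The core decision-theoretic idea, choosing the action with minimum expected loss, fits in a few lines. A Python sketch (the loss numbers are my own illustration, not from the handout):

```python
def best_action(posteriors, loss):
    """Pick the action with minimum expected loss (risk):
    R(a) = sum over classes c of loss[(a, c)] * P(c | x)."""
    actions = {a for (a, _) in loss}

    def risk(a):
        return sum(loss[(a, c)] * p for c, p in posteriors.items())

    return min(actions, key=risk)

# Illustrative spam-filter losses: deleting a real message is
# assumed far worse than keeping a spam one.
loss = {("delete", "spam"): 0, ("delete", "ham"): 10,
        ("keep", "spam"): 1, ("keep", "ham"): 0}
```

With these losses a message that is 60% likely to be spam is still kept, because the asymmetric risk makes deletion too costly; only a much higher posterior tips the decision.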
Search I: The general problem of search, greedy search, and hillclimbing. As Postscript or as PDF.
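Hillclimbing, the simplest greedy search, can be sketched in a few lines of Python (names are my own, not from the handout):

```python
def hill_climb(start, neighbours, score, max_steps=1000):
    """Greedy local search: repeatedly move to the best-scoring
    neighbour; stop at a local maximum (no neighbour improves)."""
    current = start
    for _ in range(max_steps):
        best = max(neighbours(current), key=score)
        if score(best) <= score(current):
            return current   # local maximum reached
        current = best
    return current
```

The well-known weakness is visible in the stopping condition: the search halts at any local maximum, which need not be the global one.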
Search II: Genetic algorithms. As Postscript or as PDF.
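A genetic algorithm in its simplest generational form, with tournament selection, one-point crossover, and bit-flip mutation, can be sketched as follows. This is a Python illustration with my own parameter choices (the handout's formulation may differ), shown on the standard "count the ones" toy problem:

```python
import random

def genetic_algorithm(fitness, length, pop_size=30, generations=60,
                      mutation_rate=0.02, rng=None):
    """Minimal generational GA over bitstrings: tournament selection,
    one-point crossover, bit-flip mutation. Maximizes `fitness`."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]            # one-point crossover
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]               # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy problem: maximize the number of 1 bits in a 20-bit string.
best = genetic_algorithm(sum, 20)
```

Selection pushes the population toward fitter strings, crossover recombines them, and mutation keeps diversity; on this toy problem the population converges quickly toward the all-ones string.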
Lab Information

Matlab Tutorial:
Matlab Tutorial in PDF or in Postscript.
This gives an introduction to Matlab. Work through it before the first
lab. There are problems in the tutorial, but they are not part of the
lab; they are optional and are there just to help you learn Matlab.

Lab 1: Introduction to Matlab and Classification

Description of Lab 1

Lab 2: Classification using Multilayer Perceptrons

Instructions for Lab 2

Lab 3: Bayesian Spam Filter

Commands for Lab 3 in PDF
Note: Lab 3 is a two session lab.
Hints for Lab 3

Lab 4: Genetic Algorithms

Commands for Lab 4 in PDF
Examples Classes
Questions

First Examples Class questions (a sheet of the questions only).
Second Examples Class questions
Third Examples Class questions
Fourth Examples Class questions
Fifth Examples Class questions
Answers
Answers to the first Examples Class questions, as PDF or as Postscript.
Answers to the second Examples Class questions
Answers to the third Examples Class questions
Answers to the fourth Examples Class questions
Answers to the fifth Examples Class questions
Reading Lists by Topic
A reading list by topic is here.
Links

A page on Multilayer Perceptrons from Jim Marshall at Pomona College. Includes backpropagation and applications.
An Introduction to Neural Networks, an online book by Krose and
van der Smagt. Look at sections 3.1 and 3.2 on perceptrons, and chapter 4 on
backpropagation (we need not worry about the mathematical details of
backpropagation; focus on the performance details).

Ant Colony Optimization Applet

I mentioned ant colony optimization for the TSP in the first lecture as
an example of the subsymbolic approach to AI. Here is a link to an applet
created by Mark C. Sinclair showing this running on a very simple
problem.

Elastic Net Applet

The elastic net was mentioned in the first lecture as
an example of the subsymbolic approach to AI. Here is a link to an applet
created by Alexander Budnik and Tanya Filipova showing this running
on a very simple problem.