CS 457/557: Computational Intelligence

Spring 2016

Meeting Times

Lect: 1:00 - 1:50 M, Tu, Wed, HB 116

Labs: 1:00 - 1:50 Th, HB 206


Dr. Razvan Andonie, HB 214-E, Office hours


Stephen Marsland, Machine Learning: An Algorithmic Perspective, CRC Press, Second Edition, 2015



This course introduces concepts, models, algorithms, and tools for the development of intelligent systems. Example topics include artificial neural networks, genetic algorithms, fuzzy systems, swarm intelligence, ant colony optimization, artificial life, and hybridizations of these techniques. We will look at these techniques from a machine learning perspective. This domain, called Computational Intelligence, is a numerical interpretation of biological intelligence.

Topics & Student Learning Outcomes

Neural networks, associative memories, vector quantization, self-organizing feature maps, support vector machines, genetic algorithms, fuzzy neural networks, swarm intelligence, ant colony optimization, decision trees, ensemble learning, nearest neighbor methods, Gaussian mixture models, principal component analysis, independent component analysis, hill climbing, reinforcement learning, Markov decision processes, simulated annealing, hidden Markov models, Bayesian networks.

On the completion of this course, the student will have:

An understanding of fundamental computational intelligence and machine learning models.

Implemented neural networks, genetic algorithms, and other computational intelligence and machine learning algorithms.

Applied computational intelligence and machine learning techniques to classification, prediction, pattern recognition, and optimization problems.


There are no exams. Grading will be based 50% on assignments and 50% on the Final Project.

Grade Distribution:

95 - 100   A

90 - 94    A-

87 - 89    B+

83 - 86    B

80 - 82    B-

77 - 79    C+

73 - 76    C

70 - 72    C-

67 - 69    D+

63 - 66    D

60 - 62    D-

0 - 59     F

Final Project

The Final Project should be submitted as a technical report and presented in class. The CS 557 projects are more complex and will be graded with higher expectations. The project will be graded as follows: 1/3 code, 1/3 document, and 1/3 presentation. The report should contain: problem description, benchmarking, and discussion. Guidelines can be found here. The presentation should consist of 5-7 slides and take about 10 minutes, including questions and answers. Presentation guidelines can be found here. Submit a zipped folder containing the code and the technical report.

Here is the list of projects from previous years.

Course Materials

The Python code and the datasets from the textbook can be found here. Here is a nice neural network simulation website. It lets you build, simulate and visualize a basic neural network during training.


You are encouraged to use Python and Weka. The examples in the textbook are written in Python and can be downloaded from here. To install Python, use the WinPython installation package, and use the included Spyder scientific development environment. You may install the 32- or 64-bit version, but make sure you install Python 2 (not 3), because the examples in the book are written in Python 2. It is a good idea to install WinPython on your own computer or on a flash drive, because in the lab you do not have administrator privileges to initialize Spyder. Good links for scientific programming in Python can be found at Numeric & Scientific Computation with Python.

Weka is excellent for benchmarking and testing. Starting with Weka 3.7, a package manager gives you access to many useful packages. You may use the following Weka books:

Witten, I. H., Frank, E., Hall, M. A. Data Mining: Practical Machine Learning Tools and Techniques, Morgan Kaufmann, 2011.

Kaluza, B. INSTANT Weka How-to, Packt Publishing, 2013.

WinPython and Weka are installed in our labs. Other free computational intelligence software packages include:

Stuttgart Neural Simulator

Computational Intelligence Library (CIlib)

Java Object Oriented Neural Engine (JOONE)

ECJ 20: A Java Based Evolutionary Computation and Genetic Programming Research System

LIBSVM – A Library for Support Vector Machines


Keras: Deep Learning library for Theano and TensorFlow

Datasets for benchmarking:

UCI Machine Learning Repository

UCI Weka datasets in arff format

Free Additional Reading

Rojas, R. Neural Networks – A Systematic Introduction, Springer, 1996.

Hastie, T., Tibshirani, R., and Friedman, J. The Elements of Statistical Learning, Springer, 2009.

Goodfellow, I., Bengio, Y., and Courville, A. Deep Learning, MIT Press.

Course Schedule






General Presentation



Introduction, Python for machine learning

Ch 1 & Appendix A


Weka Explorer

Weka documentation


Weka: Experimenter and KnowledgeFlow

Preliminaries in Machine Learning

Ch 2



Overview of Neural Networks

Ch 2 & 3



Learning in NN: unsupervised/supervised, classification/regression, learning rules (Hebb, perceptron, delta, winner-takes-all), examples

Ch 3

Start HW 1 (graded)


Perceptron Convergence Theorem

Ch 3
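As a preview of the Ch. 3 material, the perceptron learning rule can be sketched in a few lines of NumPy. This is an illustrative sketch only (written for Python 3 and modern NumPy, unlike the book's Python 2 code), and all names are made up; it is not the textbook's implementation:

```python
import numpy as np

def train_perceptron(X, y, eta=0.1, epochs=20):
    # Augment inputs with a -1 bias column so the threshold is learned
    # as just another weight.
    X = np.hstack([X, -np.ones((X.shape[0], 1))])
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w > 0 else 0
            # Perceptron rule: nudge weights toward misclassified targets.
            w += eta * (target - pred) * xi
    return w

def predict(w, X):
    X = np.hstack([X, -np.ones((X.shape[0], 1))])
    return (X @ w > 0).astype(int)

# Learn a linearly separable function: logical AND of two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
```

The Perceptron Convergence Theorem covered in lecture guarantees that, for linearly separable data like this, the loop above reaches a separating weight vector after finitely many updates.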



Multi-Layer Perceptron, Backpropagation

Ch 4

HW 1 due

Start HW 2 (graded)
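The backpropagation algorithm from Ch. 4 can be sketched as a one-hidden-layer network trained on XOR. This is a bare-bones NumPy illustration with made-up hyperparameters, not the textbook's code and not a template for the homework:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, one sigmoid output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 1.0
losses = []
for _ in range(4000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    losses.append(float((err ** 2).mean()))
    # Backward pass: chain rule through the sigmoids, layer by layer.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ d_out; b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * X.T @ d_h;  b1 -= eta * d_h.sum(axis=0)
```

Plotting `losses` shows the mean squared error falling as gradient descent proceeds, which is a quick sanity check you can reuse in your own implementations.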


Universal Approximation Theorem, Time-Series Prediction

Ch 4


Data Pre-Processing

Ch. 4


Training, Testing, and Validation

Ch. 4

HW 2 due

Start HW 3 (graded)


Evolutionary Learning, Genetic Algorithms

Ch. 10



Genetic Algorithms, The Fundamental Theorem of Genetic Algorithms

Ch. 10
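A genetic algorithm in the Ch. 10 style can be sketched on the toy "onemax" problem (maximize the number of 1-bits). The parameter choices below are illustrative, not from the textbook:

```python
import random

def onemax(bits):
    # Fitness: count of 1-bits; the optimum is the all-ones string.
    return sum(bits)

def genetic_algorithm(n_bits=20, pop_size=30, generations=60,
                      p_mut=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection: the fitter of two random parents wins.
            p1 = max(rng.sample(pop, 2), key=onemax)
            p2 = max(rng.sample(pop, 2), key=onemax)
            # Single-point crossover.
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]
            # Bitwise mutation.
            child = [b ^ 1 if rng.random() < p_mut else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=onemax)

best = genetic_algorithm()
```

Selection, crossover, and mutation are exactly the three operators whose interplay the Fundamental Theorem (schema theorem) analyzes.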



Unsupervised Learning: k-Means, LVQ

Ch. 14

HW 3 due

Start HW 4 (graded)
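The k-means algorithm from Ch. 14 alternates two steps: assign each point to its nearest center, then move each center to the mean of its cluster. A minimal sketch (not the textbook's code; the deterministic initialization is a simplification for the demo, real code would randomize it):

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    # Naive deterministic init: evenly spaced data points.
    centers = X[::max(1, len(X) // k)][:k].copy()
    for _ in range(n_iter):
        # Assignment step: nearest center for every point.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each center to the mean of its cluster.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),
               rng.normal(5, 0.3, (50, 2))])
centers, labels = kmeans(X, 2)
```

LVQ, covered the same week, uses a similar nearest-prototype assignment but updates the prototypes with class labels in a supervised way.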


Unsupervised Learning: SOM

Ch. 14


Dimensionality Reduction & Feature Selection

Ch. 6


Dimensionality Reduction: PCA

Ch. 6
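PCA as covered in Ch. 6 can be computed via the SVD of the centered data matrix. A sketch with made-up demo data (data stretched along a diagonal, so the first principal component should capture almost all of the variance):

```python
import numpy as np

def pca(X, n_components):
    Xc = X - X.mean(axis=0)                  # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]           # principal directions
    projected = Xc @ components.T            # coordinates in the new basis
    explained_var = S[:n_components] ** 2 / (len(X) - 1)
    return projected, components, explained_var

rng = np.random.default_rng(0)
t = rng.normal(0, 3, 200)
X = np.column_stack([t, t]) + rng.normal(0, 0.1, (200, 2))
Z, comps, var = pca(X, 2)
```

Comparing `var[0]` to `var[1]` shows how dimensionality reduction works here: one direction carries nearly all of the variance, so the second coordinate can be dropped with little loss.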


Radial Basis Function Network

Ch. 5


Support Vector Machines (SVM)

Ch. 8




Swarm Intelligence & Ant Colony Optimization

Dorigo and Parpinelli papers

HW 4 due

Start HW 5 (graded)


Optimization and Search, Levenberg-Marquardt, Simulated Annealing

Ch. 9


Optimization and Search, Levenberg-Marquardt, Simulated Annealing

Ch. 9
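Simulated annealing from Ch. 9 can be sketched on a one-dimensional function with many local minima. The cooling schedule, step size, and test function below are illustrative choices, not the textbook's:

```python
import math
import random

def f(x):
    # Many local minima; the global minimum is near x ~ -0.51, f ~ -9.7.
    return x * x + 10 * math.sin(3 * x)

def anneal(seed, t0=10.0, t_min=1e-3, alpha=0.99):
    rng = random.Random(seed)
    x = best = rng.uniform(-4, 4)
    t = t0
    while t > t_min:
        cand = x + rng.gauss(0, 1.0)
        delta = f(cand) - f(x)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / T), which shrinks as T cools.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if f(x) < f(best):
            best = x
        t *= alpha            # geometric cooling
    return best

# A few random restarts make the stochastic search more reliable.
best = min((anneal(seed) for seed in range(5)), key=f)
```

Unlike hill climbing, the occasional accepted uphill move lets the search escape the shallow local minima of f before the temperature freezes it in place.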



SOURCE presentations (12:40 – 2:00) – no class



Probability and Learning: Naïve Bayes, EM Algorithm, k-Nearest Neighbor

Ch. 7
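The k-nearest-neighbor method from Ch. 7 needs no training at all: classify a point by a majority vote among its k closest training points. A tiny sketch with made-up data, not the textbook's implementation:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    dists = np.linalg.norm(X_train - x, axis=1)    # Euclidean distances
    nearest = y_train[np.argsort(dists)[:k]]       # labels of k closest
    return Counter(nearest.tolist()).most_common(1)[0][0]  # majority vote

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
y_train = np.array([0, 0, 0, 1, 1, 1])
label = knn_predict(X_train, y_train, np.array([0.3, 0.3]))
```

The choice of k trades off noise sensitivity (small k) against over-smoothing (large k), a bias-variance trade-off that recurs throughout the course.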



Convolutional Neural Networks

Ch. 9 from Deep Learning



Symmetric Weights and Deep Belief Networks: Hopfield NN & Boltzmann Machine

Ch. 17

Meyder & Kiderlen paper

HW 5 due


Memorial Day

No classes

Memorial Day


Symmetric Weights and Deep Belief Networks: Deep Learning

Ch. 17



Deep Learning Implementations


Presentations of Final Projects (noon – 2:00)



Laboratory Schedule






Use WinPython for Practice Questions (not graded)



Run Weka examples

(not graded)


Character Recognition

HW 2



HW 3



HW 4


GA Optimization

HW 4


GA for NN and Games

HW 5


Final Project

HW 5


Final Project


Final Project


Honor Code: All work turned in for credit, including exams and all components of the project, is to be the work of the student whose name is on the exam or project. For all project components, the student may receive assistance from individuals other than the instructor only to ascertain the cause of errors. Thus you can get help if you need it to figure out why something doesn't work; you just can't get help from anyone other than the instructor or TA to figure out how to make something work. All solutions turned in for credit are to be your individual work and should demonstrate your problem-solving skills, not someone else's. Help each other understand and debug the programming assignments, but write the code for your programs yourself; writing it yourself is the only way you will learn. Since everyone is writing their own code, no two programs should be the same or so similar that I could convert one to the other by a simple mechanical transformation (e.g., changing variable names and comments). I consider this plagiarism and a violation of the academic code. The following text should appear on all assignments: “I pledge that I have neither given nor received help from anyone other than the instructor for all program components included here.”

First violation: Students must meet with the instructor. In most cases, the grade will be split between the authors of the copied programs. Second violation: Students will receive no credit for the assignment, an incident letter will be placed on file in the Computer Science Department, and the matter referred to the Computer Science Department Chair.

Class Attendance: Class attendance is expected and recorded.

ADA Statement: Students with disabilities who wish to set up academic adjustments in this class should give me a copy of their "Confirmation of Eligibility for Academic Adjustment" from the Disability Support Services Office as soon as possible so we can discuss how the approved adjustments will be implemented in this class. Students without this form should contact the Disability Support Services Office, Bouillon 205, or 963-2171.

Caveat: The schedule and procedures for this course are subject to change. It is the student's responsibility to learn of and adjust to changes.