Welcome to CS909/CS429 MLAP in Term-2 2026! This webpage is the primary source of information for all updates, announcements and content for this module.
Announcements
We will be adding major announcements for the module below.
- Assignment-2 Announced 20 Feb 2026
- Students who might need to change group (due to mitigating circumstances or a timetable clash) should email queries to the relevant resource email account (DCS.UG.Support@warwick.ac.uk / DCS.PGT.Support@warwick.ac.uk) and NOT to module organisers/TAs.
Module Teaching Team
MLAP R2D2 — AI Learning Tutor
MLAP R2D2 (reflection, reasoning, dialogue, and design) is the AI learning tutor for this module. Lecture materials and supporting resources will be uploaded directly into R2D2, and students can interact with it to clarify concepts, reflect on their understanding, and think through appropriate problem-solving approaches.
IMPORTANT NOTE:
MLAP R2D2 is provided as an optional learning support tool and is being run on a trial basis. It is not a source of authoritative guidance and may occasionally produce incomplete, inaccurate, or misleading responses. Use of R2D2 is entirely optional and undertaken at the student's own discretion and responsibility. Neither the University of Warwick nor the module organisers accept responsibility for decisions, actions, or outcomes arising from its use. For official and definitive information on assessments, deadlines, regulations, and requirements, students must always refer to the module webpage and official announcements. If you encounter issues or unexpected behaviour, or wish to provide feedback on R2D2, please report this via Moodle to the instructors.
Teaching Assistants
Communication between students and teaching team
Please use Moodle for non-urgent communication. Due to the large size of the class, it is very difficult to monitor and respond to emails from individual students.
For questions about logistics or other issues, the first point of contact is George Wright.
Instructor timetable
Moodle & Other Links
We will be using this Moodle page for discussion and Q&A in the module.
Students can post questions to the Moodle forum.
A series of Self-Assessment Questions that you can answer for self-assessment/feedback is also available. As the goal of these questions is to encourage students to explore and self-study, answers will not be provided, but students are welcome to discuss them with the instructors in lab sessions, as questions in lectures, or via Moodle.
Lecture Capture is also accessible via Moodle.
Timetable
Lectures
| Time | Location |
| --- | --- |
| Monday 11:00 - 12:00 (Weeks 1–10), starting 12 Jan 2026 | R0.21 |
| Wednesday 12:00 - 13:00 (Weeks 1–10) | L3 |
| Friday 13:00 - 14:00 (Weeks 1–10) | OC1.05 |
| Revision Lecture, 28 April 2026, 10:00 | R0.21 |
Lab Sessions - Weeks 1–10
Each student has been allocated a lab session. Please check your scheduled lab session on Tabula and attend it. Please attend the whole of your assigned lab session so that your attendance is recorded.
TA allocation to labs may change from week to week due to scheduling constraints; however, there should be at least one TA in common between sessions.
Students who might need to change group (due to mitigating circumstances or a timetable clash) should email queries to the relevant resource email account (DCS.UG.Support@warwick.ac.uk / DCS.PGT.Support@warwick.ac.uk). The teaching staff will only be able to signpost to the support teams.
| Lab | Time | Location | TA Allocation (Week 1) | Teams Link |
| --- | --- | --- | --- | --- |
| Lab 1 | Thursday 10:00 - 11:00 | CS0.06 | George & Boyang | N/A |
| Lab 2 | Thursday 10:00 - 11:00 | CS0.01 | Piotr & Yijie | N/A |
| Lab 3 | Thursday 11:00 - 12:00 | CS0.06 | George & Xiaoqing | N/A |
| Lab 4 | Friday 10:00 - 11:00 | MB3.17 | Yijie & Xiaoqing | N/A |
| Lab 5 | Friday 10:00 - 11:00 | CS0.01 | Boyang & Piotr | N/A |
Coursework Assignments
Books and Other resources
[PML-1] Probabilistic Machine Learning: An Introduction by Kevin Patrick Murphy. MIT Press, 2021. link: https://probml.github.io/pml-book/book1.html
[PML-2] Probabilistic Machine Learning: Advanced Topics by Kevin Patrick Murphy. MIT Press, 2023. link: https://probml.github.io/pml-book/book2.html
[IML] Introduction to Machine Learning 3e by Ethem Alpaydin (selected chapters: ch. 1,2,6,7,9,10,11,12,13)
[DBB] Deep Learning by Ian Goodfellow, Yoshua Bengio, Aaron Courville, (Ch 1-5 if needed as basics), Ch. 6,7,8,9 link: https://www.deeplearningbook.org/
[FNN] Fundamentals of Neural Networks : Architectures, Algorithms And Applications by Laurene Fausett, (ch. 2,6)
[SODL] The Science of Deep Learning by Iddo Drori link: https://www.thescienceofdeeplearning.org/
Reading list
Casual Reading:
- The master algorithm
- The Alignment Problem
- The book of why
Course Materials
Slides and reading materials will be posted each week. Lab materials will be available prior to the start of each lab session. Please see the Lab Access section below for guidance on running the lab materials and for Python guides.
NOTE:
- It is strongly recommended that you attend all lectures in person: this year's lectures differ significantly in content and delivery from previous ones, and we cannot guarantee that archived content is sufficient for effective learning or for success in the coursework or examination. However, if you are unable to attend lectures due to a genuine issue, you can use the links to archived lecture recordings from previous years, which are available at this Course Stream Channel as well as on YouTube (https://bit.ly/2RannLB).
Week 1 (Jan 12)

Lectures:
- Introduction
- Why Data Science? [Self-Study]
- Applications [Self-Study]
- Research Applications [Self-Study]
- Framework
- Survey
- Classification and Linear Discriminants
- Determining Linear Separability
- Prelim: Gradients and Gradient Descent
- Prelim: Gradient Descent Code
- Prelim: Convexity
- Perceptron Modeling
- Perceptron Code
- What's special in an SVM?
- SVM Formulation
- A brief history of SVMs
- Coding an SVM and C
- Margin and Regularization
- Linear Discriminants and Selenite Crystals
- Selenite Crystals bend Space
- Using transformations to fold
- Transformations change distance and dot products
- Kernelized SVMs

Reading/Resources:
- Introduction and Philosophy Slides
- Applications and Framework Slides [Self-Study]
- CRISPR Talk (optional, Self-Study)
- k-Nearest Neighbor Algorithm [Required]
- [PML] Chapter 1, [IML] Chapter 1
- Whole Slide Images are Graphs Talk (Self-Study)
- A few useful things to know about machine learning (recommended self-study)
- Preliminaries (notes)
- Linear Discriminants (notes)
- Building Linear Models (notes)
- Gradient Descent Code (py)
- Perceptron Code (py)
- Perceptron Algorithm
- Perceptron Classification Videos
- Perceptron Notes
- SVM Notes
- Fold and Cut Theorem
- Book Reading [SVM in PML, SVM in IML]
- SVM Tutorial

Labs/Practicals:
- Learning Python
- Understanding lines
- Distance and Dot product
- k-Nearest Neighbor Demo
- Implementing kNN classifier
- Gradient Descent and Perceptron
- Gradient Descent Code (py)
- Perceptron Code (py)
- Linear Discriminant for AND Classification Problem
- Regularized Perceptron
- Transformations code
- Kernelized Predictor
- SVM Applet
- SVM
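The Week-1 materials above build from linear discriminants up to kernelized SVMs. As a rough illustration of why the kernel matters, here is a minimal sketch (using scikit-learn, with an invented concentric-rings dataset) comparing a linear SVM against an RBF-kernel SVM on data that no single line can separate:

```python
# Sketch: linear vs. kernelized SVM on a dataset that is not
# linearly separable (points labelled by distance from the origin).
# The dataset is invented purely for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
# Label 1 if the point lies outside the unit circle: a radial boundary.
y = (np.linalg.norm(X, axis=1) > 1.0).astype(int)

linear_svm = SVC(kernel="linear", C=1.0).fit(X, y)
rbf_svm = SVC(kernel="rbf", C=1.0).fit(X, y)

linear_acc = linear_svm.score(X, y)
rbf_acc = rbf_svm.score(X, y)
print(f"linear: {linear_acc:.2f}, rbf: {rbf_acc:.2f}")
```

The RBF kernel implicitly maps the points into a space where the radial boundary becomes (nearly) linear, which is exactly the "transformations bend space" idea in the lectures; the C parameter trades margin width against training errors.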
Week 2 (Jan 19)

Lectures:
- SVM lectures (from Week 1)
- Tree based classifiers (by Ayse Sunar) on Mon 19 Jan

Reading/Resources:
- Trees, Random Forest and XGBoost Classifiers (by Dr Ayse Sunar)
- SVM Notes
- HistoKernel Paper (optional read)
- HistoKernel Demo
- Quantum Kernel SVM for cancer classification (optional read)

Labs/Practicals:
- From Week 1: Preliminaries, Linear Classifiers, SVMs
- kNN in High dimensions
- Trees and XGBoost
- Performance Assessment Exercise
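As a companion to the tree-based classifier material above, a minimal sketch (scikit-learn; the breast-cancer benchmark is chosen only for illustration) comparing a single decision tree against a random forest under cross-validation:

```python
# Sketch: single decision tree vs. random forest ensemble.
# Averaging many decorrelated trees typically reduces variance.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

tree_scores = cross_val_score(
    DecisionTreeClassifier(random_state=0), X, y, cv=5)
forest_scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0), X, y, cv=5)

print(f"tree:   {tree_scores.mean():.3f}")
print(f"forest: {forest_scores.mean():.3f}")
```

XGBoost follows a different recipe (boosting: trees fitted sequentially to the residual errors of the ensemble so far), but its scikit-learn-style API is used in much the same way.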
Week 3 (Jan 26)

Lectures:
- SVM lectures (from Week 1)
- Scientific Method
- Why measure performance?
- Accuracy and its assumptions
- Confusion matrix and associated metrics

Reading/Resources:
- SVM Tutorial
- HistoKernel Paper (optional read)
- HistoKernel Demo
- Quantum Kernel SVM for cancer classification (optional read)

Labs/Practicals:
- Performance Assessment Exercise
- Work on Assignment-1
Week 4 (Feb 02)

Lectures:
- ROC Curves
- PR Curves
- PR-ROC Relationship and coding
- Estimating Generalization (with CV, bootstrap estimation and statistical significance)
- CV in sklearn
- Using MLXTend
- Twelve ways to fool the masses

Reading/Resources:
- Performance Assessment Lecture Notes
- Performance Assessment Demo
- Performance Assessment Exercise
- Ten ways to fool the masses with machine learning
- MLXTEND
- Chapter 19, "Design and Analysis of Machine Learning Experiments", in Alpaydin, Ethem. Introduction to Machine Learning. Cambridge, MA: MIT Press, 2010.

Optional Reading:
- Wainberg, Michael, Babak Alipanahi, and Brendan J. Frey. "Are Random Forests Truly the Best Classifiers?" Journal of Machine Learning Research 17, no. 110 (2016): 1–5.
- Munir, Farzeen, Sadaf Gull, Amina Asif, and Fayyaz Ul Amir Afsar Minhas. "MILAMP: Multiple Instance Prediction of Amyloid Proteins." IEEE/ACM Transactions on Computational Biology and Bioinformatics, August 22, 2019. https://doi.org/10.1109/TCBB.2019.2936846.

Labs/Practicals:
- Performance Assessment Exercise
- Performance Assessment Demo
- Work on Assignment-1
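The Weeks 3–4 material covers confusion-matrix metrics and ROC analysis. A small worked sketch (scikit-learn, with invented scores) showing how sensitivity and specificity follow from a thresholded confusion matrix, while ROC AUC summarises performance over all thresholds:

```python
# Sketch: confusion-matrix metrics at one threshold vs. ROC AUC.
# The labels and scores are invented for illustration.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.45, 0.6, 0.7, 0.9])

# Threshold the scores at 0.5 to get hard predictions.
y_pred = (scores >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)  # true positive rate (recall)
specificity = tn / (tn + fp)  # true negative rate

# AUC: fraction of (positive, negative) pairs ranked correctly.
auc = roc_auc_score(y_true, scores)
print(sensitivity, specificity, auc)  # 0.75 0.75 0.8125
```

Note that sensitivity and specificity change if the 0.5 threshold moves, whereas the AUC does not: it depends only on how the scores rank positives against negatives.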
Week 5 (Feb 09)

Lectures:
- PCA Prelim: Old MacDonald meets Lagrange
- PCA Prelim: Meet stubborn vectors
- PCA Prelim: Covariance and its friend Correlation
- PCA
- SRM view of PCA

Reading/Resources:
- PCA: Lecture Notes
- (Review lab notebooks as well)

Labs/Practicals:
- Lagrange Multipliers Demo
- Eigen Vectors Demo
- Eigen Values and Vectors Lab
- Information theoretic measures
- Understanding Association
- PCA Demo
- PCA Reconstruction Demo
- PCA Tutorial
- Eigen Faces
- Optional: Local PCA for Noise reduction
- Permutation testing for finding noisy components
- Incremental PCA
- Robust PCA
- Kernelized PCA
- ICA Tutorial
- MDS Tutorial
- Manifold learning methods
- UMAP Explorer
- PCA vs LDA
- Regression in sklearn
- A deeper look into regression (you can skip the "Advanced" topics)
- Using regression for causal discovery (advanced)
- Work on Assignment-1
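The PCA material above frames principal components as the directions of maximum variance (the "stubborn vectors" of the covariance matrix). A minimal sketch (scikit-learn, with an invented correlated 2-D dataset) showing projection onto one component and reconstruction from it:

```python
# Sketch: PCA projection and reconstruction on correlated 2-D data.
# The data-generating matrix below is invented for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Mix independent noise so most variance lies along one direction.
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

pca = PCA(n_components=1).fit(X)
Z = pca.transform(X)              # 1-D scores along the top component
X_rec = pca.inverse_transform(Z)  # best rank-1 reconstruction

explained = pca.explained_variance_ratio_[0]
rec_err = np.mean((X - X_rec) ** 2)
print(f"explained variance ratio: {explained:.3f}, MSE: {rec_err:.3f}")
```

Because the data here are strongly correlated, a single component captures almost all the variance, and the reconstruction error equals the variance along the discarded component, which is exactly the eigenvalue story from the lectures.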
Week 6 (Feb 16)

Lectures:
- Other dimensionality reduction methods: MDS, t-SNE, ICA, LDA, Factor Analysis, Topic Modeling with NMF, CCA (no video recording)

Lecture 3:
- Interim Module Evaluation
- OLSR
- OLSR to SVR
- Application: Hurricane Intensity Regression
- How to go beyond classification, regression and dimensionality reduction
- Applied SRM in Barebones PyTorch
- One class classifiers (Anomaly Detection)
- Ranking
- Recommender Systems
- Still more problems
- Clustering
- Clustering in sklearn
- Ensemble Methods and XGBoost

Reading/Resources:
- Dimensionality Reduction Lecture Notes
- Lecture Notes (Regression)
- Lecture Notes (SRM)
- Finding Anti-CRISPR proteins with ranking (optional)
- Using reinforcement learning to help a mouse escape a cat (optional)
- Detailed Discussion on Causality and Domain Generalization (optional)
- (Review lab notebooks as well)

Labs/Practicals:
- PCA Reconstruction Demo
- Incremental PCA
- Robust PCA
- Kernelized PCA
- ICA Tutorial
- MDS Tutorial
- Manifold learning methods
- UMAP Explorer
- PCA vs LDA

Regression:
- Regression in sklearn
- A deeper look into regression (you can skip the "Advanced" topics)
- Using regression for causal discovery (advanced)
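The regression material above starts from ordinary least squares (OLSR). A minimal sketch (scikit-learn, with invented data drawn from a known line) showing that OLS recovers the underlying slope and intercept:

```python
# Sketch: ordinary least-squares regression recovering a known line.
# The true model y = 2x + 1 (plus small noise) is invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.01 * rng.normal(size=100)

model = LinearRegression().fit(X, y)
print(f"slope: {model.coef_[0]:.3f}, intercept: {model.intercept_:.3f}")
```

The same fit/predict interface carries over to the other regressors mentioned in the lectures (e.g. support vector regression via sklearn's SVR), which differ only in the loss being minimised.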
Week 7 (Feb 23)

Lectures:
- Other machines (continued from Week 6)
- NOTE: This year's lectures differ from the previous recordings below; students are recommended to use Lecture Capture, accessible via Moodle. Previous recordings are available for additional support, especially for resit students.
- Let me pick your brain
- Single Neuron Model
- Multilayer Perceptron
- Let's play with a neural network
- Deriving Backpropagation algorithm for MLPs
- MLP in Keras
- MLP in PyTorch using NN module
- MLP in PyTorch for MNIST with Dataloaders
- Under the hood view of deep learning libraries
- Improving learning of MLPs

Reading/Resources:
- Lecture Notes (SRM)
- Deep Learning Notes

Labs/Practicals:

Understanding REO and SRM:
- Applied SRM in Barebones PyTorch
- Barebones Linear Models

Other ML problems:
- Clustering
- Trees and XGBoost

Multilayer Perceptron:
- Keras Barebones
- NN module in PyTorch
- MNIST MLP in PyTorch
- Solve the XOR using a single hidden layer BPNN with sigmoid activations
- Work on Assignment-2
- Universal Approximation Code
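For the XOR lab exercise above, here is one possible sketch (using scikit-learn's MLPClassifier rather than a hand-derived backpropagation loop, purely for brevity) of a single-hidden-layer network with sigmoid (logistic) activations:

```python
# Sketch: single-hidden-layer MLP with sigmoid activations on XOR.
# XOR is not linearly separable, so a hidden layer is essential;
# training is sensitive to initialisation, hence the restart loop.
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR: 1 iff the inputs differ

best = None
for seed in range(10):  # restart from several random initialisations
    mlp = MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                        solver="lbfgs", max_iter=1000, random_state=seed)
    mlp.fit(X, y)
    if best is None or mlp.score(X, y) > best.score(X, y):
        best = mlp

print(best.predict(X))  # ideally recovers [0, 1, 1, 0]
```

The lab asks you to derive and implement the backpropagation updates yourself; this snippet is only a reference point for checking that your own single-hidden-layer implementation can reach the same solution.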
Lab Access and Machine Requirements
Remote Machine Login: https://warwick.ac.uk/fac/sci/dcs/intranet/user_guide/remote-login/
Use "module load cs909-python". The following guide shows how to run Jupyter Notebook, and how to use notebooks in Visual Studio Code with the module loaded:
Notebook and VSCode Setup Guide
If using your own machine, you will need the libraries listed below.
- Anaconda Python (3.6+)
- Jupyter Notebook or Jupyter Lab
- Matplotlib
- Numpy
- Scipy
- Pandas
- Scikit-learn
- Keras, PyTorch and TensorFlow (with GPU configuration if GPUs available)
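If you are setting up your own machine, a quick sanity-check sketch (the package list simply mirrors the requirements above; nothing here is an official setup script) for confirming that the core libraries are importable:

```python
# Sketch: check which of the module's core Python packages are importable.
import importlib.util

core = ["matplotlib", "numpy", "scipy", "pandas", "sklearn"]
optional = ["keras", "torch", "tensorflow"]  # deep-learning stack

missing = [p for p in core if importlib.util.find_spec(p) is None]
missing_optional = [p for p in optional if importlib.util.find_spec(p) is None]

print("missing core packages:", missing or "none")
print("missing optional packages:", missing_optional or "none")
```

Anything reported as missing can be installed with conda or pip before the first lab; the deep-learning packages are only needed from the neural-network weeks onwards.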
Learning Python
The following resources may be useful when familiarising yourself with Python.
Python Documentation: https://docs.python.org/3/tutorial/index.html
NumPy Website: https://numpy.org
Matplotlib Website: https://matplotlib.org
Video Tutorials
Courtesy of Dr. Greg Watson
- Introduction
- Basic Variables
- Lists
- Control Flow
- Functions
- Tuples
- Sets
- Dictionaries
- NumPy
- Matplotlib
- Classes