Welcome to CS909/CS429 Data Mining in Term-2 of 2024! This webpage is the primary source of information for all updates, announcements and content for this module.
Announcements
We will be adding any major announcements for the module below.
Module Teaching Team
Instructor: Fayyaz Minhas
Module Coordinator: Saif Anwar
Teaching Assistants
Communication between students and teaching team
Please use Moodle for non-urgent communication. It is very difficult to monitor and respond to emails from individual students due to the large size of the class.
For questions about logistics or other issues, the first point of contact is Saif Anwar.
Moodle Link (For QA & Discussions)
We shall be using this Moodle page for our module.
Students can post their questions to the Moodle forum.
The instructors will also post a series of questions each week under Self Assessment Questions that you can answer for self-assessment and feedback. As the goal of these questions is to encourage students to explore and self-study, answers will not be provided for all of them, but students are welcome to discuss them with the instructors in lab sessions, as questions in lectures, or via Moodle.
Timetable
Lectures - Weeks 1–10
Monday: 11am, R0.21
Thursday: 12pm, Lecture Theatre 3
Friday: 1pm, OC1.05
| Time | Location |
| --- | --- |
| Monday 11:00 - 12:00 | R0.21 |
| Thursday 12:00 - 13:00 | L3 |
| Friday 13:00 - 14:00 | OC1.05 |
Lab Sessions - Weeks 1–10
Each student has been allocated a lab session. Please check your scheduled lab session on Tabula and attend it. Please make sure that you attend the whole of your assigned lab session so that your attendance is recorded.
TA allocation to labs may change from week to week due to scheduling constraints. However, there should be at least one TA common between sessions.
NOTE: The following information may change if there are changes to face-to-face teaching.
Students who need to change group (due to mitigating circumstances or a timetable clash) should email queries to the relevant resource email account (DCS.UG.Support@warwick.ac.uk / DCS.PGT.Support@warwick.ac.uk). The teaching staff will only be able to signpost to the support teams.
| Lab | Time | Location | TA Allocation (Week 1) | Teams Link |
| --- | --- | --- | --- | --- |
| Lab 1 | Thursday 10:00 - 11:00 | MB3.17 | Manahil & Gozde | N/A |
| Lab 2 | Thursday 10:00 - 11:00 | CS0.01 | Da & Piotr | N/A |
| Lab 3 | Thursday 11:00 - 12:00 | MB3.17 | Manahil & George | N/A |
| Lab 4 | Thursday 11:00 - 12:00 | CS0.01 | Da & Piotr | N/A |
| Lab 5 | Friday 10:00 - 11:00 | MB3.17 | Saif & Gozde | N/A |
| Lab 6 | Friday 10:00 - 11:00 | CS0.01 | George & Jack | N/A |
Coursework Assignments
Books and Other resources
[PML-1] Probabilistic Machine Learning: An Introduction by Kevin Patrick Murphy. MIT Press, 2021. link: https://probml.github.io/pml-book/book1.html
[PML-2] Probabilistic Machine Learning: Advanced Topics by Kevin Patrick Murphy. MIT Press, 2023. link: https://probml.github.io/pml-book/book2.html
[IML] Introduction to Machine Learning 3e by Ethem Alpaydin (selected chapters: ch. 1,2,6,7,9,10,11,12,13)
[DBB] Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville (Ch. 1-5 if needed as basics; Ch. 6, 7, 8, 9). link: https://www.deeplearningbook.org/
[FNN] Fundamentals of Neural Networks : Architectures, Algorithms And Applications by Laurene Fausett, (ch. 2,6)
[SODL] The Science of Deep Learning by Iddo Drori link: https://www.thescienceofdeeplearning.org/
Casual Reading:
- The Master Algorithm
- The Alignment Problem
- The Book of Why
Course Materials
Slides and reading materials will be posted each week. Lab materials will be available prior to the start of each lab session. Please see the Lab Access section below for guidance on running the lab materials, and the Learning Python section below for Python guides.
NOTE:
- It is strongly recommended that you attend all lectures in person, as this year's lectures will be significantly different in content and delivery from previous years' and we cannot guarantee that the archived content is sufficient for effective learning in terms of success in coursework or examination. However, if you are unable to attend lectures due to a genuine issue, you can use the links to archived lecture recordings from previous years, which are available on this Course Stream Channel as well as on YouTube (https://bit.ly/2RannLB).
The schedule below lists the lectures, reading/resources and labs for each week.
Week 1 (Jan 8)
Lectures:
- Introduction
- Why Data Science?
- Applications
- Research Applications
- Framework
- Survey
- Project Suggestions
- Classification and Linear Discriminants
- Determining Linear Separability
- Prelim: Gradients and Gradient Descent
- Prelim: Gradient Descent Code
- Prelim: Convexity
- Perceptron Modeling
- Perceptron Code
Reading/Resources:
- Introduction Slides
- Applications and Framework Slides
- k-Nearest Neighbor Algorithm [Required]
- [PML-1] Chapter 1, [IML] Chapter 1
- CRISPR Talk (optional)
- Whole Slide Images are Graphs Talk (optional)
- A few useful things to know about machine learning (recommended)
- Linear Discriminants (notes)
- Preliminaries (notes)
- Building Linear Models (notes)
- Gradient Descent Code (py)
- Perceptron Code (py)
- Perceptron Algorithm
- Perceptron Classification Videos
Labs:
- Learning Python
- Implementing kNN classifier
- Gradient Descent and Perceptron (a minimal perceptron sketch follows below)
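For quick orientation on the perceptron material above, here is a minimal, illustrative sketch of the perceptron update rule on a made-up toy dataset; the data, learning rate and epoch count are assumptions for illustration, not the lab's code:

```python
# Illustrative only: a perceptron trained on a toy linearly separable dataset.
import numpy as np

X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, 1.0]])
y = np.array([1, 1, -1, -1])  # labels in {-1, +1}

w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1  # learning rate (an assumed value)

for epoch in range(100):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:  # misclassified point
            w += lr * yi * xi       # perceptron update rule
            b += lr * yi
            errors += 1
    if errors == 0:                 # converged: every point classified correctly
        break

print(w, b)
```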
Week 2 (Jan 15)
Lectures:
- What's special in an SVM?
- SVM Formulation
- A brief history of SVMs
- Coding an SVM and C (see the C-parameter sketch below)
- Margin and Regularization
- Linear Discriminants and Selenite Crystals
- Selenite Crystals bend Space
- Using transformations to fold
- Transformations change distances and dot products
- Kernelized SVMs
Reading/Resources:
- SVM Notes
- SVM Applet
- Regularized Perceptron
- Transformations code
- Fold and Cut Theorem
- Book reading: SVM in [PML-1], SVM in [IML]
- SVM Tutorial
Labs:
- Gradient Descent and Perceptron
- SVM
- Assignment-1 Announced
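As a companion to the "Coding an SVM and C" video, here is a small sketch of how the C hyperparameter trades margin width against training error, using scikit-learn's SVC; the toy data and the particular C values are assumptions for illustration:

```python
# Effect of C on a linear SVM: smaller C = more regularization, wider margin.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    # Smaller C typically keeps more support vectors (stronger regularization).
    print(f"C={C}: {clf.n_support_.sum()} support vectors, "
          f"train accuracy={clf.score(X, y):.2f}")
```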
Week 3 (Jan 22)
Lectures:
- Revision of SVMs
- Scientific Method
- Why measure performance?
- Accuracy and its assumptions
- Confusion matrix and associated metrics
- ROC Curves
- PR Curves
- PR-ROC Relationship and coding
- Estimating Generalization (with CV, bootstrap estimation and statistical significance)
- CV in sklearn (see the cross-validation sketch below)
- Using MLXTend
- Twelve ways to fool the masses
Reading/Resources:
- SVM Notes
- Chapter 19, "Design and Analysis of Machine Learning Experiments", in Alpaydin, Ethem. 2010. Introduction to Machine Learning. Cambridge, Mass.: MIT Press.
- Performance Assessment Lecture Notes
- Performance Assessment Exercise
- Ten ways to fool the masses with machine learning
- MLXTEND
Optional Reading:
- Wainberg, Michael, Babak Alipanahi, and Brendan J. Frey. "Are Random Forests Truly the Best Classifiers?" Journal of Machine Learning Research 17, no. 110 (2016): 1-5.
- Munir, Farzeen, Sadaf Gull, Amina Asif, and Fayyaz Ul Amir Afsar Minhas. "MILAMP: Multiple Instance Prediction of Amyloid Proteins." IEEE/ACM Transactions on Computational Biology and Bioinformatics, August 22, 2019. https://doi.org/10.1109/TCBB.2019.2936846.
Labs:
- SVM
- Performance Assessment Exercise
- Work on Assignment-1
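In the spirit of the "CV in sklearn" lecture, here is a minimal cross-validation sketch; the dataset, model choice and scoring metric are illustrative assumptions:

```python
# 5-fold cross-validation with scaling done inside the pipeline, so each
# fold is scaled using only its own training split (no information leakage).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"ROC-AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```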
Week 4 (Jan 29)
Lectures:
- Performance Assessment Continued
- PCA Prelim: Old MacDonald meets Lagrange
- PCA Prelim: Meet stubborn vectors
- PCA Prelim: Covariance and its friend Correlation
- PCA (see the PCA sketch below)
- SRM view of PCA
Reading/Resources:
- Chapter 19, "Design and Analysis of Machine Learning Experiments", in Alpaydin, Ethem. 2010. Introduction to Machine Learning. Cambridge, Mass.: MIT Press.
- Performance Assessment Lecture Notes
- Performance Assessment Exercise
- Ten ways to fool the masses with machine learning
- MLXTEND
- Optional Reading: as in Week 3 (Wainberg et al. 2016; Munir et al. 2019)
- PCA: Lecture Notes
- Eigen Values and Vectors
- PCA Tutorial
Labs:
- Performance Assessment Exercise
- Eigen Values and Vectors
- PCA Tutorial
- Work on Assignment-1
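Tying together the eigenvector ("stubborn vectors") and covariance preliminaries above, here is a short sketch showing that PCA directions are the eigenvectors of the covariance matrix; the toy data is an assumption for illustration:

```python
# PCA directions = eigenvectors of the covariance matrix (up to sign).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [1.0, 0.5]])

# Eigen-decomposition of the covariance matrix of the centred data ...
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))

# ... gives the same directions (up to sign) as sklearn's PCA.
pca = PCA(n_components=2).fit(X)
print(eigvecs[:, ::-1].T)   # eigenvectors, largest eigenvalue first
print(pca.components_)      # principal components from sklearn
```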
Week 5 (Feb 5)
Lectures 1 and 2:
- PCA Continued
- Other dimensionality reduction methods: MDS, t-SNE, ICA, LDA, Factor Analysis, Topic Modeling with NMF, CCA (no video recording)
Lecture 3:
- Interim Module Evaluation
- OLSR
- OLSR to SVR (see the regression sketch below)
- Application: Hurricane Intensity Regression
Reading/Resources:
- PCA: Lecture Notes
- Lecture Notes (Regression)
Labs:
- PCA Tutorial
- Work on Assignment-1
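As a pointer for the "OLSR to SVR" material, here is a minimal sketch contrasting ordinary least squares regression with support vector regression; the synthetic data and the epsilon and C values are illustrative assumptions:

```python
# OLS minimizes squared error; SVR uses an epsilon-insensitive loss.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0 + rng.normal(0, 1.0, 100)  # noisy line y = 2x + 1

ols = LinearRegression().fit(X, y)
svr = SVR(kernel="linear", C=1.0, epsilon=0.5).fit(X, y)
print("OLS slope:", ols.coef_[0], "SVR slope:", svr.coef_[0][0])
```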
Week 6 (Feb 12)
Lectures 1 and 2:
- How to go beyond classification, regression and dimensionality reduction
- Applied SRM in Barebones PyTorch
- One-class classifiers (Anomaly Detection)
- Ranking
- Recommender Systems
- Still more problems
- Clustering
- Clustering in sklearn (see the clustering sketch below)
- Ensemble Methods and XGBoost
Reading/Resources:
- Lecture Notes (Regression)
- Lecture Notes (SRM)
- Finding Anti-CRISPR proteins with ranking (optional)
- Using reinforcement learning to help a mouse escape a cat (optional)
Labs:
Regression:
- Regression in sklearn
- A deeper look into regression (you can skip the "Advanced" topics)
- Using regression for causal discovery (advanced)
Understanding REO and SRM:
- Applied SRM in Barebones PyTorch
- Barebones Linear Models
Other ML problems:
- Clustering
- Trees and XGBoost
Multilayer Perceptron:
- Keras Barebones
- NN module in PyTorch
- MNIST MLP in PyTorch
- Solve the XOR using a single hidden layer BPNN with sigmoid activations
Assignment-2 Announced
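To accompany "Clustering in sklearn", here is a short sketch of clustering and cluster evaluation; the blob data and the choice of k=3 are assumptions for illustration:

```python
# k-means clustering on toy blobs, scored with the silhouette coefficient.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
# Silhouette score measures how well separated the discovered clusters are.
print("silhouette:", silhouette_score(X, km.labels_))
```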
Week 7 (Feb 19)
Lectures:
- Let me pick your brain
- Single Neuron Model
- Multilayer Perceptron
- Let's play with a neural network
- Deriving the Backpropagation algorithm for MLPs
- MLP in Keras
- MLP in PyTorch using the NN module
- MLP in PyTorch for MNIST with Dataloaders
- Under-the-hood view of deep learning libraries
- Improving learning of MLPs
Reading/Resources:
- Lecture Notes
Labs:
- Barebones Linear Models
- Keras Barebones
- NN module in PyTorch
- MNIST MLP in PyTorch
- Solve the XOR using a single hidden layer BPNN with sigmoid activations (see the XOR sketch below)
- Universal Approximation Code
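For the XOR lab exercise above, here is a minimal sketch of a single-hidden-layer network with sigmoid activations trained by backpropagation; the layer width, learning rate and epoch count are illustrative choices, not a prescribed solution:

```python
# XOR with a one-hidden-layer sigmoid network trained by backpropagation.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

model = nn.Sequential(nn.Linear(2, 4), nn.Sigmoid(),
                      nn.Linear(4, 1), nn.Sigmoid())
opt = torch.optim.SGD(model.parameters(), lr=1.0)
loss_fn = nn.BCELoss()

for _ in range(5000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()   # backpropagation computes the gradients
    opt.step()

print(model(X).detach().round().squeeze())  # expected: 0, 1, 1, 0 on success
```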
Week 8 (Feb 26)
Lectures:
- Finding Waldo is difficult with a fully connected MLP
- Convolution
- Learning filters
- CNNs (see the CNN sketch below)
- CNN training in PyTorch
- Why CNNs
- CNN Hyperparameters and Regularization
Reading/Resources:
- Lecture Notes
Labs:
- World's simplest CNN
- 0 to AI in 10 lines of code (by J. Pocock!)
- Digit Classification with CNNs in Keras
- Digit Classification with CNNs in PyTorch
- Residual Neural Networks
- Transfer Learning in PyTorch
- Work on Assignment-2
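As a companion to the CNN lectures, here is a minimal sketch of a small convolutional network of the kind used in the digit-classification labs; the architecture and shapes are illustrative assumptions, not the lab solution:

```python
# A tiny CNN for MNIST-sized inputs, tracing the tensor shapes layer by layer.
import torch
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1x28x28 -> 8x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                             # -> 8x14x14
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # -> 16x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                             # -> 16x7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                   # logits for 10 digit classes
)

x = torch.randn(4, 1, 28, 28)  # a batch of 4 MNIST-sized images
print(cnn(x).shape)            # torch.Size([4, 10])
```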
Week 9
Lectures:
- Transfer Learning and Applications
- Residual Networks (and other types) (see the residual-block sketch below)
- Modern Architectures and Training Strategies
- Transformers
- GNNs
Reading/Resources:
- Lecture Notes
- Deep PHURIE for hurricane intensity prediction
Labs:
- Simple Batch Normalization
- Transfer Learning in PyTorch
- Minimalist Residual Net Example
- Minimalist Transformer Implementation
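To illustrate the skip-connection idea behind the "Residual Networks" lecture, here is a minimalist residual block sketch; the channel count and input shape are assumptions for illustration:

```python
# A minimal residual block: output = ReLU(conv(conv(x)) + x).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)  # skip connection: add the input back

block = ResidualBlock(16)
x = torch.randn(2, 16, 8, 8)
print(block(x).shape)  # same shape in, same shape out
```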
Week 10
Lectures:
- GNNs continued
- Generative Learning (see the autoencoder sketch below)
- Conclusion
Reading/Resources:
- Lecture Notes
Labs:
- Work on Assignment-2
- Deep Autoencoder code
- Barebones GAN in PyTorch
- Diffusion Model Tutorial
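In the spirit of the "Deep Autoencoder code" lab, here is a barebones autoencoder sketch; the layer dimensions and data are illustrative assumptions only:

```python
# A barebones autoencoder: compress to a 2-D latent code, then reconstruct.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 2))
decoder = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 784))

x = torch.rand(16, 784)                  # a batch of flattened 28x28 images
z = encoder(x)                           # 2-D latent code
recon = decoder(z)                       # reconstruction of the input
loss = nn.functional.mse_loss(recon, x)  # reconstruction error
loss.backward()                          # gradients flow through both parts
print(z.shape, recon.shape, loss.item())
```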
Revision Session
10am-12pm on 23 April 2024 in MS.01
Recordings:
- 2024 Revision Session Moodle Post
- 2024 Session Recording: Part 1 and Part 2
- 2023 Revision Session Recording
Reading/Resources:
- Revision Session Notes
- Self Assessment Questions
Labs: N/A
Lab Access and Machine Requirements
Remote Machine Login: https://warwick.ac.uk/fac/sci/dcs/intranet/user_guide/remote-login/
Use "module load cs909-python". The following guide will show you how to run jupyter-notebook as well as loading using notebooks with the module loaded in Visual Studio Code:
Notebook and VSCode Setup GuideLink opens in a new window
If using your own machine, you will need the libraries listed below (a quick import check is sketched after this list).
- Anaconda Python (3.6+)
- Jupyter Notebook or Jupyter Lab
- Matplotlib
- Numpy
- Scipy
- Pandas
- Scikit-learn
- Keras, PyTorch and TensorFlow (with GPU configuration if GPUs available)
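If you want to confirm your environment is set up, here is a quick, illustrative import check for the packages listed above; exact version numbers will vary by installation:

```python
# Check that the required libraries are importable and print their versions.
import importlib

for pkg in ["numpy", "scipy", "pandas", "matplotlib",
            "sklearn", "torch", "tensorflow", "keras"]:
    try:
        mod = importlib.import_module(pkg)
        print(f"{pkg}: {getattr(mod, '__version__', 'unknown')}")
    except ImportError:
        print(f"{pkg}: NOT INSTALLED")
```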
Learning Python
The following resources may be useful when familiarising yourself with Python.
Python Documentation: https://docs.python.org/3/tutorial/index.html
NumPy Website: https://numpy.org
Matplotlib Website: https://matplotlib.org
Video Tutorials
Courtesy of Dr. Greg Watson
- Introduction
- Basic Variables
- Lists
- Control Flow
- Functions
- Tuples
- Sets
- Dictionaries
- NumPy
- Matplotlib
- Classes