Welcome to CS909/CS429 Data Mining in Term-2 of 2023! This webpage is the primary source of information for all updates, announcements and content for this module.
Announcements
Major announcements for the module will be added below.
Module Teaching Team
Instructor: Fayyaz Minhas
Teaching Fellow: Anna Guimarães
Teaching Assistants
Name | Email
Srijay Deshpande | Srijay.Deshpande@warwick.ac.uk
Rob Jewsbury | Rob.Jewsbury@warwick.ac.uk
Saif Anwar | Saif.Anwar@warwick.ac.uk
Dang Vu | Quoc-Dang.Vu@warwick.ac.uk
Muhammad Dawood | Muhammad.Dawood@warwick.ac.uk
George Wright | George.Wright.1@warwick.ac.uk
Communication between students and teaching team
Please use Moodle for non-urgent communication; due to the large size of the class, it is very difficult to monitor and respond to emails from individual students.
For questions about Logistics or other issues, the first point of contact is Anna Guimarães.
Moodle Link (For QA & Discussions)
We shall be using this Moodle page for our module.
Each week, the instructors will create a post on Moodle where students can ask questions about that week's lectures.
The instructors will also post a series of questions on Moodle each week that you can answer for self-assessment and feedback. As the goal of these questions is to encourage exploration and self-study, answers will not be provided, but students are welcome to discuss them with the instructors in lab sessions or as questions in lectures.
Timetable
Lectures - Weeks 1–10
Time | Location
Monday 11:00 - 12:00 | R0.21
Thursday 12:00 - 13:00 | L3
Friday 13:00 - 14:00 | MS.01
Lab Sessions - Weeks 1–10
Each student has been allocated a lab session. Please check your scheduled lab session on Tabula and attend that one. Please attend the whole of your assigned lab session so that your attendance is recorded.
TA allocation to labs may change from week to week due to scheduling constraints. However, at least one TA should be common between sessions.
NOTE: The following information may change if there are changes to face-to-face teaching.
Students who need to change group (due to mitigating circumstances or a timetable clash) should email queries to the relevant resource email account (DCS.UG.Support@warwick.ac.uk / DCS.PGT.Support@warwick.ac.uk). The teaching staff will only be able to sign-post to support teams.
Lab | Time | Location | TA Allocation (Week 1) | Teams Link
Lab 1 | Thursday 10:00 - 11:00 | MB3.17 | SD & DV | N/A
Lab 2 | Thursday 10:00 - 11:00 | CS0.01 | RJ & MD | N/A
Lab 3 | Thursday 11:00 - 12:00 | CS0.01 | RJ & MD | N/A
Lab 4 | Thursday 11:00 - 12:00 | MB3.17 | SD & DV | N/A
Lab 5 | Friday 09:00 - 10:00 | MB3.17 | GW & SA | N/A
Lab 6 | Friday 10:00 - 11:00 | MB3.17 | GW & SA | N/A
Coursework Assignments
Books and Other Resources
[PML] Probabilistic Machine Learning: An Introduction by Kevin Patrick Murphy. MIT Press, 2021. Link: http://mlbayes.ai/
[IML] Introduction to Machine Learning (3rd ed.) by Ethem Alpaydin (selected chapters: 1, 2, 6, 7, 9, 10, 11, 12, 13)
[DBB] Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville (Ch. 1-5 if needed as basics; Ch. 6, 7, 8, 9). Link: https://www.deeplearningbook.org/
[FNN] Fundamentals of Neural Networks: Architectures, Algorithms and Applications by Laurene Fausett (Ch. 2, 6)
[PRML] Pattern Recognition and Machine Learning by Chris Bishop. Link: https://www.microsoft.com/en-us/research/people/cmbishop/prml-book/
Casual Reading:
- The Master Algorithm
- The Alignment Problem
- The Book of Why
Course Materials
Slides and reading materials will be posted each week. Lab materials will be available prior to the start of each lab session.
NOTE:
- It is strongly recommended that you attend all lectures in person. However, if you are unable to attend lectures due to a genuine issue, you can use the links to archived lecture recordings from previous years, which are available at this Course Stream Channel as well as on YouTube (https://bit.ly/2RannLB). However, this year's lectures will differ significantly in content and delivery from previous ones, and we cannot guarantee the sufficiency of archived content for effective learning in terms of success in coursework or examination.
The materials below are organised by week; each week lists the Lectures, the Reading/Resources, and the Labs.
Week 1 (Jan 9)

Lectures:
- Introduction (stream, YT)
- Why Data Science? (stream)
- Applications (stream)
- Research Applications (stream)
- Framework
- Survey
- Project Suggestions
- Classification and Linear Discriminants
- Determining Linear Separability
- Prelim: Gradients and Gradient Descent
- Prelim: Gradient Descent Code
- Prelim: Convexity
- Perceptron Modeling
- Perceptron Code

Reading/Resources:
- Introduction Slides
- Applications and Framework Slides
- k-Nearest Neighbor Algorithm [Required]
- [PML] Chapter 1, [IML] Chapter 1
- CRISPR Talk
- Whole Slide Images are Graphs Talk
- A Few Useful Things to Know About Machine Learning
- Linear Discriminants (notes)
- Preliminaries (notes)
- Building Linear Models (notes)
- Gradient Descent Code (py)
- Perceptron Code (py)
- Perceptron Algorithm
- Self-Assessment Exercise Questions
- Post your Questions
- Perceptron Classification Videos

Labs:
- Learning Python
- Implementing a kNN classifier
- Gradient Descent and Perceptron (a minimal perceptron sketch follows below)
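For orientation before the lab, here is a minimal sketch of the perceptron learning rule with per-sample updates in NumPy. The toy dataset and hyperparameters are made up for illustration; this is not the lab's official code.

```python
# Minimal perceptron sketch (illustrative only, not the lab solution).
import numpy as np

# Toy linearly separable data with labels in {-1, +1}.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

w = np.zeros(X.shape[1])  # weight vector
b = 0.0                   # bias
lr = 0.1                  # learning rate

for epoch in range(100):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += lr * yi * xi        # perceptron update rule
            b += lr * yi
            mistakes += 1
    if mistakes == 0:                # converged: every point correctly classified
        break

print("weights:", w, "bias:", b)
```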
Week 2 (Jan 16)

Lectures:
- What's special in an SVM?
- SVM Formulation
- A brief history of SVMs
- Coding an SVM and C
- Margin and Regularization
- Linear Discriminants and Selenite Crystals
- Selenite Crystals bend Space
- Using transformations to fold
- Transformations change distances and dot products
- Kernelized SVMs

Reading/Resources:
- Self-Assessment Questions
- Post your Questions
- Perceptron Classification Videos
- Your Favorite ML/AI Application
- SVM Notes
- SVM Applet
- Regularized Perceptron
- Transformations code
- Fold and Cut Theorem
- Book Reading [SVM in PML, SVM in IML]
- SVM Tutorial
- Why Intuition Fails in Higher Dimensions, The Surprising Behavior of Distance Metrics in High-Dimensional Spaces, and code (optional)

Labs:
- Gradient Descent and Perceptron
- SVM (a soft-margin SVM sketch follows below)
- Assignment-1 Announced
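To connect the C parameter and kernels to code, here is a hedged sketch of fitting a soft-margin SVM with scikit-learn. The synthetic blobs stand in for the lab's data, and the C and gamma values are illustrative.

```python
# Hedged sketch: soft-margin SVM with an RBF kernel in scikit-learn.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class data (a stand-in for the lab dataset).
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C trades margin width against training errors; the RBF kernel implicitly
# transforms ("folds") the input space so a linear boundary there is nonlinear here.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```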
Week 3 (Jan 23)

Lectures:
- SVM Continued
- Scientific Method
- Why measure performance?
- Accuracy and its assumptions
- Confusion matrix and associated metrics
- ROC Curves
- PR Curves
- PR-ROC Relationship and coding
- Estimating Generalization (with CV, bootstrap estimation and statistical significance)
- CV in sklearn
- Using MLXTEND
- Twelve ways to fool the masses

Reading/Resources:
- Chapter 19, "Design and Analysis of Machine Learning Experiments", in Alpaydin, Ethem. 2010. Introduction to Machine Learning. Cambridge, Mass.: MIT Press.
- Lecture Notes
- Performance Assessment Exercise
- Ten ways to fool the masses with machine learning
- MLXTEND
- Self-Assessment Questions
- Post Your Questions
- Optional Reading: Wainberg, Michael, Babak Alipanahi, and Brendan J. Frey. "Are Random Forests Truly the Best Classifiers?" Journal of Machine Learning Research 17, no. 110 (2016): 1-5.

Labs:
- SVM
- Performance Assessment Exercise (a cross-validation sketch follows below)
- Work on Assignment-1
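As a pointer for the performance-assessment exercise, here is a hedged sketch of estimating generalization with stratified k-fold cross-validation in scikit-learn, scored with both accuracy and ROC-AUC. The dataset and model are placeholders, not the exercise's own.

```python
# Hedged sketch: k-fold cross-validation with two scoring metrics.
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

# Synthetic binary classification data (placeholder for the exercise dataset).
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

# roc_auc is computed from the classifier's decision_function scores,
# whereas accuracy uses hard predictions.
for metric in ("accuracy", "roc_auc"):
    scores = cross_val_score(SVC(C=1.0), X, y, cv=cv, scoring=metric)
    print(f"{metric}: mean={scores.mean():.3f}, std={scores.std():.3f}")
```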
Week 4 (Jan 30)

Lectures:
- Prelim: Old MacDonald meets Lagrange
- Prelim: Meet stubborn vectors
- Prelim: Covariance and its friend Correlation
- PCA
- Other dimensionality reduction methods
- SRM view of PCA

Reading/Resources:
- Lecture Notes
- Eigenvalues and Eigenvectors
- PCA Tutorial
- Self-Assessment Questions
- Post Your Questions
- Mid-term Feedback

Labs:
- Performance Assessment Exercise
- Eigenvalues and Eigenvectors
- PCA Tutorial (an eigendecomposition-based PCA sketch follows below)
- Work on Assignment-1
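Since this week links PCA to eigenvalues and eigenvectors, here is a hedged sketch of PCA via eigendecomposition of the covariance matrix in NumPy. The random data and the choice of two components are illustrative only.

```python
# Hedged sketch: PCA from the eigendecomposition of the covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
# Random data stretched along different axes (a stand-in for real data).
X = rng.normal(size=(100, 3)) @ np.diag([2.0, 1.0, 0.1])

Xc = X - X.mean(axis=0)              # centre the data
C = np.cov(Xc, rowvar=False)         # covariance matrix
vals, vecs = np.linalg.eigh(C)       # eigh: for symmetric matrices, ascending order
order = np.argsort(vals)[::-1]       # re-sort by decreasing explained variance
vals, vecs = vals[order], vecs[:, order]

Z = Xc @ vecs[:, :2]                 # project onto the top-2 principal components
print("explained variance ratio:", vals[:2] / vals.sum())
```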
Week 5 (Feb 6)

Lectures:
- Review of PCA (in class)
- OLSR
- OLSR to SVR
- Application: Hurricane Intensity Regression
- How to go beyond classification, regression and dimensionality reduction
- Applied SRM in Barebones PyTorch
- One-class classifiers (Anomaly Detection)
- Ranking
- Recommender Systems
- Still more problems
- Clustering
- Clustering in sklearn
- Ensemble Methods and XGBoost
- Model Interpretability: SHAP and LIME

Reading/Resources:
- SRM View of PCA
- Lecture Notes (Regression)
- Lecture Notes (SRM)
- Self-Assessment Questions
- Post Your Questions
- Lecture Notes
- Finding Anti-CRISPR proteins with ranking (optional)
- Using reinforcement learning to help a mouse escape a cat (optional)

Labs:
- SRM View of PCA
- PCA Tutorial
- Clustering (a k-means sketch follows below)
- Trees and XGBoost
- Applied SRM in Barebones PyTorch
- Work on Assignment-1
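Since the lab covers clustering in sklearn, here is a hedged sketch of k-means on synthetic blobs; the number of clusters and the data are illustrative, not the lab's.

```python
# Hedged sketch: k-means clustering in scikit-learn.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic data with three groups (a stand-in for the lab dataset).
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# n_clusters must be chosen by the user; the silhouette score is one way
# to compare different choices.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("inertia:", km.inertia_)
print("silhouette:", silhouette_score(X, km.labels_))
```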
Week 6 (Feb 13)

Lectures:
- Let me pick your brain
- Single Neuron Model
- Revisit Applied SRM in Barebones PyTorch
- Multilayer Perceptron
- Let's play with a neural network
- Deriving the Backpropagation algorithm for MLPs
- MLP in Keras
- MLP in PyTorch using the NN module
- MLP in PyTorch for MNIST with Dataloaders
- Improving learning of MLPs

Reading/Resources:
- Lecture Notes (All deep learning notes)
- Self-Assessment Questions
- Questions for Week-6

Labs:
- Clustering
- Trees and XGBoost (make sure to use Python 3.9 for this)
- Applied SRM in Barebones PyTorch
- SHAP Analysis
- Barebones Linear Models
- Keras Barebones
- NN module in PyTorch
- MNIST MLP in PyTorch
- Solve the XOR problem using a single-hidden-layer BPNN with sigmoid activations (a sketch follows below)
- Assignment-2 Announced
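As a starting point for the XOR exercise, here is a hedged sketch of a single-hidden-layer MLP with sigmoid activations in PyTorch. The hidden width, optimizer, and step count are choices of this sketch, not the official lab answer.

```python
# Hedged sketch: single-hidden-layer MLP with sigmoid activations solving XOR.
import torch
import torch.nn as nn

torch.manual_seed(0)  # for reproducibility; re-run with another seed if training stalls

X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])  # XOR truth table

# Two hidden units suffice in principle; four makes training more reliable.
model = nn.Sequential(
    nn.Linear(2, 4),
    nn.Sigmoid(),
    nn.Linear(4, 1),
    nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.BCELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(model(X).detach().round().squeeze())  # expected: tensor([0., 1., 1., 0.])
```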
Week 7 (Feb 20)

Lectures:
- We can approximate the universe
- By going deep
- Finding Waldo is difficult with a fully connected MLP
- Convolution
- Learning filters
- CNNs
- CNN training in PyTorch
- Why CNNs
- CNN Hyperparameters and Regularization
- Transfer Learning and Application

Reading/Resources:
- Lecture Notes (All deep learning notes)

Labs:
- Barebones Linear Models
- Keras Barebones
- NN module in PyTorch
- MNIST MLP in PyTorch
- Solve the XOR problem using a single-hidden-layer BPNN with sigmoid activations
- Universal Approximation Code
- Convolution in PyTorch (a small CNN sketch follows below)
- Learning a single convolution Filter
- 0 to AI in 10 lines of code (By J. Pocock!)
- Digit Classification with CNNs in Keras
- Digit Classification with CNNs in PyTorch
- Transfer Learning in PyTorch
- Work on Assignment-2
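To make the convolution and pooling shapes concrete, here is a hedged sketch of a small CNN for 28x28 grayscale digits in PyTorch. The layer sizes are illustrative and not the lab's exact architecture.

```python
# Hedged sketch: a minimal CNN for 28x28 grayscale inputs (e.g. MNIST digits).
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # flatten feature maps per example

# Shape check with a dummy batch of 8 images.
print(SmallCNN()(torch.randn(8, 1, 28, 28)).shape)  # torch.Size([8, 10])
```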
Week 8 (Feb 27)

Lectures:
- See notes from Week 7
- Under-the-hood view of deep learning libraries
- Residual Networks (and other types)

Reading/Resources:
- Lecture Notes (All deep learning notes)

Labs:
- Universal Approximation Code
- Convolution in PyTorch
- Learning a single convolution Filter
- 0 to AI in 10 lines of code (By J. Pocock!)
- Digit Classification with CNNs in Keras
- Digit Classification with CNNs in PyTorch
- Transfer Learning in PyTorch (a fine-tuning sketch follows below)
- Work on Assignment-2
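For the transfer-learning lab, here is a hedged sketch of fine-tuning only the final layer of a pretrained ResNet-18 from torchvision (the weights API assumes torchvision >= 0.13). The two-class target task is a placeholder.

```python
# Hedged sketch: transfer learning by replacing and training only the head
# of a pretrained ResNet-18.
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so its weights are not updated.
for p in model.parameters():
    p.requires_grad = False

# Replace the classification head for a new task (e.g. 2 classes, a placeholder).
model.fc = nn.Linear(model.fc.in_features, 2)

# Only the new head's parameters remain trainable.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['fc.weight', 'fc.bias']
```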
Week 9 (March 6)

Lectures:
- Under-the-hood view of deep learning libraries
- Residual Networks (and other types) (a residual-block sketch follows below)
- Modern Architectures and Training Strategies: Transformers and Self-Supervised Learning
- Interpretability and Explainability: Grad-CAM, Loss Visualization and TCAV
- Autoencoders

Reading/Resources:
- Lecture Notes (All deep learning notes)
- Deep-PHURIE for hurricane intensity prediction

Labs:
- Transfer Learning in PyTorch
- Work on Assignment-2
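To illustrate the skip-connection idea behind residual networks, here is a hedged sketch of a basic residual block in PyTorch, simplified relative to the original ResNet (no downsampling path).

```python
# Hedged sketch: a basic residual block with an identity skip connection.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: the block learns a residual, not the full mapping.
        return self.relu(out + x)

print(ResidualBlock(16)(torch.randn(2, 16, 8, 8)).shape)  # torch.Size([2, 16, 8, 8])
```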
Week 10 (March 13)

Lectures:
- Autoencoders
- Attention and Transformers
- Generative Models
- GANs
- Barebones GAN in PyTorch (a toy GAN sketch follows below)
- Diffusion Models
- Using GANs for generating histology images (By S. Deshpande) [optional]
- Using Graph Neural Networks for histology images [optional]
- Graph Neural Networks
- Other topics:
  - Natural Language Modelling
  - Symbolic Regression
  - Robustness
  - On Causality, Invariance and Symmetries (Invariant Risk Minimization)

Reading/Resources:
- Lecture Notes (All deep learning notes)
- GAN in PyTorch
- NLP

Labs:
- Barebones GAN in PyTorch
- Diffusion Model Tutorial
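As a miniature companion to the barebones GAN material, here is a hedged sketch of a GAN in PyTorch learning a 1-D Gaussian. The toy target distribution, network sizes, and training schedule are all illustrative stand-ins for the module's own notebook.

```python
# Hedged sketch: a barebones GAN learning to sample from N(3, 0.5^2).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # discriminator (logits)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0   # samples from the target distribution
    fake = G(torch.randn(64, 8))            # generator maps noise to samples

    # Discriminator step: push real towards label 1, fakes towards label 0.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

samples = G(torch.randn(1000, 8))
print(samples.mean().item(), samples.std().item())  # should approach 3.0 and 0.5
```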
Revision Session (May 09, 2023)

Lectures:
- Revision Session Video (apologies for the unexplained clipping in the recording, but notes are available)

Reading/Resources:
- Revision Notes
Lab Access and Machine Requirements
Remote Machine Login: https://warwick.ac.uk/fac/sci/dcs/intranet/user_guide/remote-login/
Use "module load cs909-python" or if using your own machine, you will be needing.
- Anaconda Python (3.6+)
- Jupyter Notebook or Jupyter Lab
- Matplotlib
- Numpy
- Scipy
- Pandas
- Scikit-learn
- Keras, PyTorch and TensorFlow (with GPU configuration if GPUs available)
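To confirm your own machine is set up, here is a hedged sketch of a quick environment check in Python; the package list mirrors the requirements above, and the GPU check is PyTorch-specific.

```python
# Hedged sketch: verify the required packages import and report their versions.
import importlib

for pkg in ["matplotlib", "numpy", "scipy", "pandas", "sklearn",
            "keras", "torch", "tensorflow"]:
    try:
        mod = importlib.import_module(pkg)
        print(pkg, getattr(mod, "__version__", "unknown"))
    except ImportError:
        print(pkg, "MISSING")

import torch
print("CUDA GPU available:", torch.cuda.is_available())
```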
Learning Python
The following resources may be useful when familiarising yourself with Python.
Python Documentation: https://docs.python.org/3/tutorial/index.html
NumPy Website: https://numpy.org
Matplotlib Website: https://matplotlib.org
Video Tutorials
Courtesy of Dr. Greg Watson
- Introduction
- Basic Variables
- Lists
- Control Flow
- Functions
- Tuples
- Sets
- Dictionaries
- NumPy
- Matplotlib
- Classes