COMSM0152 - Foundations of ProAI


Unit Information

This unit introduces students to a range of methodologies and applications across the field of AI. Each week will focus on a different theme, with four activities:

  1. Required reading -- background reading to introduce the topic of the week
  2. Guest seminars (TB2: Mondays, 1300-1400, Queen's 1.58) -- a guest speaker will present an introduction to their research area, connected to the week's topic
  3. Group discussion (TB2: Fridays, 1000-1100, Queen's 1.68) -- student-led discussion of the required reading and guest seminars. This is a chance to go over the important concepts and raise questions. Each session will be chaired by a student to facilitate the discussion (chairs are not required to present the topic). Link to spreadsheet with session chairs
  4. AI Lab Lunch and Learn Seminars: (usually Wednesdays at 1330-1500, Queen's 1.07) -- these sessions are not exclusive to the CDT and we invite the whole AI Lab and beyond, with visiting researchers as well as our local speakers giving talks, and, importantly, free pizza. Many foundational and applied AI topics are covered here, so it is worth attending.

Bear with us while we organise the schedule below -- topics are likely to change a little as we arrange dates with our guest speakers. Please give us feedback at any time on how we can make the unit work better! In particular, if you find some topics assume prior knowledge you don't have, or you would like to go further into certain topics, let us know. The textbooks listed below are great all-round introductions to AI, which may help you fill in some knowledge gaps.

We also have a Team on MS Teams. Please use it to ask us questions, discuss AI topics you are interested in, and post interesting blogs, videos or papers.

Staff

James Cussens (JC), Edwin Simpson (ES)


Unit Materials

TB1

Week Reading (to complete before Monday's lecture) Guest Seminar Discussion chair (Link to spreadsheet with session chairs) Lecturer responsible
1 (w/c 22/09/25) Embeddings and Deep Learning for NLP Edwin Simpson ES
2 (w/c 29/09/25) Attention and transformers. Both of the linked readings are really good introductions to attention and transformers -- you can pick one to read, or go through both if that helps your understanding. Please cover at least one before Monday. For the Friday reading group, please also complete the additional linked reading. Mike Wray ES
3 (w/c 06/10/25) Large language models: Conor Houghton -- linguistic perspectives on LLMs ES
4 (w/c 13/10/25) Data and labels for image and video understanding:
  • ImageNet, the competition that kick-started the deep learning revolution in Computer Vision
  • KITTI, a highly impactful dataset in vision and robotics.
  • Optional extra: The winner of the ImageNet 2015 competition, ResNet.
Dima Damen -- tutorial on data and labels for video understanding ES
5 (w/c 20/10/25) Philosophy of AI TBC JC
6 (w/c 27/10/25) CONSOLIDATION WEEK (no seminar/reading group)
7 (w/c 03/11/25) Knowledge representation and reasoning TBC JC
8 (w/c 10/11/25) Bayesian inference and decision making TBC JC
9 (w/c 17/11/25) AI for Health Qiang Liu ES
10 (w/c 24/11/25) Bias, fairness and transparency TBC JC
11 (w/c 01/12/25) Ethical and regulatory frameworks of AI TBC JC
12 (w/c 08/12/25) Assessment period: TB1 essay due this week

TB2

13 (w/c 19/01/26) Causality, or discrete and continuous optimisation (TBC) James Cussens JC
14 (w/c 26/01/26) Reinforcement learning TBC ES
15 (w/c 02/02/26) Robotics (TBC) TBC ES
16 (w/c 09/02/26) Multi-agent systems TBC JC
17 (w/c 16/02/26) Weak supervision TBC ES
18 (w/c 23/02/26) CONSOLIDATION WEEK (no seminar/reading group)
19 (w/c 02/03/26) Learning from Temporal Data for Healthcare (TBC) TBC JC
20 (w/c 09/03/26) Explainable and interpretable AI TBC ES
21 (w/c 16/03/26) Privacy TBC JC
Easter Vacation (w/c 23/03/26)
22 (w/c 13/04/26) Robust AI or MLOps and deploying AI in production (TBC) TBC ES
23 (w/c 20/04/26) Human-in-the-loop AI: design and evaluation TBC ES

Assessment Details

Deadlines: at the end of TB1 and the end of TB2; exact dates TBC

After each Teaching Block students submit an essay of about 5,000 words (10 pages) on a research topic jointly chosen by them and their Academic Mentor. The essay should describe the background, state of the art, and open challenges with regard to the chosen topic. Each essay is assessed on a pass/fail basis in terms of scholarly content and academic writing. Narrative feedback is also provided, indicating strong points as well as areas for improvement. Passing the unit requires passing both essays. More guidance will be provided by the unit lecturers during the first term.


Textbooks

  1. Bishop, C. M., Pattern recognition and machine learning (2006). This is one of the best ML textbooks and will provide a solid foundation across many aspects of ML. The book is freely available here.
  2. Russell, S. and Norvig, P., Artificial Intelligence: A Modern Approach, 4th Edition (2020). The canonical introduction to AI; the 3rd edition is available here.
  3. Jurafsky, D. and Martin, J.H., Speech and Language Processing, 3rd edition drafts (2024). A great NLP textbook -- very readable, and an interesting way to think about the challenges of designing AI systems in general. Also good for showing the contrast between deep learning and feature engineering approaches. Online only, here.
  4. Murphy, K., Probabilistic Machine Learning: An Introduction (2022) and Murphy, K., Probabilistic Machine Learning: Advanced Topics (2023). A more recent ML textbook with particularly good coverage of probabilistic methods, freely available via here.
Some of you may prefer videos to books. YouTube has many video lectures, including the Machine Learning course from Stanford by Andrew Ng, which is very good.