
COMSM0152 - Foundations of ProAI


Unit Information

This unit introduces students to a range of methodologies and applications across the field of AI. Each week will focus on a different theme, with three activities:

  1. Required reading -- background reading to introduce the topic of the week
  2. Guest seminars (usually Mondays, 1300-1400, Queen's 1.58) -- a guest speaker will present an introduction to their research area, connected to the week's topic
  3. Group discussion (Fridays, 1200-1300, Queen's 1.69) -- student-led discussion of the required reading and guest seminars. This is a chance to go over the important concepts and raise questions. Each session is chaired by a student, who facilitates the discussion (they are not required to present the topic). Link to spreadsheet with session chairs

This is a new unit, and you are a small cohort, so please do give us feedback at any time on how we can make the unit work better! In particular, if you find some topics assume prior knowledge you don't have, or would like to go further into certain topics, let us know. Please see below for some great all-round textbooks on AI, which may help you fill in some knowledge gaps... We also have a Team on MS Teams. Please do use it to ask us questions, discuss AI topics you are interested in, post interesting blogs, videos or papers, etc...

Staff

James Cussens (JC)
Edwin Simpson (ES)


Unit Materials

TB1

Week | Reading (to complete before Monday's lecture) | Guest seminar | Discussion chair (link to spreadsheet with session chairs) | Lecturer responsible
1 (w/c 16/09/24) | Data and labels for image and video understanding:
  • ImageNet, the competition that kick-started the deep learning revolution in computer vision
  • KITTI, a highly impactful dataset in vision and robotics
  • Optional extra: ResNet, the winner of the ImageNet 2015 competition
| Dima Damen -- tutorial on data and labels for video understanding | Jono | ES
2 (w/c 23/09/24) | Text embeddings and sequence processing | Edwin Simpson | Moses | ES
3 (w/c 30/09/24) | Attention and Transformers. Both of the following are really good introductions to attention and transformers - you can pick one to read, or go through both if that helps your understanding. Please cover one of these before Monday: For the Friday reading group, please also read: | Mike Wray | James | ES
4 (w/c 07/10/24) | Large language models | Conor Houghton -- linguistic perspectives on LLMs | David | ES
5 (w/c 14/10/24) | Philosophy of AI | James Ladyman | Berenika | JC
6 (w/c 21/10/24) | CONSOLIDATION WEEK (no seminar/reading group)
7 (w/c 28/10/24) | Knowledge representation and reasoning | Peter Flach | Moses | JC
8 (w/c 04/11/24) | Bayesian inference and decision making | Laurence Aitchison (slides) | Jack | JC
9 (w/c 11/11/24) | Causality | James Cussens | David | JC
10 (w/c 18/11/24) | Bias, fairness and transparency | Miranda Mowbray | Jake | JC
11 (w/c 25/11/24) | Ethical and regulatory frameworks of AI | Andrew Charlesworth | Jack | JC
12 (w/c 09/12/24) | Assessment period | TB1 essay due
TB2

13 (w/c 13/01/25) | Discrete and continuous optimisation | placeholder, James Cussens | student name TBC | JC
14 (w/c 20/01/25) | Reinforcement learning | placeholder, Taku Yamagata | student name TBC | ES
15 (w/c 27/01/25) | Robotics and physically embodied agents | placeholder, Nathan Lepora | student name TBC | ES
16 (w/c 03/02/25) | Multi-agent systems | placeholder, Nirav Ajmeri | student name TBC | JC
17 (w/c 10/02/25) | Weak supervision | placeholder, Raul Santos-Rodriguez | student name TBC | ES
18 (w/c 17/02/25) | CONSOLIDATION WEEK (no seminar/reading group)
19 (w/c 24/02/25) | Human-in-the-loop AI -- design and evaluation | placeholder, Kenton O'Hara | student name TBC | JC
20 (w/c 03/03/25) | Explainable and interpretable AI | placeholder, Weiru Liu | student name TBC | ES
21 (w/c 10/03/25) | Privacy | placeholder, Miranda Mowbray | student name TBC | JC
22 (w/c 17/03/25) | AIOps and deploying AI in production | placeholder, TBC | student name TBC | ES
23 (w/c 24/03/25) | AI in healthcare and biomedicine | placeholder, Zahraa Abdallah | student name TBC | ES
24 (w/c 31/03/25) | Assessment preparation week | TB2 essays due

Assessment Details

At the end of each Teaching Block students submit an essay of about 5,000 words (10 pages) on a research topic jointly chosen by them and their Academic Mentor. The essay should describe the background, state of the art, and open challenges with regard to the chosen topic. Each essay is assessed on a pass/fail basis in terms of scholarly content and academic writing. Narrative feedback is also provided, indicating strong points as well as areas for improvement. Passing the unit requires passing both essays. More guidance will be provided by the unit lecturers during the first term.

Text books

  1. Bishop, C. M., Pattern Recognition and Machine Learning (2006). This is one of the best ML textbooks and will provide a solid foundation across many aspects of ML. The book is freely available here.
  2. Russell, S. and Norvig, P., Artificial Intelligence: A Modern Approach, 4th Edition (2020). The canonical introduction to AI; the 3rd edition is available here.
  3. Jurafsky, D. and Martin, J. H., Speech and Language Processing, 3rd edition drafts (2024). A great NLP textbook: very readable and an interesting way to think about the challenges of designing AI systems in general. Also good for showing the contrast between deep learning and feature engineering approaches. Online only, here.
  4. Murphy, K., Probabilistic Machine Learning: An Introduction (2022) and Murphy, K., Probabilistic Machine Learning: Advanced Topics (2023). Two more recent ML textbooks with particularly good coverage of probabilistic methods, freely available here.
Some of you may prefer videos to books. YouTube has many video lectures, including Andrew Ng's Machine Learning course from Stanford, which is very good.