Topics Overview


See detailed descriptions of the sessions below.

Notes:

  • This schedule may be changed, should the need arise.

  • You are not required to read anything. However, you are strongly encouraged to read the sources marked with a pin emoji 📌: those are comprehensive overviews of the topics or important works that are beneficial for a better understanding of the key concepts.

  • Sources marked with a popcorn emoji 🍿 are miscellaneous material you might want to take a look at: blog posts, GitHub repos, leaderboards, etc.

  • For the labs, you are provided with practical tutorials that the respective lab tasks will mostly derive from. The core tutorials are marked with a writing emoji ✍️; you are asked to inspect them in advance (better yet: try them out).

Disclaimer: the reading entries are not proper citations; detailed information about the authors, publication date, venue, etc. can be found under the entry links.


October: INTRO

Week 1

22.10. Intro

This is an introductory meeting in which we will cover the contents and schedule of the course, the class formats, and the formalities, and in which all your questions, suggestions, etc. will be discussed.

Key points:

23.10. Lecture: Ontological Status of LLMs

This lecture opens with a warm-up discussion of different perspectives on the nature of LLMs. We will focus on two prominent outlooks: the LLM as a complex statistical machine vs. the LLM as a form of intelligence. We'll discuss the differences between LLM and human intelligence and the degree to which LLMs exhibit (self-)awareness.

Key points:

Sources:

Week 2

29.10. Lecture: LLM & Agent Basics

In this lecture, we’ll recap the basics of LLMs and LLM-based agents to make sure we’re on the same page.

Key points:

Sources:

30.10. Lab: Intro to LangChain

This first lab will guide you through the basic concepts of LangChain in preparation for the subsequent practical sessions.
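One concept worth previewing is LangChain's core abstraction, the chain: small steps (prompt template, model, output parser) composed with the `|` operator. As a toy illustration of that idea in plain Python (this is a hypothetical sketch, not the real LangChain API; `Step`, `fake_llm`, etc. are made-up names):

```python
# Toy version of LangChain's "chain" idea: steps composed with |,
# as in `prompt | llm | parser`. Not the real LangChain API.

class Step:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Composing two steps yields a new step that runs them in sequence.
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

prompt = Step(lambda topic: f"Tell me a fact about {topic}.")
fake_llm = Step(lambda text: f"LLM response to: {text!r}")  # stands in for a model call
parser = Step(lambda text: text.upper())                    # stands in for an output parser

chain = prompt | fake_llm | parser
result = chain.invoke("LangChain")
```

In actual LangChain, the real prompt templates, chat models, and parsers compose in the same pipe style; the lab will cover the genuine API.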

Sources:

November-December: CORE

Week 3

05.11. Lecture: Virtual Assistants

The first core topic addresses single-LLM virtual assistants such as chatbots and RAG systems. We’ll discuss how these systems are built and how you can tune them for your use case.

Key points:

Sources:

06.11. Lab: LLM-based Chatbot

On material of Lecture: Virtual Assistants

In this lab, we’ll build a chatbot and try different prompts and settings to see how they affect the output.
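The shape of such a chatbot is a system prompt plus a running message history that grows with each turn. A minimal sketch of that loop, with a stub `respond` function standing in for the real LLM call (all names here are illustrative, not a specific library's API):

```python
# Minimal chat loop: keep a system prompt and append each user/assistant
# turn to the history before the next model call.

def respond(messages):
    # Stub "model": echoes the last user turn. A real chatbot would
    # send the whole message list to an LLM here.
    last_user = [m for m in messages if m["role"] == "user"][-1]
    return "You said: " + last_user["content"]

history = [{"role": "system", "content": "You are a helpful course assistant."}]

def chat(user_input):
    history.append({"role": "user", "content": user_input})
    reply = respond(history)
    history.append({"role": "assistant", "content": reply})
    return reply
```

Changing the system message or the model's sampling settings is exactly the kind of knob-turning the lab will experiment with.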

Sources:

Week 4

12.11 & 13.11. Labs: RAG

On material of Lecture: Virtual Assistants

In these labs, we’ll expand the functionality of the chatbot built in the previous lab by connecting it to user-specific information. On the first day, we’ll preprocess our custom data for retrieval. On the following day, we’ll move from data preprocessing to implementing the RAG workflow.
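The two stages map onto a simple pattern: split documents into chunks, then at query time select the most relevant chunk and prepend it to the prompt. A toy sketch using word overlap in place of real embeddings and a vector store (all names illustrative):

```python
# Stage 1: preprocess a document into fixed-size chunks.
def chunk(text, size=8):
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

# Stage 2: retrieve the best-matching chunk for a query.
# Word overlap is a stand-in for embedding similarity.
def score(query, passage):
    q, p = set(query.lower().split()), set(passage.lower().split())
    return len(q & p)

def retrieve(query, chunks, k=1):
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]

doc = "The lab starts at 14:00 in room B12. Bring a laptop with Python installed."
chunks = chunk(doc)
question = "Which room is the lab in?"
context = retrieve(question, chunks)[0]
prompt = f"Context: {context}\nQuestion: {question}"
```

In the lab, the chunking will target real custom data and the retrieval step will use an embedding model with a vector store, but the overall workflow is the same.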

Sources:

Week 5

19.11. Lecture: Multi-agent Environment

This lecture turns its attention to automating everyday and business operations in a multi-agent environment. We’ll look at how agents communicate with each other, how their communication can be guided (both with and without human involvement), and how this is used in real applications.

Key points:

Sources:

20.11. Lab: Multi-agent Environment

On material of Lecture: Multi-agent Environment

This lab offers a short walkthrough of creating a multi-agent environment for automated meeting scheduling and preparation. We will see how a coordinator agent communicates with two auxiliary agents to check time availability and prepare an agenda for the meeting.
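The coordinator pattern can be previewed with stub functions in place of LLM-backed agents (a hypothetical sketch; in the lab, each agent would wrap a model call and exchange messages):

```python
# Coordinator delegating to two auxiliary agents and merging their answers.
# The two stubs stand in for LLM-backed agents.

def availability_agent(participants):
    # Stub: pretend everyone is free at 10:00.
    return {p: "10:00" for p in participants}

def agenda_agent(topic):
    # Stub: drafts a three-item agenda for the topic.
    return [f"Intro to {topic}", "Discussion", "Action items"]

def coordinator(participants, topic):
    slots = availability_agent(participants)
    agenda = agenda_agent(topic)
    return {"time": "10:00", "attendees": list(slots), "agenda": agenda}

meeting = coordinator(["Ada", "Ben"], "RAG systems")
```

The interesting part in the lab is how the coordinator decides when and how to call each auxiliary agent, which here is hard-coded.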

Sources:

Weeks 6-7

26.11 & 27.11 & 03.12 & 04.12. Labs: LLM-powered Website

This is a mini-cycle of labs in which you will individually build a multi-agent system. These labs let you practice the technical implementation of such systems, identify and work on your weak spots, and discuss any doubts and difficulties you encounter. Consider them preparation for the final project. As a result, you will create a workflow that generates websites with LLMs. The LLMs will generate both the contents and the code required for rendering, styling, and navigation by communicating in a semi-centralized manner and using reasoning and critique to improve the output.
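The critique mechanism mentioned above follows a generate-critique-revise loop, sketched here with stub agents in place of LLM calls (all names hypothetical):

```python
# Generate-critique-revise loop: a writer agent drafts output, a critic
# agent returns feedback or None, and the writer revises until the
# critic is satisfied or the round budget runs out.

def writer(task, feedback=None):
    # Stub writer: emits minimal HTML; adds navigation only if asked.
    html = f"<h1>{task}</h1>"
    if feedback:
        html += "<nav>Home</nav>"
    return html

def critic(html):
    # Stub critic: demands navigation, otherwise approves (None).
    return None if "<nav>" in html else "add navigation"

def run(task, max_rounds=3):
    draft = writer(task)
    for _ in range(max_rounds):
        feedback = critic(draft)
        if feedback is None:
            return draft
        draft = writer(task, feedback)
    return draft

page = run("My Site")
```

In the labs, both roles would be LLMs and the critique would be free-form text fed back into the writer's prompt, but the control flow is the same.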

Sources:

Week 8

10.12. Lecture: Role of AI in Recent Years

The last lecture of the course will turn to societal considerations regarding LLMs and AI in general and will investigate their role in, and influence on, humanity today.

Key points:

Sources:

11.12. Wrap-up

This informal meeting will give a small summary with key takeaways from the course. We will also discuss the next steps such as project requirements, proposal procedure etc.

Key points:

Week 9

17.12. Debate: Role of AI in Recent Years

On material of Lecture: Role of AI in Recent Years

The core block of the course will be concluded by the final debates about the role of AI in recent years. Debate topics will be announced on 10.12.

Sources: see Lecture: Role of AI in Recent Years

18.12. Project Proposals

In this meeting, you will present your project proposals. The goal of this session is to receive feedback on your idea from your peers and me in order to adjust it if necessary. Additionally, intermediate consultations for the project groups will be scheduled.

Key points:

January-February: PROJECTS

Weeks 10-13

This time is given to you to implement your projects and to prepare a short presentation for the final week. During this period, there will be a few consultations for the project groups, where we will inspect intermediate progress and address any issues.

Week 14

04.02 & 05.02. Project Presentations

Finally, the last two sessions of the course will be dedicated to your project presentations.
