Programming Languages and Compilers for Machine Learning Seminar
Machine learning poses interesting new challenges for programming languages research: making machine learning applications execute more efficiently, but also developing new programming paradigms specifically tailored to machine learning tasks. In this seminar, we read recent papers on programming systems for machine learning, mostly deep learning. The papers cover:
- Tensor DSLs. Tensors (a.k.a. multi-dimensional arrays) are at the core of deep neural networks (see the first sketch after this list).
- Differentiable Programming. A paradigm in which programs are differentiable, so that some of their parameters can be learned using gradient descent techniques; backpropagation in neural nets is a special case (see the second sketch after this list).
- Verification of Neural Networks. Program analysis approaches to certify or verify properties of neural nets.
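To give a concrete flavor of the first two topics, here are two small Python sketches. Both are illustrative toys, not the implementations from any of the papers below.

A tensor DSL lets you state a computation over indexed tensors declaratively and leaves loop generation and optimization to a compiler. NumPy's einsum mimics this index notation; tensor compilers such as the Tensor Algebra Compiler generate optimized loop nests from similar declarative expressions:

```python
import numpy as np

# Matrix multiplication stated as an index expression:
# C[i, j] = sum over k of A[i, k] * B[k, j]
A = np.random.rand(4, 5)
B = np.random.rand(5, 3)
C = np.einsum('ik,kj->ij', A, B)

assert np.allclose(C, A @ B)  # matches the built-in matrix product
```

Differentiable programming rests on automatic differentiation (AD). The following sketch implements forward-mode AD with dual numbers and uses it to learn a parameter by gradient descent; the Baydin et al. survey treats this and reverse-mode AD (of which backpropagation is an instance) in depth:

```python
class Dual:
    """A dual number value + deriv * eps, where eps^2 = 0.

    Arithmetic on Dual propagates derivatives alongside values,
    which is exactly forward-mode automatic differentiation.
    """

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    @staticmethod
    def _lift(x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        other = Dual._lift(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __sub__(self, other):
        other = Dual._lift(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)

    def __mul__(self, other):
        # Product rule: (u * v)' = u' * v + u * v'
        other = Dual._lift(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __radd__ = __add__
    __rmul__ = __mul__


def grad(f):
    """Return a function computing df/dx at a given point x."""
    return lambda x: f(Dual(x, 1.0)).deriv


# Learn the parameter w that minimizes the loss (3 * w - 6)^2.
loss = lambda w: (w * 3 - 6) * (w * 3 - 6)
w = 0.0
for _ in range(50):
    w -= 0.05 * grad(loss)(w)  # gradient descent step
print(w)  # converges to 2.0
```

Reverse-mode AD, on which several of the papers focus, computes the same derivatives but propagates them backwards through the program; that is more efficient when a function has many inputs and a single scalar output, the typical shape of a neural network loss.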
It is beneficial to have some background in compilers (e.g., by having passed the compilers core course). You don't need a background in machine learning; we will cover the necessary foundations in the seminar.
Organization
Language | English
Participants | 12 / 12 (seats taken / maximum seats)
Waiting list | 0 (please attend the Preparatory Meeting)
Preparatory Meeting | Thursday,
Weekly Meeting | Thursdays, 16:00 s.t., E1.3 room 401
Prerequisites | Preferably, you have taken part in the compiler construction course.
Registration
Please use our seminar system. Note that you additionally have to register for the seminar in the LSF by May 6th to receive a certificate for the seminar.
Modus Operandi
Each participant will be assigned a paper. We will have weekly meetings during the semester in which we discuss one of the assigned papers. The discussion is moderated by the student to whom the paper was assigned; they are responsible for giving a short summary of the paper and for structuring the ensuing discussion.
Weekly Summaries
Every week, each student has to write a plain-text summary (max. 500 words) of the week's paper. The summary should include open questions and must be submitted to Tina Jung three days before the corresponding meeting (by 23:59).
The submitted files must follow the naming scheme:
<two-digit-paper-number>_<matriculation-number>.txt
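For example, a summary for paper 03 submitted by a student with matriculation number 2567890 (both numbers made up for illustration) would be named 03_2567890.txt.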
The summaries of all participants will be made available and can be used by the moderator to structure the discussion in the following meeting.
Each participant may drop two summaries without giving a reason. If you drop a summary, please send a short mail saying so.
Final Talks
At the end of the semester, each participant will give a 30-minute presentation (25 min talk + 5 min questions) on their paper.
Dates
Sessions
Date | Moderator | Paper
---|---|---
02 May | Tajbeed Chowdhury | Vasilache et al. Tensor Comprehensions |
09 May | Valentin Seimetz | Kim et al. A Code Generator for High-Performance Tensor Contractions on GPUs |
16 May | Matthis Kruse | Kjolstad et al. The Tensor Algebra Compiler |
23 May | Leon Thiele | Baydin et al. Automatic Differentiation in Machine Learning: a Survey |
30 May | - | Public holiday |
06 June | Bohdan Liesnikov | Wang et al. Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator |
13 June | Daniel Spaniol | Michael Innes. Don’t unroll adjoint: Differentiating SSA-Form Programs |
20 June | - | Public holiday |
27 June | Kallistos Weis | Singh et al. An abstract domain for certifying neural networks |
04 July | Lukas Kirschner | Gopinath et al. Symbolic Execution for Deep Neural Networks |
11 July | Stefan Oswald | Sun et al. Concolic Testing for Deep Neural Networks |
18 July | Tina Jung | How to give a good presentation |
Final Talks
Date | Time | Speaker
---|---|---
23 July | 11:00 | Leon Thiele |
23 July | 11:30 | Bohdan Liesnikov |
23 July | 12:00 | Break |
23 July | 13:00 | Daniel Spaniol |
23 July | 13:30 | Tajbeed Chowdhury |
20 August | 11:30 | Valentin Seimetz |
20 August | 12:00 | Matthis Kruse |
20 August | 12:30 | Kallistos Weis |
20 August | 13:00 | Break |
20 August | 14:00 | Stefan Oswald |
20 August | 14:30 | Lukas Kirschner |
Papers
All papers are available from the university network (how to connect to the university network from home). A publicly accessible version is linked below whenever one exists.
Tensor DSLs
- Abadi et al. TensorFlow
- Vasilache et al. Tensor Comprehensions
- Kim et al. A Code Generator for High-Performance Tensor Contractions on GPUs
- Kjolstad et al. The Tensor Algebra Compiler
- Kjolstad et al. Sparse Tensor Algebra Optimization with Workspaces
- Lattner et al. MLIR
Differentiable Programming
- Baydin et al. Automatic Differentiation in Machine Learning: a Survey
- Wang et al. Backpropagation with Continuation Callbacks: Foundations for Efficient and Expressive Differentiable Programming
- Wang et al. Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator
- Michael Innes. Don’t unroll adjoint: Differentiating SSA-Form Programs
- Li et al. Differentiable Programming for Image Processing and Deep Learning in Halide
Verification of Neural Networks
- Singh et al. An abstract domain for certifying neural networks
- Gopinath et al. Symbolic Execution for Deep Neural Networks
- Sun et al. Concolic Testing for Deep Neural Networks
Background Material
- MIT introductory course on deep learning. The first and third videos are particularly helpful for the background required in this seminar.
- Gordon Plotkin. Differentiable Programming. Talk at POPL 18.
- Christopher Olah. Neural Networks, Types, and Functional Programming. Blog Entry.