Martin Lindström

PhD Student at KTH


Hi!

I’m a PhD student at KTH Royal Institute of Technology, Stockholm. I use my background in information theory and error-correcting codes to analyse deep learning systems. I’m passionate about developing more efficient learning algorithms, as well as understanding why they work. This helps us democratise AI – not just by making tools more accessible, but by designing systems which we can trust and understand.

Recently, I’ve been exploring how theoretical tools can help us design the “right” assumptions (or inductive biases) so we can start to open up black-box algorithms and see what really drives their performance.

I’m fortunate to be supervised by Ragnar Thobaben and Mikael Skoglund. Before starting my PhD, I did both my MSc and BSc in electrical engineering at KTH. I also spent a year as an exchange student at Imperial College London, where I wrote my MSc thesis with Deniz Gündüz at the Information Processing and Communications Lab.

If any of this sounds interesting, I’d love to hear from you!


Papers

A Coding-Theoretic Analysis of Hyperspherical Prototypical Learning Geometry

Martin Lindström, Borja Rodríguez-Gálvez, Ragnar Thobaben, and Mikael Skoglund

GRaM Workshop @ ICML 2024 (PMLR 251)


Power Injection Attacks in Smart Distribution Grids with Photovoltaics

Martin Lindström, Hampei Sasahara, Xingkang He, Henrik Sandberg, and Karl Henrik Johansson

2021 European Control Conference (ECC)


Teaching Experience

KTH Stochastic Signals and Systems: Autumns 2022-2024
3rd-year BSc course on basic analysis of stochastic signals and systems. My TA duties included hosting tutorial sessions, marking exams, and creating and marking project assignments.
KTH Student Supervision:
Supervision of MSc theses (30 ECTS, equivalent to half an academic year of study), typically one student per year working on a project pitched by an engineering company. Occasional supervision of summer research interns.
KTH Optimal Filtering: Autumn 2022
Joint MSc and PhD course on designing (primarily linear) optimal filters in the MMSE sense. My TA duties included marking homework assignments.
KTH Introduction to Computing Systems Engineering: Springs 2019 and 2020
1st-year BSc course where students design a computer from scratch. My TA duties mainly consisted of helping students with lab assignments.

Miscellaneous

Good Books
Gunnar Karlsson's list of good books is excellent. It contains world classics, with a bias towards European literature in general and Swedish literature in particular. If you have any good book suggestions, please reach out!
Good Textbooks and Lecture Notes
Here are some of my favourite textbooks and lecture notes — and a short comment on why I like them.
  • "Elements of Information Theory" by Cover & Thomas — maybe the most user-friendly introduction to information theory
  • "Information Theory: From Coding to Learning" by Polyanskiy & Wu — a more advanced treatment of information theory, which contains almost all you need to know about the topic
  • "Convex Optimization" by Boyd & Vandenberghe — a user-friendly introduction to optimisation problems, which any self-respecting ML researcher should have at least a cursory understanding of
  • "Understanding Machine Learning: From Theory to Algorithms" by Shalev-Shwartz & Ben-David — a good introduction to classical machine learning theory, covering topics such as PAC-learnability and SVMs
  • "A Course in Real Analysis" by McDonald & Weiss — an introduction to measure-theoretic probability which I think is more accessible than the more popular works by Kallenberg and Çınlar