
Posts

2025


Knowledge Distillation, Pruning & Quantization: Techniques for Optimizing AI Models

How can AI become more sustainable in times of climate change? In an extensive course at the Hasso Plattner Institute, I learned the latest techniques for making large language models more efficient and resource-friendly, from knowledge distillation to network pruning and low-bit quantization.

Goodbye, Mental Clutter

For a long time, I was looking for a solution to keep track of my ideas, tasks, projects, and long-term goals. Of course, I had always written down important things so as not to forget them and to have them ready when I wanted to work on them. However, these notes were scattered across various places: sometimes handwritten, sometimes in an app on my smartphone, sometimes a simple text editor on my desktop PC was sufficient, or they ended up in the cloud of some free web service. At best, they landed in my self-hosted Nextcloud instance. Thus, I had successfully transformed the chaos in my head into digital clutter. Congratulations, Tim.

2024

