
AI model distillation (learning)

  • Status: Active
  • Type: Personal learning / research

Description

A personal learning project on model distillation: training smaller or cheaper student models to approximate a larger teacher (or ensemble), including classic knowledge distillation, logits matching, and related compression ideas. Goal is hands-on understanding of how distilled models behave, fail, and trade off quality vs. cost — not a product launch.

Structured multi-phase plan (full verbatim checklist): raw/ai-model-distillation-learning-plan.md.

Learning plan (progress)

  • Phase 1 — Foundation: Moonshot (Alex Wissner-Gross), paper(s), distillation-minded hands-on (see education-startup-ux / Boxy). Milestone: grounding + one applied thread. Status: Done.
  • Phase 2 — Generalize beyond the podcast via LLM prompts; save generalization doc; tie to your skills. Milestone: personalized 2–3 page summary. Status: Not started.
  • Phase 3 — Read 8–12 resources: Hinton KD, distilling step-by-step, HF guides, OpenAI distillation tutorial, DistillKit/notebooks, search + alerts. Milestone: Notion/Zotero (or equivalent) library with highlights + snippets. Status: Not started.
  • Phase 4 — Build a narrow student (OpenAI distillation path or open-source teacher → HF/DistillKit); deploy + case study. Milestone: live demo + 1-page “cost / niche metric” writeup. Status: Not started.

Phase 2 (next) — outline

  • One broad prompt covering: mechanisms (response-based KD, logit matching, synthetic-data distillation), economics, distillation beyond LLMs, iterated teacher–student loops, examples (e.g. Phi, OpenAI distillation API), and 3–5 student niche ideas.
  • 3–5 follow-ups (e.g. apply to medicine, law, coding).
  • Deliverable: full thread saved + 2–3 page personalized summary.
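The logit-KD mechanism listed above can be sketched in a few lines. This is a minimal toy example of the classic Hinton-style loss (temperature-softened KL on teacher logits plus cross-entropy on hard labels); the tensor shapes, temperature, and mixing weight are illustrative assumptions, not a prescription.

```python
# Toy logit-matching KD loss sketch (Hinton et al. 2015 style).
# Shapes and hyperparameters below are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL(teacher softmax at T || student log-softmax at T),
    # scaled by T^2 so gradient magnitude stays comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage with random logits (batch of 4, 10 classes):
s = torch.randn(4, 10, requires_grad=True)  # stand-in student logits
t = torch.randn(4, 10)                      # stand-in teacher logits
y = torch.randint(0, 10, (4,))              # stand-in hard labels
loss = distillation_loss(s, t, y)
loss.backward()  # gradients flow only into the student logits
```

In practice the same loss drops into an HF `Trainer` subclass or a DistillKit config; the point here is just the two-term structure of the objective.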

Phase 3 — outline

  • Papers: Hinton 2015 “Distilling the Knowledge…”; “Distilling step-by-step” (Google + Snorkel, 2023); HF KD + CV distillation material.
  • Hands-on tutorials: OpenAI distillation guide (GPT-4o → mini); Snorkel/Labelbox intros; Arcee DistillKit / Nebius notebooks.
  • Search habit: HF + synthetic-data queries; arXiv (“scaling laws”, “reinforcement-aware KD”); alerts/RSS for “model distillation.”

Phase 4 — outline

  • Pick a narrow problem where you have edge (domain reviewer, tutor, clause explainer, etc.).
  • Fast path: OpenAI API, synthetic data from teacher, distill to mini, fine-tune, held-out eval.
  • OSS path: Large teacher on Groq/HF, synthetic rationales, Transformers + DistillKit / distillation Trainer, optional quantize/prune.
  • Ship: Space/Vercel/Replicate, short social post + 1-page case study.
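The fast path's “synthetic data from teacher → distill to mini” step starts with data prep, which can be sketched as below. The niche, prompts, completions, and filename are hypothetical placeholders; the record shape is the standard chat-format JSONL used for fine-tuning.

```python
# Sketch of the data-prep step: package teacher completions as chat-format
# JSONL for student fine-tuning. All content below is illustrative stand-in
# data, assuming teacher answers were already collected.
import json

SYSTEM = "You are a concise clause explainer."  # hypothetical niche prompt

teacher_pairs = [  # (user prompt, teacher completion) — stand-in synthetic data
    ("Explain this indemnity clause: ...", "This clause shifts liability ..."),
    ("Explain this non-compete clause: ...", "This clause restricts ..."),
]

records = [
    {
        "messages": [
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": prompt},
            {"role": "assistant", "content": answer},
        ]
    }
    for prompt, answer in teacher_pairs
]

with open("distill_train.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")
```

Hold out a slice of the pairs before writing the file so the “held-out eval” step compares student vs. teacher on prompts neither was fine-tuned on.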

Origin / how it connected to work

  • Sources: Listened repeatedly to Moonshot episodes featuring Alex Wissner-Gross’s explanation of model distillation, and read a paper on the topic.
  • Application: Tried a rudimentary, distillation-minded approach on the education startup problem: on-device coaching with smaller / adapted models (including Distil-Whisper-style paths and lighter transcription) alongside a custom baby-coo detector, documented under education-startup-ux as the Boxy POC.
  • Lesson: For full video coaching quality, Gemini on uploaded video already met the bar; the local stack was valuable for feasibility + UX experiments, not as the long-term substitute for that coaching depth.

Key features / scope

  • Theory + practice: core papers, podcasts, and targeted experiments (including work-adjacent POCs where ethical and appropriate).
  • Distinction: This page tracks personal learning; Boxy implementation detail lives on education-startup-ux.

Tech stack

  • Study: general PyTorch / HF / PEFT ecosystem; Whisper family and distilled variants appeared in the work POC (see education page).
  • Phase 4: OpenAI distillation API vs. DistillKit / HF — choose when you start the build.
  • Add dedicated personal repo or notebook links in raw/ when you want them canonical.

Raw sources

  • raw/ai-model-distillation-learning-plan.md — phased checklist (Phases 2–4 verbatim + Phase 1 note).
  • raw/education-startup-boxy-poc.md — technical thread (Distil-Whisper LoRA, coo CNN, Core ML) tied to the work POC informed by this study.

Known issues

  • Phase 2–4 deliverables (generalization doc, Zotero/Notion library, shipped student + case study) not linked here yet.