
LLM-Distillery Wiki

This project distills one or more teacher language models into a single student model. The goal is to collect the general knowledge of the various teachers and encapsulate it in a more compact and efficient student.
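As a rough illustration of the core idea, knowledge distillation is commonly implemented as minimizing the KL divergence between the teacher's and student's softened output distributions. The sketch below is a minimal NumPy version, not this project's actual training code; the temperature value and the choice to average multiple teachers' logits are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; higher T gives softer distributions."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Mean KL(teacher || student) over softened next-token distributions."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(kl.mean())

# Hypothetical logits over a 3-token vocabulary for one position.
teacher_a = np.array([[2.0, 0.5, -1.0]])
teacher_b = np.array([[1.5, 1.0, -0.5]])
student   = np.array([[1.8, 0.7, -0.9]])

# One simple way to combine teachers (an assumption, not necessarily
# what this project does): average their logits before computing the loss.
loss = distillation_loss((teacher_a + teacher_b) / 2, student)
```

The student is then trained by gradient descent to drive this loss toward zero, pulling its output distribution toward the (combined) teacher distribution.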

Table of Contents

Getting Started

Installation

Choose the guide below that applies to your system:

What to do next?

To prepare everything for your first distillation run, see: Preparations for the first start

Contributing

Contributions are welcome! Feel free to open issues or submit pull requests.

License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.