4 results for tag "knowledge-distillation"
A large collection of Claude Code skill templates sponsored by Z.AI, providing ready-to-use development skill configurations across various domains.
A knowledge distillation skill from the AI Research Engineering Skills Library, the most comprehensive open-source collection of such skills for AI agents.
Compresses large language models by transferring knowledge from teacher to student models, reducing model size and inference costs while maintaining performance.
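The entry above describes classic teacher-student distillation. A minimal sketch of the soft-label loss in pure Python, assuming the standard temperature-scaled formulation from Hinton et al.; the function names and the temperature value are illustrative, not taken from the skill itself:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # relative class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence KL(teacher || student) between the two softened
    # distributions, scaled by T^2 so gradients keep a comparable
    # magnitude as the temperature changes.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels, and the student is a smaller architecture than the teacher, which is what yields the size and inference-cost reduction the entry mentions.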