
Micko

Chovus

AI & ML interests

None yet

Recent Activity

reacted to DavidAU's post with 🔥 about 11 hours ago
SAVANT COMMANDER: 48B-A4B, 256k context, GATED MoE. I am going to showcase some other people's tuning work that I have put into a GATED distill MoE (Qwen3) with 256k context. Special thanks to all the tuners (listed in the model tree and on the repo page), with a special shoutout to "TeichAI", who used Unsloth for a lot of the distills in this model.
Savant Commander is a specialized MoE model that lets you assign the expert(s) (of 12) to your use case(s) / prompt(s) directly, by name, instead of having the "choices" made for you. The model is composed of 12 distills (a compressed 12x4B MoE) of top closed models (GPT 5.1, OpenAI GPT-OSS 120B, Gemini (3), Claude (2)) and open-source models (Kimi, GLM, DeepSeek, Command-A, JanV1), all in one. 256k context, 2 experts activated.
PS: There is also a "heretic" / "decensored" version, listed on the model page.
https://huggingface.co/DavidAU/Qwen3-48B-A4B-Savant-Commander-GATED-12x-Closed-Open-Source-Distill-GGUF
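As a rough illustration of how one might run that GGUF locally, here is a minimal sketch using huggingface_hub and llama-cpp-python. The quant filename and the qwen3moe.expert_used_count override key are assumptions for illustration only (check the repo's file list and the GGUF metadata), not details confirmed by the post.

    from huggingface_hub import hf_hub_download
    from llama_cpp import Llama

    # Download one quant from the repo; the filename below is hypothetical,
    # pick a real one from the repo's file list.
    gguf_path = hf_hub_download(
        repo_id="DavidAU/Qwen3-48B-A4B-Savant-Commander-GATED-12x-Closed-Open-Source-Distill-GGUF",
        filename="Qwen3-48B-A4B-Savant-Commander-Q4_K_M.gguf",  # hypothetical
    )

    llm = Llama(
        model_path=gguf_path,
        n_ctx=32768,  # the model supports up to 256k; smaller contexts need less RAM
        # Assumed metadata key for raising the number of active experts above the
        # default of 2; verify the key name in the GGUF header before relying on it.
        kv_overrides={"qwen3moe.expert_used_count": 4},
    )

    # Per the post, experts are addressed by name in the prompt itself.
    out = llm("You are the coding expert. Write a Python hello world.", max_tokens=64)
    print(out["choices"][0]["text"])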
upvoted an article 10 days ago
Transformers v5: Simple model definitions powering the AI ecosystem
liked a Space about 2 months ago
HuggingFaceTB/smol-training-playbook

Organizations

None yet

Chovus's models

None public yet