---
license: apache-2.0
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1
base_model_relation: quantized
quantized_by: turboderp
tags:
  - exl3
---
# EXL3 quants of Mixtral-8x7B-Instruct-v0.1

Available quantizations:
- 2.00 bits per weight
- 2.50 bits per weight
- 3.00 bits per weight
- 3.50 bits per weight
- 4.00 bits per weight
- 5.00 bits per weight
- 6.00 bits per weight
- 8.00 bits per weight / H8
(Couldn't find any GGUF or AWQ versions that still work.)
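
To fetch one of the quants above, something like the following sketch should work. The repo id and the revision name are assumptions about how the branches are laid out, so check this repo's branch list for the exact names before running it.

```python
# Minimal sketch: download one EXL3 bitrate from its branch with huggingface_hub.
# The repo id and branch name below are placeholders/assumptions, not confirmed names.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="turboderp/Mixtral-8x7B-Instruct-v0.1-exl3",  # assumed repo id
    revision="4.0bpw",  # assumed branch name for the 4.00 bpw quant
)
print(local_dir)  # local path containing the EXL3 model files
```

The downloaded directory can then be pointed at by whatever EXL3-capable loader you use.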
