---
base_model:
- ByteDance-Seed/Seed-OSS-36B-Instruct
- ByteDance-Seed/Seed-OSS-36B-Base
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---

This is a merge of ByteDance Seed-OSS-36B Base and Instruct, using the Karcher mean method in [mergekit](https://github.com/cg123/mergekit). The goal is to make the Instruct model 'feel' and write more like a raw continuation model. The Karcher mean was tested because it and SLERP are seemingly the only viable ways to merge an instruct model with its base model.

Quantized, the merge gets an MMLU score (via the exllamav3 eval script) of `11853/ 14042 = 84.41% correct, ( 80.41% prob.)`

For reference, ByteDance's Instruct model (with the exact same quantization settings) gets `11680/ 14042 = 83.18% correct, ( 80.96% prob.)`, and the Base model by itself gets `11851/ 14042 = 84.40% correct, ( 76.96% prob.)`

These are the raw bf16 weights of the merge.

## Merge Details

### Merge Method

This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method, using /home/alpha/Models/Raw/ByteDance-Seed_Seed-OSS-36B-Instruct as the base.

### Models Merged

The following models were included in the merge:

* /home/alpha/Models/Raw/ByteDance-Seed_Seed-OSS-36B-Base

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /home/alpha/Models/Raw/ByteDance-Seed_Seed-OSS-36B-Base
  - model: /home/alpha/Models/Raw/ByteDance-Seed_Seed-OSS-36B-Instruct
merge_method: karcher
tokenizer:
  source: "base"
base_model: /home/alpha/Models/Raw/ByteDance-Seed_Seed-OSS-36B-Instruct
parameters:
  int8_mask: true
dtype: bfloat16
```
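
As a quick usage sketch (not part of the original merge workflow), the raw bf16 weights should load like any other causal LM in `transformers`, assuming a release recent enough to include Seed-OSS support; the model path and prompt below are placeholders:

```python
# Minimal loading sketch; model path and prompt are placeholders, and this
# assumes a transformers version that already supports the Seed-OSS architecture.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/this/merge"  # local directory or Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # the merge is stored as raw bf16 weights
    device_map="auto",
)

# Plain continuation-style prompt, since the point of the merge is a more
# base-model-like writing feel.
prompt = "The rain had been falling for three days when"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

To reproduce the merge itself, the YAML configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` command (config path first, output directory second).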