---
base_model:
- SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
- Sicarius-Prototyping/Impish_Longtail_12B
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# Impish-LongPen-12B

A **karcher** merge of [Sicarius-Prototyping/Impish_Longtail_12B](https://huggingface.co/Sicarius-Prototyping/Impish_Longtail_12B) and [SuperbEmphasis/MN-12b-RP-Ink-RP-Longform](https://huggingface.co/SuperbEmphasis/MN-12b-RP-Ink-RP-Longform), the same pairing used in [KansenSakura-Erosion-RP-12b](https://huggingface.co/Retreatcost/KansenSakura-Erosion-RP-12b), but with better quality.

![screenshot](https://cdn-uploads.huggingface.co/production/uploads/6671dd5203d6e8087aaf7ce5/kzDUFX_MLtP1BMvoi7CDx.png)

The merge itself took a long-ass time, so I'm probably not going to repeat similar experiments. Expect more experimental models in the meantime.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method (a conceptual sketch of the iteration appears after the configuration below).

### Models Merged

The following models were included in the merge:

* [SuperbEmphasis/MN-12b-RP-Ink-RP-Longform](https://huggingface.co/SuperbEmphasis/MN-12b-RP-Ink-RP-Longform)
* [Sicarius-Prototyping/Impish_Longtail_12B](https://huggingface.co/Sicarius-Prototyping/Impish_Longtail_12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: karcher
models:
  - model: SuperbEmphasis/MN-12b-RP-Ink-RP-Longform
  - model: Sicarius-Prototyping/Impish_Longtail_12B
parameters:
  max_iter: 100000
  tol: 1e-9
dtype: bfloat16
```
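The `max_iter` and `tol` parameters above bound the fixed-point iteration that computes the Karcher (Riemannian) mean. As a rough illustration of the idea, here is a minimal NumPy sketch of the Karcher mean of unit vectors on a hypersphere; the function, the spherical setting, and the example vectors are illustrative assumptions, not mergekit's actual implementation:

```python
import numpy as np

def karcher_mean(points, max_iter=100000, tol=1e-9):
    """Karcher (Riemannian) mean of unit vectors on the hypersphere.

    Repeatedly averages the log-maps of each point at the current
    estimate, then exp-maps the mean tangent vector back to the sphere.
    Illustrative sketch only, not mergekit's implementation.
    """
    # Normalize inputs onto the unit sphere.
    pts = [p / np.linalg.norm(p) for p in points]
    # Start from the normalized Euclidean mean.
    mu = sum(pts)
    mu /= np.linalg.norm(mu)
    for _ in range(max_iter):
        # Average the tangent vectors (log maps) at the current mean.
        tangent = np.zeros_like(mu)
        for p in pts:
            cos_theta = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(cos_theta)
            if theta > 1e-12:
                # log_mu(p): component of p orthogonal to mu, rescaled
                # to the geodesic angle. (Antipodal points are a
                # degenerate case, ignored in this sketch.)
                v = p - cos_theta * mu
                tangent += theta * v / np.linalg.norm(v)
        tangent /= len(pts)
        step = np.linalg.norm(tangent)
        if step < tol:
            break  # Converged: the mean tangent vector is ~zero.
        # exp map: move along the geodesic in the tangent direction.
        mu = np.cos(step) * mu + np.sin(step) * (tangent / step)
        mu /= np.linalg.norm(mu)  # Guard against numerical drift.
    return mu

# Example: the mean of two orthogonal unit vectors lies halfway
# between them on the great circle.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(karcher_mean([a, b]))  # ~[0.707, 0.707, 0.0]
```

Conceptually, corresponding weight tensors from the two models play the role of the points being averaged; the tight `tol: 1e-9` with a generous `max_iter: 100000` lets the iteration run essentially to convergence, which is consistent with the merge taking a long time.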
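To reproduce the merge, the YAML above can be saved as, e.g., `config.yaml` and passed to mergekit's CLI (`mergekit-yaml config.yaml ./Impish-LongPen-12B`); exact flags may vary across mergekit versions.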