# merged
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) as the base model.
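At a high level, DARE randomly drops a fraction of each fine-tuned model's parameter delta relative to the base and rescales the survivors, after which TIES resolves sign conflicts and sums the weighted deltas. A minimal sketch of the DARE drop-and-rescale step (the `dare_delta` helper is illustrative only, not mergekit's internal API; `density` corresponds to the values in the configuration below):

```python
import torch

def dare_delta(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Drop each delta entry with probability (1 - density), then rescale
    the survivors by 1/density so the expected delta stays unchanged."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

# Hypothetical usage: sparsify the delta between a fine-tune and the base
# before the TIES-style sign-consensus merge combines the models.
base = torch.randn(4096, 4096)
finetuned = base + 0.01 * torch.randn(4096, 4096)
sparse_delta = dare_delta(finetuned - base, density=0.53)
```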
### Models Merged
The following models were included in the merge:

* [samir-fama/SamirGPT-v1](https://huggingface.co/samir-fama/SamirGPT-v1)
* [abacusai/Slerp-CM-mist-dpo](https://huggingface.co/abacusai/Slerp-CM-mist-dpo)
* [EmbeddedLLM/Mistral-7B-Merge-14-v0.2](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.2)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: mistralai/Mistral-7B-v0.1
dtype: bfloat16
merge_method: dare_ties
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 32]
            model: mistralai/Mistral-7B-v0.1
          - layer_range: [0, 32]
            model: samir-fama/SamirGPT-v1
            parameters:
              density: 0.53
              weight: 0.4
          - layer_range: [0, 32]
            model: abacusai/Slerp-CM-mist-dpo
            parameters:
              density: 0.53
              weight: 0.3
          - layer_range: [0, 32]
            model: EmbeddedLLM/Mistral-7B-Merge-14-v0.2
            parameters:
              density: 0.53
              weight: 0.3
parameters:
  int8_mask: 1.0
```
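The merge can be reproduced by saving this configuration to a file and passing it to mergekit's `mergekit-yaml` command. The resulting model loads like any other Mistral-architecture causal LM; a minimal loading sketch using the standard transformers API (the repo id is taken from this card, everything else is generic usage):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ntegrals/NeuralMerge-9B-Dare"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # requires the accelerate package
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```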