L3.1-Start-10B / mergekit_config.yml
Upload folder using huggingface_hub
d73a532 verified
dtype: bfloat16
merge_method: passthrough
parameters:
  int8_mask: 1.0
slices:
  - sources:
      - layer_range: [0, 42]
        model: merge/chubby10b+loras/Baldur-r128-LoRA
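As a quick sketch (assuming PyYAML is installed), the config above can be loaded and inspected programmatically; the `CONFIG` string reproduces the file's contents:

```python
# Sketch: parse this mergekit config with PyYAML and inspect the merge plan.
import yaml

CONFIG = """
dtype: bfloat16
merge_method: passthrough
parameters:
  int8_mask: 1.0
slices:
  - sources:
      - layer_range: [0, 42]
        model: merge/chubby10b+loras/Baldur-r128-LoRA
"""

cfg = yaml.safe_load(CONFIG)

# A passthrough merge copies layers verbatim from the listed source model;
# this slice takes layers 0 through 41 (42 layers total).
start, end = cfg["slices"][0]["sources"][0]["layer_range"]
print(cfg["merge_method"], end - start)  # passthrough 42
```

In practice a file like this is consumed by mergekit's `mergekit-yaml` CLI (e.g. `mergekit-yaml mergekit_config.yml ./output-model`), which reads the same fields shown here.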