Frankie-F / mergekit_config.yml
slices:
  - sources:
      - model: TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T
        layer_range: [0, 16]
  - sources:
      - model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
        layer_range: [6, 22]
merge_method: passthrough
dtype: float16
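
This config describes a mergekit passthrough merge (a "frankenmerge"): it stacks layers 0-15 of the 2.5T-token TinyLlama checkpoint on top of layers 6-21 of the 3T-token checkpoint, copying the weights through unchanged in float16 and producing a deeper model of roughly 32 decoder layers. Below is a minimal sketch of running this file with mergekit's Python API; the output path and the specific options are illustrative assumptions, and the usual entry point is simply the mergekit-yaml CLI pointed at this config.

# A minimal sketch of running this config programmatically; paths and options
# here are assumptions, not part of the original file. The equivalent CLI call
# would be `mergekit-yaml mergekit_config.yml <output_dir>`.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the passthrough config shown above.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Write the stacked model to a local directory (hypothetical output path).
run_merge(
    merge_config,
    "./merged-tinyllama",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when one is available
        copy_tokenizer=True,             # carry a tokenizer over to the output
    ),
)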