- The calibration dataset is from VatsaDev/worldbuild.
- The measurement file is attached in the `measurement` branch.
- Perplexity:
  - calibration (quantized): 14.6905
  - wikitext-103-v1 (evaluation): 12.7281
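For reference, perplexity is the exponential of the mean per-token negative log-likelihood. A minimal sketch of the computation (the token losses below are made-up illustrative values, not from this model):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood per token, natural log)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Illustrative per-token NLL values only; real evaluations average over a corpus.
losses = [2.7, 2.5, 2.6, 2.4]
print(round(perplexity(losses), 4))
```

Lower is better: a perplexity of N loosely means the model is as uncertain as a uniform choice over N tokens at each step.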
Meidebenne-120b-v1.0
This is a Frankenstein merge of sophosympatheia/Midnight-Rose-70B-v1.0 created using mergekit.
Hence the name of this merged model: 'Meidebenne' is inspired by a deeper-hued variety of the Midnight Rose.
For detailed usage instructions, refer to the original sophosympatheia/Midnight-Rose-70B-v1.0 repository.
Merge Details
Merge Method
This model was merged using the passthrough merge method, with the slice layout copied exactly from nsfwthrowitaway69/Venus-120b-v1.2 and cognitivecomputations/MegaDolphin-120b.
Models Merged
The following models were included in the merge:
- sophosympatheia/Midnight-Rose-70B-v1.0 (self-merged across overlapping layer ranges)
Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: float16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 20]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [10, 30]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [20, 40]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [30, 50]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [40, 60]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [50, 70]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
  - sources:
      - layer_range: [60, 80]
        model: "sophosympatheia/Midnight-Rose-70B-v1.0"
```
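The passthrough method simply stacks the listed layer ranges in order, duplicating layers where the ranges overlap. A quick sketch (plain Python, not mergekit code) of how seven overlapping 20-layer slices of the 80-layer base expand into a 140-layer model:

```python
# Each slice copies a half-open layer range [start, stop) from the 80-layer base.
ranges = [(0, 20), (10, 30), (20, 40), (30, 50), (40, 60), (50, 70), (60, 80)]

# Passthrough stacking: concatenate the slices, keeping duplicated overlaps.
stacked = [layer for start, stop in ranges for layer in range(start, stop)]

print(len(stacked))       # → 140 layers in the merged model
print(len(set(stacked)))  # → 80 distinct base layers used
```

The jump from 80 to 140 decoder layers is what takes the 70B base to roughly 120B parameters, which is where the model's name comes from.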