
# VerwaltungsAnthologie_clear2_7B

This model is intended as an intermediate model for future merges. It is a merge of four pre-trained language models based on Mistral-7B-v0.1, created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with mistralai/Mistral-7B-v0.1 as the base.
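
As a rough, simplified sketch of what DARE TIES does per tensor (illustrative only, not mergekit's actual implementation): each fine-tuned model contributes a delta from the base, which DARE randomly sparsifies at the configured density and rescales, and TIES then elects a per-parameter sign across the weighted deltas and discards disagreeing contributions:

```python
# Simplified sketch of a DARE TIES merge for one tensor. Illustrative only;
# mergekit's real implementation differs in details (normalization, masking).
import torch

def dare_ties(base, tuned_models, densities, weights):
    deltas = []
    for tuned, density, weight in zip(tuned_models, densities, weights):
        delta = tuned - base                     # task vector vs. the base model
        keep = torch.rand_like(delta) < density  # DARE: drop params with prob. 1 - density
        delta = delta * keep / density           # rescale survivors to preserve expectation
        deltas.append(weight * delta)
    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))     # TIES: elect a sign per parameter
    agree = torch.where(torch.sign(stacked) == elected, stacked, torch.zeros_like(stacked))
    return base + agree.sum(dim=0)               # apply only sign-consistent contributions
```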

### Models Merged

The following models were included in the merge:

* VAGOsolutions/SauerkrautLM-7b-LaserChat
* hiig-piai/simba-v01c
* DRXD1000/Phoenix

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# works, but the resulting model never stops generating
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: VAGOsolutions/SauerkrautLM-7b-LaserChat
    parameters:
      density: 0.53
      weight: 0.225
  - model: hiig-piai/simba-v01c
    parameters:
      density: 0.53
      weight: 0.55
  - model: DRXD1000/Phoenix
    parameters:
      density: 0.53
      weight: 0.225
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
name: VerwaltungsAnthologie_clear2_7B
```
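
To reproduce the merge, this configuration can be handed to mergekit, either via the mergekit-yaml command-line tool or in Python. Below is a minimal sketch using mergekit's Python API, assuming the YAML above is saved as config.yml; the paths and option values are examples, and the API may differ between mergekit versions:

```python
# Sketch: run the merge defined in config.yml with mergekit's Python API.
# Assumes config.yml holds the YAML above; paths and options are examples.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./VerwaltungsAnthologie_clear2_7B",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # copy the tokenizer into the output directory
    ),
)
```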

## Inference Examples
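
A minimal sketch of loading the merged model with the transformers library; the prompt is only an illustration. Because the configuration notes that the model "works but never stops", it is prudent to cap the generation length explicitly:

```python
# Sketch: load and query the merged model with Hugging Face transformers.
# The config comment warns that generation may not stop on its own, so
# output length is capped with max_new_tokens.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MarcGrumpyOlejak/VerwaltungsAnthologie_clear2_7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

prompt = "Erkläre kurz den Unterschied zwischen Bund und Ländern."  # example prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```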
