ZeroXClem committed on
Commit f9934d6 • 1 Parent(s): 24ea6c2

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -17,14 +17,14 @@ pipeline_tag: text-generation
  library_name: transformers
  ---

- # ZeroXClem/L3.1-Aspire-Heart-Matrix-8B
+ # ZeroXClem/L3-Aspire-Heart-Matrix-8B

- **ZeroXClem/L3.1-Aspire-Heart-Matrix-8B** is an experimental language model crafted by merging three high-quality 8B parameter models using the **Model Stock Merge** method. This synthesis leverages the unique strengths of Aspire, Heart Stolen, and CursedMatrix, creating a highly versatile and robust language model for a wide array of tasks.
+ **ZeroXClem/L3-Aspire-Heart-Matrix-8B** is an experimental language model crafted by merging three high-quality 8B parameter models using the **Model Stock Merge** method. This synthesis leverages the unique strengths of Aspire, Heart Stolen, and CursedMatrix, creating a highly versatile and robust language model for a wide array of tasks.


  ## 🌟 Model Details

- - **Name:** `ZeroXClem/L3.1-Aspire-Heart-Matrix-8B`
+ - **Name:** `ZeroXClem/L3-Aspire-Heart-Matrix-8B`
  - **Base Model:** `Khetterman/CursedMatrix-8B-v9`
  - **Merge Method:** `Model Stock`
  - **Parameter Count:** `8 billion`
@@ -94,7 +94,7 @@ This model is compatible with popular inference frameworks, including:
  ```python
  from transformers import AutoTokenizer, AutoModelForCausalLM

- model_name = "ZeroXClem/Qwen2.5-7B-Qandora-CySec"
+ model_name = "ZeroXClem/L3-Aspire-Heart-Matrix-8B"
  tokenizer = AutoTokenizer.from_pretrained(model_name)
  model = AutoModelForCausalLM.from_pretrained(model_name)
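
The second hunk ends right after the model is loaded. For context, a minimal usage sketch with the repo ID as renamed in this commit is shown below; it relies only on the standard `transformers` generation API, and the prompt text and generation parameters are illustrative assumptions, not part of the commit.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Repo ID as renamed in this commit.
model_name = "ZeroXClem/L3-Aspire-Heart-Matrix-8B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Illustrative only: the prompt and generation settings below are assumptions.
prompt = "Briefly explain what a Model Stock merge is."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```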