---
license: mit
language:
- en
base_model:
- meta-llama/Llama-3.1-70B-Instruct
tags:
- axolotl
datasets:
- NarrativAI/CakrawalaRP
---
# 🎭 Cakrawala-Llama-3.1-70B

> *Where Worlds Converge and Adventures Begin!*

## 🌟 What's Special About This Model?

Cakrawala-Llama-3.1-70B is a fine-tuned variant of Llama-3.1-70B-Instruct, optimised for roleplaying conversations and character interactions. It is trained to produce detailed, contextually appropriate dialogue with vivid descriptions of physical actions, expressions, and emotional states, while maintaining consistent character voices and perspectives over extended interactions.
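
A minimal way to try it with 🤗 Transformers is sketched below. The repo id is assumed from this card's title (adjust it to the actual hosted path), and a 70B model in bf16 needs multiple GPUs; `device_map="auto"` shards it across whatever is available.

```python
# Minimal inference sketch (repo id and the example messages are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NarrativAI/Cakrawala-Llama-3.1-70B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are Kara, a sardonic sky-pirate captain. Stay in character."},
    {"role": "user", "content": "*steps onto the deck* Permission to come aboard, Captain?"},
]

# Build the prompt with the Llama 3.1 chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=300, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```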

## 🧪 The Secret Sauce

### Training Diet:
- Fed with 13,000 conversation pairs
- Each conversation is at least 12-13 turns long
- Focused heavily on details such as facial expressions, environmental descriptions, and character reactions, with a strong emphasis on **keeping the model in character** (see the sketch below)
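
As a rough illustration of the turn-count criterion above, a filter over the dataset might look like this. The column name `conversations` and the per-turn structure are assumptions, not taken from the dataset card; check NarrativAI/CakrawalaRP for the real schema.

```python
# Hypothetical sketch of a minimum-turn filter (schema is assumed).
from datasets import load_dataset

ds = load_dataset("NarrativAI/CakrawalaRP", split="train")

def long_enough(example, min_turns=12):
    # Keep only conversations with at least ~12 turns.
    return len(example["conversations"]) >= min_turns

ds = ds.filter(long_enough)
print(f"{len(ds)} conversations kept")
```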

### Tech Wizardry:
- Trained on Llama-3.1-70B-Instruct
- Fine-tuned using QLoRA 
- Trained over 2 epochs

## Training Parameters
- Gradient Accumulation Steps: 1
- Micro Batch Size: 4
- Learning Rate: 0.0002
- Optimizer: AdamW
- Scheduler: Cosine
- Mixed Precision: BF16 & FP16 with TF32 support
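
The original run used Axolotl (per the tags above). As a non-authoritative sketch, the parameters listed map roughly onto a PEFT/Transformers QLoRA setup like this; the LoRA rank, alpha, dropout, and target modules are illustrative assumptions, not published values.

```python
# Approximate QLoRA setup mirroring the parameters above (LoRA settings assumed).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                       # QLoRA: 4-bit base weights
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-70B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)

lora_config = LoraConfig(
    r=32, lora_alpha=16, lora_dropout=0.05,  # assumed values
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

args = TrainingArguments(
    output_dir="cakrawala-qlora",
    per_device_train_batch_size=4,   # micro batch size
    gradient_accumulation_steps=1,
    learning_rate=2e-4,
    lr_scheduler_type="cosine",
    optim="adamw_torch",
    num_train_epochs=2,
    bf16=True,
    tf32=True,
)
```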

## 🔧 Under the Hood
- Trained on 8 x H100 SXM GPUs

## 🎬 License & Credits

- Licensed under MIT 
- Based on meta-llama/Llama-3.1-70B-Instruct
## GGUF Quants
- https://huggingface.co/mradermacher/Cakrawala-70B-GGUF
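
For local use, one of those quants can be loaded with llama-cpp-python along these lines; the filename, context length, and messages are placeholders, not values from the quant repo.

```python
# Local inference on a GGUF quant via llama-cpp-python (filename is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="Cakrawala-70B.Q4_K_M.gguf",  # pick any quant from the repo above
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers to GPU if available
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a dramatic tavern keeper. Stay in character."},
        {"role": "user", "content": "*pushes open the door* Got a room for the night?"},
    ],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```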

---

*Built with ❤️ for roleplayers, by roleplayers*