---
language:
- en
tags:
- NLP
license: mit
datasets:
- TristanBehrens/bach_garland_2024-100K
base_model: None
---
# bach_garland_mamba - A Mamba Model
![Trained with Helibrunna](banner.jpg)
Trained with [Helibrunna](https://github.com/AI-Guru/helibrunna) by [Dr. Tristan Behrens](https://de.linkedin.com/in/dr-tristan-behrens-734967a2).
## Configuration
```
training:
  model_name: bach_garland_mamba
  batch_size: 28
  lr: 0.001
  lr_warmup_steps: 1428
  lr_decay_until_steps: 14285
  lr_decay_factor: 0.001
  weight_decay: 0.1
  amp_precision: bfloat16
  weight_precision: float32
  enable_mixed_precision: true
  num_epochs: 8
  output_dir: output/bach_garland_mamba
  save_every_step: 500
  log_every_step: 10
  wandb_project: bach_garland
  torch_compile: false
model:
  type: mamba
  d_model: 64
  n_layers: 4
  context_length: 4096
  vocab_size: 178
dataset:
  hugging_face_id: TristanBehrens/bach_garland_2024-100K
tokenizer:
  type: whitespace
  fill_token: '[EOS]'
```
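For reference, here is a minimal sketch of how the configuration above could be parsed and how a whitespace tokenizer with an `[EOS]` fill token might behave. This is an illustration of the settings shown in this card, not Helibrunna's actual loading code; the `encode` helper and its padding behavior are assumptions.

```
# Minimal sketch (not Helibrunna's actual code): parse the model card's
# YAML configuration and emulate a whitespace tokenizer that fills with
# an [EOS] token, per the tokenizer section above.
import yaml  # pip install pyyaml

config_text = """
model:
  type: mamba
  d_model: 64
  n_layers: 4
  context_length: 4096
  vocab_size: 178
tokenizer:
  type: whitespace
  fill_token: '[EOS]'
"""

config = yaml.safe_load(config_text)
assert config["model"]["type"] == "mamba"
assert config["tokenizer"]["type"] == "whitespace"


def encode(text: str, vocab: dict, eos_id: int, context_length: int) -> list:
    """Split on whitespace, map tokens to ids, and fill with [EOS]."""
    ids = [vocab.get(token, eos_id) for token in text.split()]
    ids = ids[:context_length]
    # Pad the remainder of the context window with the [EOS] id, which is
    # one plausible reading of the `fill_token` setting (an assumption).
    return ids + [eos_id] * (context_length - len(ids))
```

Padding every sequence to the full `context_length` is only one possible interpretation of `fill_token`; the training code may instead insert a single `[EOS]` between concatenated sequences.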