Efficiently get the length of the tokenized docs (#1063) · 81d3845 · ricdomolm, winglian · committed on Jan 8, 2024
streaming multipack for pretraining dataset (#959) · 553c80f · jinwonkim93, winglian · committed on Jan 6, 2024
feat: always push checkpoint to hub if set (#1049) [skip ci] · cbdbf9e · Nanobit · committed on Jan 5, 2024
feature: better device mapping for large models (#918) · bdfefaf · dg-kalle (Karl-Johan Alm), winglian · committed on Jan 5, 2024
Added chatglm3 conversation type for training models like TinyLLama (#1036) · 59b2d30 · xaviviro · committed on Jan 4, 2024
bump transformers and update attention class map name (#1023) · bcc78d8 · winglian · committed on Jan 3, 2024
[Docs] delete unused cfg value `lora_out_dir` (#1029) · a3e8783 · hamel, Nanobit · committed on Jan 3, 2024
chore(readme): update instruction to set config to load from cache (#1030) · b31038a · Nanobit · committed on Jan 3, 2024
use recommended setting for use_reentrant w gradient checkpointing (#1021) · 4d2e842 · winglian · committed on Jan 2, 2024
Fix: bf16 support for inference (#981) · 3678a6c · Tazik Shahjahan, winglian · committed on Dec 29, 2023
[WandB] Push axolotl config to top level wandb files (#1014) · 4f4d638 · hamel · committed on Dec 29, 2023
feat: remove need to add load_in* during merge (#1017) · f6ecf14 · Nanobit · committed on Dec 29, 2023
[Docs] Nit: Remind people to auth to wandb if they are going to use it (#1013) · dec66d7 · hamel · committed on Dec 29, 2023
remove landmark attn and xpos rope implementations (#1010) · 70b46ca · winglian · committed on Dec 28, 2023
Set eval_sample_packing to false in mistral config.yaml (#1003) · 384b817 · Kevin Sydney · committed on Dec 28, 2023
FEAT: add tagging support to axolotl (#1004) · db9094d · Younes Belkada, winglian · committed on Dec 27, 2023
Add an example config for finetuning a 34B model on a 24GB GPU (#1000) · 6ef46f8 · Evan Griffiths · committed on Dec 25, 2023
set output_router_logits for mixtral config (#995) · 628b754 · winglian · committed on Dec 22, 2023
Feat: Warns to add to modules_to_save when adding tokens or switching special_tokens (#787) · 1ffa386 · Nanobit · committed on Dec 22, 2023
update transformers to fix checkpoint saving (#963) · f28e755 · dumpmemory · committed on Dec 16, 2023
fix: switch to using the HuggingFace Transformers NEFT implementation (#941) · ef24342 · dg-kalle · committed on Dec 13, 2023