don't strip the prompt for the check, since we no longer strip when tokenizing (#650) 8662e8f winglian committed on Sep 28, 2023
fix for flash attn w/ mistral w/o sample packing (#648) b2edaae winglian committed on Sep 28, 2023
Fix(cfg): Add validation for save_strategy and eval_strategy (#633) 383f88d Nanobit committed on Sep 28, 2023
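For context, a minimal sketch of the kind of config this validation covers; the key names follow axolotl's YAML conventions, but the exact rule enforced by #633 is an assumption here:

```yaml
# hypothetical axolotl config fragment; the precise combinations
# rejected by the #633 validation are assumed, not confirmed
save_strategy: steps        # when checkpoints are written
save_steps: 500
evaluation_strategy: steps  # when evaluation runs
eval_steps: 500
```

The point of such a check is to fail fast at config load time, rather than letting the underlying HF Trainer error out mid-run on inconsistent strategies.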
skip some flash attn patches unless explicitly enabled (#643) 895f0a0 winglian committed on Sep 27, 2023
Fix: Fail bf16 check when running on CPU during merge (#631) cfbce02 Nanobit committed on Sep 25, 2023
better handling and logging of empty sharegpt turns (#603) a363604 winglian committed on Sep 22, 2023
chore(callback): Remove old peft saving code (#510) d5f8589 Nanobit committed on Sep 22, 2023
run eval on the first step to get a baseline (#617) 2844eb2 winglian committed on Sep 22, 2023
skip the GPU memory checks if the device is set to 'auto' (#609) 196ff11 winglian committed on Sep 21, 2023
improve handling of empty text during the tokenization step (#502) 1eebbd0 winglian committed on Sep 19, 2023
btlm and falcon monkey patches for flash attn (#566) 6b9b229 winglian committed on Sep 17, 2023
Feat(data): Allow loading local csv and text files (#594) 00dce35 Nanobit committed on Sep 17, 2023
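A local CSV source of the kind #594 enables might be declared as follows; this is a sketch, and the `ds_type` and `type` field names are assumed from axolotl's dataset-config conventions:

```yaml
# hypothetical dataset entry pointing at a local file instead of a hub dataset
datasets:
  - path: ./data/train.csv   # local csv on disk
    ds_type: csv             # assumed key selecting the csv loader
    type: completion
```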
gather/broadcast the max value of the packing efficiency automatically (#463) b15b19e winglian committed on Sep 17, 2023
optionally configure sample packing for evals (#589) 21ec195 winglian committed on Sep 16, 2023
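A sketch of what opting out of packing for evals might look like; `sample_packing` is an established axolotl option, while the eval-specific flag name is an assumption based on #589:

```yaml
sample_packing: true        # pack multiple short examples into each training sequence
eval_sample_packing: false  # assumed flag: keep eval batches unpacked for comparable metrics
```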
set fsdp state dict (#584) be75668 Jan Philipp Harries committed on Sep 15, 2023
don't resize embeddings if they're already large enough (#577) 3607882 winglian committed on Sep 15, 2023
support custom field for completion from yml (#580) f7a2263 winglian committed on Sep 15, 2023
prevent cli functions from firing on import (#581) 8dcd40a winglian committed on Sep 15, 2023
refactor scripts/finetune.py into new cli modules (#550) 861ceca winglian, Nanobit committed on Sep 15, 2023
remove columns after tokenizing for pretraining (#571) 1157950 winglian committed on Sep 14, 2023
fix save_steps so it doesn't get duplicated (#567) 3fbde76 winglian committed on Sep 14, 2023