Fix: ValueError with FA + Mistral when padding_side=right (#681) eb480df unverified Nanobit committed on Oct 5, 2023
Fix: Future deprecation warning with use_auth_token (#680) 69fac9a unverified Nanobit committed on Oct 5, 2023
Fix(tokenizer): Set rstrip,lstrip,norm to False (#678) e0b7eea unverified Nanobit committed on Oct 5, 2023
Fix(version): Update FA to work with Mistral SWA (#673) 43856c0 unverified Nanobit committed on Oct 4, 2023
Feat: Allow usage of native Mistral FA when no sample_packing (#669) 697c50d unverified Nanobit committed on Oct 4, 2023
Feat: Add config yaml to section for reprod in bug-report.yaml (#667) 90e0d67 unverified Nanobit committed on Oct 3, 2023
refactor to set eval_batch_size earlier if unset, so we can warn if mismatched (#662) 2642cae unverified winglian committed on Oct 3, 2023
prepared dataset caching, other misc fixes (#665) e50a64e unverified winglian committed on Oct 3, 2023
make sure we also run CI tests when requirements.txt changes (#663) f4868d7 unverified winglian committed on Oct 2, 2023
don't strip the prompt for check since we don't strip to tokenize anymore (#650) 8662e8f unverified winglian committed on Sep 28, 2023
fix for flash attn w/ mistral w/o sample packing (#648) b2edaae unverified winglian committed on Sep 28, 2023
Fix(cfg): Add validation for save_strategy and eval_strategy (#633) 383f88d unverified Nanobit committed on Sep 28, 2023
skip some flash attn patches unless explicitly enabled (#643) 895f0a0 unverified winglian committed on Sep 27, 2023
eval_table isn't quite stable enough to be in default llama configs (#637) d887ad8 unverified winglian committed on Sep 26, 2023
Added quotes to the pip install -e command to fix an incompatibility with shells like zsh that do glob expansion (#632) 5e5296a unverified Fernando Tarin Morales committed on Sep 25, 2023
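The quoting fix above can be illustrated with a minimal sketch; `.[flash-attn]` is a hypothetical extras spec used only for demonstration, not necessarily this repo's actual extra name:

```shell
# In zsh, an unquoted argument containing [..] is treated as a glob
# pattern, and the command fails with "no matches found" when no file
# matches it. Quoting makes the shell pass the extras spec verbatim:
printf '%s\n' '.[flash-attn]'
# so the quoted install command looks like: pip install -e '.[flash-attn]'
```

Bash leaves an unmatched glob in place by default, which is why the problem often only surfaces for zsh users.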
Merge pull request #629 from OpenAccess-AI-Collective/chore/-change-default-model f3d9390 unverified mhenrichsen committed on Sep 25, 2023
Fix: Fail bf16 check when running on cpu during merge (#631) cfbce02 unverified Nanobit committed on Sep 25, 2023
tweak: improve base builder for smaller layers (#500) 923eb91 unverified Maxime committed on Sep 22, 2023
better handling and logging of empty sharegpt turns (#603) a363604 unverified winglian committed on Sep 22, 2023
chore(callback): Remove old peft saving code (#510) d5f8589 unverified Nanobit committed on Sep 22, 2023
run eval on the first step to get a baseline (#617) 2844eb2 unverified winglian committed on Sep 22, 2023
let MAX_JOBS use the default since we're not resource constrained on our self-hosted runners (#427) e85d2eb unverified winglian committed on Sep 22, 2023
skip the gpu memory checks if the device is set to 'auto' (#609) 196ff11 unverified winglian committed on Sep 21, 2023