simplify by removing duplicate base_model_config (#772) 2d8def6 winglian committed on Oct 23, 2023
Fix: Warn when fullfinetune without adapter (#770) 44c9d01 Nanobit committed on Oct 22, 2023
convert exponential notation lr to floats (#771) ca84cca winglian committed on Oct 22, 2023
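Context for #771: PyYAML follows YAML 1.1, which resolves a bare exponential literal like 1e-5 as a string rather than a float (a dot and a signed exponent are required, as in 1.0e-5), so a learning rate written that way has to be cast before it reaches the optimizer. A minimal sketch of the idea; normalize_lr is a hypothetical helper, not axolotl's actual function:

    import yaml

    def normalize_lr(cfg: dict) -> dict:
        lr = cfg.get("learning_rate")
        if isinstance(lr, str):
            cfg["learning_rate"] = float(lr)  # raises ValueError if not numeric
        return cfg

    cfg = yaml.safe_load("learning_rate: 1e-5\n")
    assert isinstance(cfg["learning_rate"], str)  # the YAML 1.1 quirk
    cfg = normalize_lr(cfg)
    assert cfg["learning_rate"] == 1e-5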
Fix: eval table conflict with eval_sample_packing (#769) 9923b72 Nanobit committed on Oct 22, 2023
Feat: Allow usage of native Mistral FA when no sample_packing (#669) 697c50d Nanobit committed on Oct 4, 2023
Fix(cfg): Add validation for save_strategy and eval_strategy (#633) 383f88d Nanobit committed on Sep 28, 2023
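The validation in #633 guards against step-based settings paired with a non-step strategy, a combination the HF Trainer would otherwise accept silently or fail on later. A sketch of that kind of check, assuming a plain dict config and the usual Trainer field names; it mirrors the idea, not axolotl's exact code:

    def validate_strategies(cfg: dict) -> None:
        # save_steps only applies when checkpoints are saved on a step schedule
        if cfg.get("save_steps") and cfg.get("save_strategy", "steps") != "steps":
            raise ValueError("save_steps is set but save_strategy is not 'steps'")
        # same constraint for evaluation
        if cfg.get("eval_steps") and cfg.get("evaluation_strategy", "steps") != "steps":
            raise ValueError("eval_steps is set but evaluation_strategy is not 'steps'")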
Fix: Fail bf16 check when running on cpu during merge (#631) cfbce02 Nanobit committed on Sep 25, 2023
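The point of #631 is that an adapter merge can run on a CPU-only machine, where probing CUDA for bf16 support is meaningless; the check should simply fail there instead of erroring out. A minimal sketch of the guard, assuming torch is the only dependency:

    import torch

    def bf16_supported() -> bool:
        # CPU-only run (e.g. merging adapter weights): don't probe CUDA at all
        if not torch.cuda.is_available():
            return False
        return torch.cuda.is_bf16_supported()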
better handling and logging of empty sharegpt turns (#603) a363604 winglian committed on Sep 22, 2023
Fix pretraining with iterable/streaming Dataset (#556) 2f586d1 Jan Philipp Harries committed on Sep 13, 2023
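Background for #556: with streaming=True, the datasets library returns an IterableDataset that has no __len__, so any epoch-length arithmetic in the training loop breaks and the run has to be driven by max_steps instead. A minimal reproduction of the behavior (the dataset name is just an example):

    from datasets import load_dataset

    ds = load_dataset("allenai/c4", "en", split="train", streaming=True)
    # len(ds) raises TypeError: an IterableDataset has no length
    for i, example in enumerate(ds):
        if i >= 2:
            break
        print(example["text"][:80])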
recommend padding when using sample packing (#531) 3437149 winglian committed on Sep 6, 2023
fix test fixture b/c hf trainer tokenization changed (#464) d5dcf9c winglian committed on Aug 23, 2023
fix fixture for new tokenizer handling in transformers (#428) 8cace80 winglian committed on Aug 17, 2023
Attention mask and position id fixes for packing (#285) 2bb0b78 winglian committed on Aug 12, 2023
experimental llama 2 chat support (#296) 3392270 Jan Philipp Harries committed on Aug 6, 2023
update prompts for open orca to match the paper (#317) 3d4984b winglian committed on Jul 22, 2023
Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var b1f4f7a theobjectivedad committed on Jul 15, 2023
Merge pull request #214 from OpenAccess-AI-Collective/fix-tokenizing-labels 1925eaf winglian committed on Jun 15, 2023
Update doc for grad_accu and add validation tests for batch size 3c71c8d Nanobit committed on May 31, 2023
fix packing so that concatenated sequences reset the attention 9b8585d winglian committed on May 31, 2023
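Both packing commits (#285 and the last entry above) revolve around the same invariant: when several training sequences are packed into one row, position ids must restart at zero at every sequence boundary, and the attention mask must keep tokens from attending across boundaries. A sketch of the position-id half, purely illustrative rather than axolotl's implementation:

    import torch

    def packed_position_ids(seq_lens: list[int]) -> torch.Tensor:
        # each packed sequence gets positions counted from 0 again
        return torch.cat([torch.arange(n) for n in seq_lens])

    print(packed_position_ids([3, 2]))  # tensor([0, 1, 2, 0, 1])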