Respect sequence_len in config for `type: llama2_chat` (#926) f1de29d hamel committed on Dec 12, 2023
better handling and logging of empty sharegpt turns (#603) a363604 winglian committed on Sep 22, 2023
fix test fixture b/c hf trainer tokenization changed (#464) d5dcf9c winglian committed on Aug 23, 2023
fix fixture for new tokenizer handling in transformers (#428) 8cace80 winglian committed on Aug 17, 2023
experimental llama 2 chat support (#296) 3392270 Jan Philipp Harries committed on Aug 6, 2023
fix packing so that concatenated sequences reset the attention 9b8585d winglian committed on May 31, 2023