flash attn pip install (#426) cf66547 by mhenrichsen (Mads Henrichsen), committed by winglian on Aug 18, 2023
Attention mask and position id fixes for packing (#285) 2bb0b78 by winglian, committed on Aug 12, 2023
Merge pull request #355 from tmm1/bitsandbytes-fixes 35c8b90 by tmm1, committed on Aug 11, 2023
latest HEAD of accelerate causes 0 loss immediately w FSDP (#321) 9f69c4d by winglian, committed on Jul 24, 2023
update docker to compile latest bnb to properly support qlora 312b8d5 by winglian, committed on May 27, 2023
quickstart instructions for starting from runpod (#5) 0a472e1 by winglian, committed on Apr 18, 2023
config chooser, update readme instructions, device config, llama flash attention, debug out the labels, fix config key checks, other bugfixes f2a2029 by winglian, committed on Apr 14, 2023