Attention mask and position id fixes for packing (#285) 2bb0b78 winglian committed on Aug 12, 2023
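For context, packing concatenates several training samples into one sequence, so position ids must restart at each sample boundary or RoPE treats the whole pack as a single long document. A minimal sketch, assuming the multipack convention where the attention mask tags each packed sample with a distinct positive integer and 0 marks padding; the helper name `packed_position_ids` is hypothetical, not the actual patch:

```python
import torch

def packed_position_ids(attention_mask: torch.Tensor) -> torch.Tensor:
    """Restart position ids at 0 for every packed sub-sequence.

    Assumes a 1-D mask where each packed sample is tagged with a
    distinct positive integer (1, 2, 3, ...) and 0 marks padding.
    """
    position_ids = torch.zeros_like(attention_mask)
    for seq_id in attention_mask.unique().tolist():
        if seq_id == 0:
            continue  # padding positions keep position id 0
        sel = attention_mask == seq_id
        # positions count up from 0 within this sample only
        position_ids[sel] = torch.arange(int(sel.sum()), device=attention_mask.device)
    return position_ids
```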
Update XFormers Attention Monkeypatch to handle Llama-2 70B (GQA) (#339) 10405b9 ssmi153 committed on Aug 6, 2023
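Llama-2 70B uses grouped-query attention (GQA), where several query heads share one key/value head, so an attention monkeypatch has to expand the KV heads before handing tensors to a kernel that expects equal query and KV head counts. A minimal sketch of that expansion, mirroring the `repeat_kv` pattern used in transformers; this standalone function is illustrative, not the patch itself:

```python
import torch

def repeat_kv(hidden: torch.Tensor, n_rep: int) -> torch.Tensor:
    # For GQA (e.g. Llama-2 70B: 64 query heads, 8 KV heads), repeat
    # each key/value head n_rep times so head counts match.
    batch, num_kv_heads, seq_len, head_dim = hidden.shape
    if n_rep == 1:
        return hidden  # standard multi-head attention, nothing to do
    hidden = hidden[:, :, None, :, :].expand(batch, num_kv_heads, n_rep, seq_len, head_dim)
    return hidden.reshape(batch, num_kv_heads * n_rep, seq_len, head_dim)
```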
fix sdp attention to use the flash/mem-efficient context manager a032c9f winglian committed on Jul 20, 2023
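This commit routes SDP attention through PyTorch's kernel-selection context manager. A minimal sketch using the PyTorch 2.0-era `torch.backends.cuda.sdp_kernel` API that was current at the time; the tensor shapes here are illustrative only:

```python
import torch
import torch.nn.functional as F

# Restrict scaled_dot_product_attention to the flash and
# memory-efficient kernels, disallowing the math fallback.
q = torch.randn(1, 32, 128, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=True
):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```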
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 1076bcb winglian Nanobit committed on May 31, 2023
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 2daa683 winglian Nanobit committed on May 31, 2023
copy xformers attn from ooba since we removed dep on alpaca_lora_4bit 6cb2310 winglian committed on May 31, 2023
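For reference, the xformers attention path ultimately calls `memory_efficient_attention`, which expects a `(batch, seq_len, num_heads, head_dim)` layout rather than the `(batch, heads, seq, dim)` layout LlamaAttention uses internally. A minimal sketch of the call, assuming causal masking; shapes are illustrative:

```python
import torch
import xformers.ops as xops

# xformers layout: (batch, seq_len, num_heads, head_dim)
q = torch.randn(1, 128, 32, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

out = xops.memory_efficient_attention(
    q, k, v, attn_bias=xops.LowerTriangularMask()  # causal attention
)
```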