Dovakiins / qwerrwe · src / axolotl / monkeypatch · revision 3678a6c
100 contributors · History: 48 commits
Latest commit: remove landmark attn and xpos rope implementations (#1010) · winglian (unverified) · 70b46ca · 11 months ago
mixtral · Mixtral official (#942) · 12 months ago
btlm_attn_hijack_flash.py · 2.32 kB · flash_attention + sample packing for stablelm 3b (#671) · about 1 year ago
fastchat_conversation_turns.py · 8.04 kB · fix mistral prompt assembly (#982) · 12 months ago
llama_attn_hijack_flash.py · 27.1 kB · adds llama and mistral dropout support (#858) · about 1 year ago
llama_attn_hijack_sdp.py · 4.81 kB · various bugfixes (#856) · about 1 year ago
llama_attn_hijack_xformers.py · 5.69 kB · various bugfixes (#856) · about 1 year ago
llama_expand_mask.py · 1.92 kB · Attention mask and position id fixes for packing (#285) · over 1 year ago
mistral_attn_hijack_flash.py · 22.4 kB · adds llama and mistral dropout support (#858) · about 1 year ago
relora.py · 14 kB · fix checkpints on multigpu (#481) · over 1 year ago
stablelm_attn_hijack_flash.py · 15.4 kB · flash_attention + sample packing for stablelm 3b (#671) · about 1 year ago
utils.py · 4.2 kB · Implement fused modules (#747) · about 1 year ago