Space: Dovakiins/qwerrwe (revision 32580c1)
Status: Build error
Path: qwerrwe/tests
History: 96 commits, 100 contributors
Latest commit: winglian, "set fp16 to false if bf16, update bf16: auto in example YAMLs" (#1122) [skip ci], 782b6a4 (unverified), 10 months ago
Directories:
  core                 add gptneox embeddings, fix phi2 inputs, also fix the casting (#1083), 11 months ago
  e2e                  Multipack simplify for Mixtral (#1142), 11 months ago
  fixtures             Respect sequence_len in config for `type: llama2_chat` (#926), 12 months ago
  monkeypatch          Multipack simplify for Mixtral (#1142), 11 months ago
  prompt_strategies    fix: `train_on_inputs: true` ignored for sharegpt (#1045) [skip ci], 11 months ago
  utils                Add shifted sparse attention (#973) [skip-ci], 11 months ago

Files:
  test_data.py (2.23 kB)               Fix pretraining with iterable/streaming Dataset (#556), about 1 year ago
  test_dict.py (3.17 kB)               fix DefaultDict.__or__, over 1 year ago
  test_expand_mask.py (1.43 kB)        Attention mask and position id fixes for packing (#285), over 1 year ago
  test_normalize_config.py (3 kB)      set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci], 10 months ago
  test_packed_dataset.py (2.28 kB)     Attention mask and position id fixes for packing (#285), over 1 year ago
  test_packed_pretraining.py (2.4 kB)  streaming multipack for pretraining dataset (#959), 11 months ago
  test_prompt_tokenizers.py (16.3 kB)  fix mistral prompt assembly (#982), 12 months ago
  test_prompters.py (4.39 kB)          Attention mask and position id fixes for packing (#285), over 1 year ago
  test_tokenizers.py (1.91 kB)         Feat: Warns to add to modules_to_save when adding tokens or switching special_tokens (#787), 11 months ago
  test_validation.py (22.4 kB)         Deprecate max packed sequence len (#1141), 11 months ago
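The latest commit message ("set fp16 to false if bf16, update bf16: auto in example YAMLs"), covered by test_normalize_config.py, describes a config-normalization rule: bf16 and fp16 mixed precision are mutually exclusive, so when bf16 is on (or "auto" resolves to on) fp16 must be forced off. A minimal sketch of that precedence rule, using an illustrative function name rather than the project's actual API:

```python
def normalize_precision(cfg: dict, bf16_supported: bool) -> dict:
    """Resolve bf16 "auto" and enforce bf16/fp16 mutual exclusion.

    Illustrative sketch only; `normalize_precision` and its signature
    are assumptions, not the repository's real helper.
    """
    cfg = dict(cfg)  # do not mutate the caller's config
    # "auto" means: enable bf16 only if the hardware supports it.
    if cfg.get("bf16") == "auto":
        cfg["bf16"] = bf16_supported
    # bf16 takes precedence; fp16 is forced off to avoid a conflict.
    if cfg.get("bf16"):
        cfg["fp16"] = False
    return cfg
```

With `bf16: auto` on bf16-capable hardware this yields `bf16: True, fp16: False`; on hardware without bf16 support, an explicit `fp16: True` is left untouched.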