qwerrwe / examples / cerebras

Commit History

Update qlora.yml - remove `max_packed_sequence_len` (#1210) [skip ci]
5407ddd
unverified

7flash committed on

set fp16 to false if bf16, update bf16: auto in example YAMLs (#1122) [skip ci]
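For context, this commit swaps hard-coded precision flags for automatic detection. A minimal sketch of the resulting YAML fragment, with option names taken from the commit message (the surrounding values are assumptions for illustration):

```yaml
# let the framework enable bf16 automatically when the GPU supports it
bf16: auto
# fp16 must be false whenever bf16 is in use; the two are mutually exclusive
fp16: false
```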
782b6a4
unverified

winglian Nanobit committed on

new evals_per_epoch and saves_per_epoch to make things cleaner (#944)
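The commit above replaces step-based scheduling with per-epoch counts. A hedged sketch of how the new options might appear in an example config (option names from the commit message; the values are illustrative assumptions):

```yaml
# evaluate a fixed number of times per epoch instead of every N steps
evals_per_epoch: 4
# write one checkpoint per epoch
saves_per_epoch: 1
```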
5f79b82
unverified

winglian committed on

Feat(wandb): Refactor to be more flexible (#767)
a1da39c
unverified

Nanobit committed on

don't compile deepspeed or bitsandbytes from source (#837)
f544ab2
unverified

winglian committed on

fix eval_steps to be a sane default (#797)
8b79ff0
unverified

winglian committed on

simplify by removing duplicate base_model_config (#772)
2d8def6
unverified

winglian committed on

prepared dataset caching, other misc fixes (#665)
e50a64e
unverified

winglian committed on

btlm and falcon monkey patches for flash attn (#566)
6b9b229
unverified

winglian committed on

Add wandb_entity to wandb options, update example configs, update README (#361)
7019509
unverified

Morgan McGuire winglian committed on

set group_by_length to false in examples
36fefcf

tmm1 committed on

more pruning
effbbf6

winglian committed on