qwerrwe / src/axolotl/utils/trainer.py

Commit History

fix eval steps and strategy (#403)
da10af0
unverified

winglian committed on

Feat(config): add max steps (#387)
3c2ad00
unverified

ittailup committed on

Added "epoch" evaluation_strategy (#388)
5d48a10
unverified

flotos committed on

Feat(config): Add hub_strategy (#386)
73a0b6e
unverified

Nanobit committed on

improve GPU logging to break out pytorch cache and system mem
7b55fe6

tmm1 committed on

Attention mask and position id fixes for packing (#285)
2bb0b78
unverified

winglian committed on

log GPU memory usage
e303d64

tmm1 committed on

fix axolotl training args dataclass annotation
ebaec3c

winglian committed on

Merge branch 'OpenAccess-AI-Collective:main' into logging_enhancement
83237b8
unverified

The Objective Dad committed on

Merge pull request #274 from OpenAccess-AI-Collective/NanoCode012-patch-2
168a7a0
unverified

Nanobit committed on

Feat: Add save_safetensors
5491278

Nanobit committed on

Set push to hub as private by default
1514739
unverified

Nanobit committed on

Merge branch 'main' into quadratic-warmup
c4cf567
unverified

winglian committed on

better configuration for quadratic warmup
c49729d

winglian committed on

Fix future deprecation push_to_hub_model_id
e79c8e6

Nanobit committed on

push intermediate model checkpoints to hub
612aabd

winglian committed on

support adamw and grad norm hyperparams
6d0ee4b

winglian committed on

add axolotl trainer and quadratic warmup
7dc580b

winglian committed on

Merge branch 'main' into flash-optimum
fd2c981
unverified

winglian committed on

Fix set mem_id for inference and refactor
974dc00

Nanobit committed on

fix formatting
958da70

winglian committed on

address PR feedback
0c6f928

winglian committed on

fix bettertransformers save, force it to skip after saving correctly in callback
1a82082

winglian committed on

more tweaks to do pre-training with bettertransformers
1210dc8

winglian committed on

Feat: Add landmark attention
55b8542

Nanobit committed on

Refactor out unmodified save_steps and eval_steps
2ef4634

Nanobit committed on

Set to use cfg.seed or 42 for backward compat
2cfe9e9

Nanobit committed on

fix relative path for fixtures
cfcc549

winglian committed on

Apply isort then black
37293dc

Nanobit committed on

Fix mypy typing
e9650d3

Nanobit committed on

Lint trainer.py
ddb86ea

Nanobit committed on

fix relative path for fixtures
e65aeed

winglian committed on

refactor(param): rename load_4bit config param by gptq
dd00657

Thytu committed on

fixes to make qlora actually work
34c99f9

winglian committed on

apply black formatting
ce34d64

winglian committed on

fix missing fp16 kwarg
2ae936f

winglian committed on

Add qa style data for alpaca instructions, fix one_cycle scheduler
3a50377

winglian committed on

don't need to set here
de6da13

winglian committed on

be able to use adam bnb 8bit and one cycle scheduler w fsdp
9493b1b

winglian committed on

make one cycle lr div factor configurable
99383f1

winglian committed on

Merge branch 'main' into patch-2
89b7f26
unverified

Nanobit committed on

black formatting
2bc1a5b

winglian committed on

various fixes
7a490a4

winglian committed on

Fix Trainer() got multiple values for keyword argument 'callbacks'
813aab3
unverified

Nanobit committed on

Merge pull request #21 from NanoCode012/patch-1
bd3c5a5
unverified

winglian committed on

Update trainer.py
36aaea0
unverified

Nanobit committed on

Fix condition scheduler
5b6690a
unverified

Nanobit committed on

Add callbacks to Trainer
cc77bab

Nanobit committed on

Add callback save peft_model on_save
0d6708b

Nanobit committed on