Commit History
34699d1  Adding `safetensors` variant of this model (#1)
ab77a3c  Upload README.md
476225f  Update README.md (kingabzpro)
d036b29  Update README.md (kingabzpro)
32ccf9b  Update tokenizer_config.json (kingabzpro)
8ca2db9  Update README.md (kingabzpro)
31e7005  Test LM Results (kingabzpro)
2490fe6  Tokens and LM (kingabzpro)
abab422  New Model (kingabzpro)
87ff56f  update model card README.md (kingabzpro)
cfde571  End of training (kingabzpro)
07c408e  Training in progress, step 2400 (kingabzpro)
aeaefc4  Training in progress, step 2000 (kingabzpro)
c3e46e3  Training in progress, step 1600 (kingabzpro)
e0b50ad  Training in progress, step 1200 (kingabzpro)
db0005c  Training in progress, step 800 (kingabzpro)
f4829dc  Training in progress, step 400 (kingabzpro)
f94783e  add tokenizer (kingabzpro)
c37121e  Training in progress, step 200 (kingabzpro)
12da867  Training in progress, step 150 (kingabzpro)
8dab743  Training in progress, step 100 (kingabzpro)
1042cde  Training in progress, step 50 (kingabzpro)
e5a46eb  add tokenizer (kingabzpro)
3fec20b  Training in progress, step 100 (kingabzpro)
4368349  Training in progress, step 50 (kingabzpro)
4459c58  add tokenizer (kingabzpro)
c396a11  Update README.md (kingabzpro)
76358ba  lm-boosted decoder (kingabzpro)
ad105b1  Update README.md (kingabzpro)
3650a61  update model card README.md (kingabzpro)
560a92b  Training in progress, step 1200 (kingabzpro)
8e22929  Training in progress, step 1100 (kingabzpro)
2075e30  Training in progress, step 1000 (kingabzpro)
7cb4d0c  Training in progress, step 900 (kingabzpro)
858e785  Training in progress, step 800 (kingabzpro)
45fa6b2  Training in progress, step 700 (kingabzpro)
e634ca5  Training in progress, step 600 (kingabzpro)
cb6ace4  Training in progress, step 500 (kingabzpro)
b24c8ad  Training in progress, step 400 (kingabzpro)
17ea26e  Training in progress, step 300 (kingabzpro)
4922f4d  Training in progress, step 200 (kingabzpro)
3365ce4  Training in progress, step 100 (kingabzpro)
f3d6ccf  add tokenizer (kingabzpro)
6c33974  Training in progress, step 150 (kingabzpro)
1ed5429  Training in progress, step 100 (kingabzpro)
c41948f  Training in progress, step 50 (kingabzpro)
8b60104  add tokenizer (kingabzpro)