Trained on 100k dumped messages from the 'chan' todd proxy. I could not dedupe the dataset, but it has had a serious effect on the llama-7b I used. Calls me master a whole bunch more now.
Content isn't SFW, so be aware. Trained in 4-bit for 3 epochs; I think it overfit and really only needed 2.
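The card doesn't include the training script, so here is a minimal QLoRA-style sketch of what "4-bit, 3 epochs" on llama-7b could look like with transformers + peft. The base repo id, rank, target modules, and other hyperparameters are assumptions, not the actual values used.

```python
# Hypothetical training setup sketch; only "4-bit" and "3 epochs" come from the card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base = "huggyllama/llama-7b"  # assumption: any plain HF llama-7b checkpoint

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(
    base, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=8,                     # V2 used a "higher rank"; exact values aren't stated
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
# ...then train for 3 epochs with your usual Trainer loop (2 probably would have been enough).
```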
Tested in 4-bit and FP16 on plain HF llama-7b; it may also work on derivative models of the same base.
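For reference, a hedged loading example, assuming this is shipped as a PEFT/LoRA adapter; the base checkpoint id and the adapter repo id are placeholders. Swap the quantization config for `torch_dtype=torch.float16` to test in FP16 instead of 4-bit.

```python
# Sketch of loading the adapter on plain HF llama-7b in 4-bit; repo ids are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base = "huggyllama/llama-7b"     # assumption: plain HF llama-7b
adapter = "your-name/this-adapter"  # placeholder for this repo

model = AutoModelForCausalLM.from_pretrained(
    base,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",
)
model = PeftModel.from_pretrained(model, adapter)
tokenizer = AutoTokenizer.from_pretrained(base)

inputs = tokenizer("Hello there.", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```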
The V2 version was trained at a higher rank and a longer context (512) on only unique data, with ALLM and "content warning" statements removed. It is much stronger.
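A rough sketch of the kind of cleanup described for V2 (exact dedupe plus dropping ALLM / "content warning" boilerplate). The phrase list, file names, and JSONL layout are assumptions about the dump format, not the actual pipeline.

```python
# Hypothetical V2-style data cleanup: keep only unique messages and drop boilerplate.
import json

DROP_PHRASES = (
    "as a large language model",   # assumed ALLM-style refusal text
    "as an ai language model",
    "content warning",
)

seen, kept = set(), []
with open("chan_proxy_dump.jsonl") as f:           # assumed: one JSON message per line
    for line in f:
        text = json.loads(line)["text"]
        key = text.strip().lower()
        if key in seen:                            # exact duplicate
            continue
        if any(p in key for p in DROP_PHRASES):    # refusal / warning boilerplate
            continue
        seen.add(key)
        kept.append(text)

with open("chan_proxy_v2.jsonl", "w") as f:
    for text in kept:
        f.write(json.dumps({"text": text}) + "\n")
```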