A self-trained microscopic Mistral model with around 810M parameters.

The tokenizer is the one from https://huggingface.co/mistralai/Mistral-7B-v0.1.

It is being trained on roughly 400B tokens; this checkpoint is from step 120k.

Evaluation is currently in progress.
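For loading a checkpoint like this one, a minimal usage sketch with the Hugging Face `transformers` library follows, assuming the repository id is DrNicefellow/Microscopic-Mistral-120k-steps and that standard causal-LM loading applies:

```python
# Minimal sketch: load the checkpoint and generate text with transformers.
# Assumes the repository id below and a standard causal-LM head.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "DrNicefellow/Microscopic-Mistral-120k-steps"


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Tokenize a prompt, run generation, and decode the result."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Once upon a time"))
```

Since the tokenizer is the Mistral-7B-v0.1 one, the vocabulary and special tokens match that model; only the weights differ.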

License

This model is available under the Apache 2.0 License.

Discord Server

Join our Discord server here.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink.
