Crazy Reasoning Qwen2.5 7B 🧠

A funny reasoning model, trained on a manually curated, high-quality dataset! Don't use it for real problems.

It has "Secret Thoughts", after which it gives an "Answer".

Officially supports English, Russian, Chinese, French, and Spanish (these languages were in the training dataset), but you can try others!

  • Context Length: 32,768 tokens.
  • Dataset: Custom, with no synthetic data.
  • Prompt Example: "Hello!".
  • System Prompt: You can set your language and other instructions here for better performance.

This model is best used for entertainment purposes (though it would be funny if it actually turned out to be better at other things too). Author: FGOTYT (me).
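
Since the weights ship as GGUF quantizations, the model can be run locally with any llama.cpp-based runtime. Below is a minimal sketch using llama-cpp-python; the filename and system prompt are only illustrative, so substitute whichever quantization and language you actually use.

```python
# Minimal sketch, assuming llama-cpp-python and a locally downloaded GGUF file.
from llama_cpp import Llama

llm = Llama(
    model_path="Crazy_Reasoning_Qwen2.5_7B-Q4_K_M.gguf",  # hypothetical local path
    n_ctx=32768,  # the advertised context length
)

response = llm.create_chat_completion(
    messages=[
        # The system prompt sets the language and any other instructions.
        {"role": "system", "content": "Respond in English."},
        {"role": "user", "content": "Hello!"},  # the prompt example from above
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```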

  • Format: GGUF (4-bit, 5-bit, 6-bit, and 8-bit quantizations).
  • Model size: 7.62B params.
  • Architecture: qwen2.
  • Base model: Qwen/Qwen2.5-7B.