~~Full model card soon. Early release;~~
Spherical Hexa-Merge of hand-picked Mistral-7B models.
This is the successor to Naberius-7B, building on its findings.
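No merge recipe is published here, but "Spherical" in this context usually refers to spherical linear interpolation (SLERP) of model weights. The sketch below is an assumption about that underlying operation, shown for a single pair of weight tensors only; it is not the actual six-model recipe used for this merge.

```python
import torch

def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors."""
    a = w_a.flatten().float()
    b = w_b.flatten().float()
    # Angle between the two weight vectors on the hypersphere.
    cos_theta = torch.clamp(torch.dot(a, b) / (a.norm() * b.norm() + eps), -1.0, 1.0)
    theta = torch.acos(cos_theta)
    if theta.abs() < 1e-4:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        merged = (torch.sin((1.0 - t) * theta) * a + torch.sin(t * theta) * b) / torch.sin(theta)
    return merged.view_as(w_a).to(w_a.dtype)
```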
[11 Dec 2023 UPDATE] The original compute resources for the experiment are inaccessible. Long story;
https://huggingface.co/CalderaAI/Hexoteric-7B/discussions/2#6576d3e5412ee701851fd567
The Stanford Alpaca format works best for instruct-style test driving of this enigma.
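A minimal sketch of Alpaca-format prompting with this model, assuming the standard transformers text-generation workflow; the example instruction is purely illustrative, and `device_map="auto"` assumes accelerate is installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CalderaAI/Hexoteric-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # requires accelerate

# Standard Stanford Alpaca instruction template.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what a model merge is in one sentence.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```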