The models have been hacked together because their base weights share a similar architecture. For now, though, the Pythia inference code only generates gibberish, and the MPT-based inference code fails with errors before producing any output.

I'm currently trying to adapt the "MPT-7B StoryWriter 65k"-based inference code to work with this new model merge. I'd appreciate tips if anyone tries their hand at it.
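For reference, here is a minimal sketch of the loading path I'm experimenting with. It assumes the merge keeps MPT-style custom modeling code (hence `trust_remote_code=True`) and the GPT-NeoX tokenizer that MPT models use; the `model_id` below is a placeholder, not the actual repo name.

```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual path of this merge.
model_id = "path/to/this-model-merge"

# MPT-style checkpoints ship custom modeling code, so trust_remote_code is required.
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# MPT uses the GPT-NeoX tokenizer; assuming the Pythia-derived merge does too.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Whether this produces coherent text rather than gibberish depends on the merge itself, not just the loading code.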
This model is not functional as is.