Instructions to use sesame/csm-1b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use sesame/csm-1b with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-to-speech", model="sesame/csm-1b")
```

```python
# Load model directly
from transformers import AutoFeatureExtractor, AutoModelForTextToWaveform

extractor = AutoFeatureExtractor.from_pretrained("sesame/csm-1b")
model = AutoModelForTextToWaveform.from_pretrained("sesame/csm-1b")
```

- Notebooks
- Google Colab
- Kaggle
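The text-to-speech pipeline above typically returns a dict of the form `{"audio": float array, "sampling_rate": int}` (an assumption about the pipeline's output format; verify against your installed transformers version). A minimal sketch of saving such a result to a WAV file, using a synthetic sine wave in place of real model output so the snippet runs without downloading the (gated) model:

```python
import wave

import numpy as np


def save_tts_output(result: dict, path: str) -> None:
    """Write a TTS pipeline result to a 16-bit mono WAV file.

    Assumes the usual transformers TTS output shape:
    {"audio": float array in [-1, 1], "sampling_rate": int}.
    """
    audio = np.asarray(result["audio"], dtype=np.float32).squeeze()
    # Convert floats in [-1, 1] to 16-bit signed PCM samples.
    pcm = (np.clip(audio, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)       # mono
        f.setsampwidth(2)       # 2 bytes = 16-bit samples
        f.setframerate(result["sampling_rate"])
        f.writeframes(pcm.tobytes())


# Stand-in for `pipe("Hello from CSM")`: one second of a 440 Hz tone.
sr = 24000
t = np.linspace(0, 1.0, sr, endpoint=False)
fake_result = {"audio": 0.3 * np.sin(2 * np.pi * 440 * t), "sampling_rate": sr}
save_tts_output(fake_result, "out.wav")
```

With the real pipeline you would pass `pipe("some text")` to `save_tts_output` instead of the stand-in dict.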
🚩 Report: Illegal or restricted content
#34
by SaiKamal007 - opened
Incorrect license terms. It uses a Community-licensed product, Llama, as its backbone, yet calls itself open source, which is misleading.
Meta cannot license the Llama architecture; that's just a name for a form factor that is now an aggregation of logic from across the open-source world and is itself changing over time.
Their license may apply to the pretrained weights of their released models, but even then, probably not in any enforceable way.