CTranslate2 int8 version of L3-8B-Stheno-v3.1
This is an int8_bfloat16 quantization of L3-8B-Stheno-v3.1.
See more on CTranslate2: Docs | GitHub
This model was converted to the ct2 format using the following command:
ct2-transformers-converter --model Sao10K/L3-8B-Stheno-v3.1 --output_dir L3-8B-Stheno-v3.1-ct2 --quantization int8_bfloat16 --low_cpu_mem_usage
No conversion is needed when using the model from this repository, as it is already in ct2 format.
Model tree for Anthonyg5005/L3-8B-Stheno-v3.1-int8-ct2
- Base model: Sao10K/L3-8B-Stheno-v3.1