RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
#19 opened 10 months ago by zayuki
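This dtype mismatch usually means one attention input ended up in half precision while the others stayed in float32. A minimal sketch of the common workaround, casting the tensors to a single dtype before the attention call; the shapes and the use of torch.nn.functional.scaled_dot_product_attention here are illustrative assumptions, not details taken from the thread:

```python
# Minimal sketch: align query/key/value dtypes before attention.
# Assumption: the error comes from scaled_dot_product_attention receiving
# a float16 (c10::Half) value tensor next to float32 query/key tensors.
import torch
import torch.nn.functional as F

query = torch.randn(1, 8, 16, 64, dtype=torch.float32)
key   = torch.randn(1, 8, 16, 64, dtype=torch.float32)
value = torch.randn(1, 8, 16, 64, dtype=torch.float16)  # the mismatched half tensor

# Cast everything to one dtype; in practice this is often handled instead by
# loading the whole model with a single torch_dtype (e.g. torch.float16).
value = value.to(query.dtype)
out = F.scaled_dot_product_attention(query, key, value)
print(out.dtype)  # torch.float32
```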
The model 'RWGPTQForCausalLM' is not supported for text-generation.
#18 opened 12 months ago by herMaster
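The "is not supported for text-generation" line is the transformers pipeline registry warning; generation generally still works when the pipeline is built from an already-loaded model and tokenizer rather than a model id string. A minimal sketch, assuming the quantized checkpoint is loaded through auto-gptq; the repo id below is a hypothetical placeholder, not a reference to this model page:

```python
# Minimal sketch, assuming an AutoGPTQ-loaded checkpoint; the repo id is a
# hypothetical placeholder.
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer, pipeline

repo_id = "someuser/falcon-gptq-example"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoGPTQForCausalLM.from_quantized(repo_id, device="cuda:0", trust_remote_code=True)

# Passing the model object (rather than a string) lets pipeline() run it even
# though 'RWGPTQForCausalLM' is missing from its registry; the "not supported"
# message is a warning, not an error.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("Hello, my name is", max_new_tokens=20)[0]["generated_text"])
```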
Model not working on CPU
#17 opened about 1 year ago by vivek0797
ValueError: Unrecognized configuration class
1 reply · #14 opened over 1 year ago by hfgdfdsd
Can't use with tgi. Getting `RuntimeError: weight transformer.h.0.self_attention.query_key_value.weight does not exist`
1 reply · #12 opened over 1 year ago by mpronesti
Integration into the transformers pipeline
5 replies · #10 opened over 1 year ago by clementdesroches
Custom 4-bit finetuning: 5-7 times faster inference than QLoRA
2 replies · #5 opened over 1 year ago by rmihaylov
Getting 0 tokens when running with text-generation-webui
6 replies · #4 opened over 1 year ago by avatar8875
CUDA extension not installed
3 replies · #3 opened over 1 year ago by kllisre
Do you know anything about this error?
5 replies · #2 opened over 1 year ago by RedXeol