Incorrect intermediate_size
#1 opened by CISCai
This fixes the model for llama.cpp at least; untested on transformers.
The field is correct; the conversion script needs to handle moe_intermediate_size and shared_expert_intermediate_size, see ggerganov/llama.cpp#7816 (comment).
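For illustration only, here is a minimal sketch (not the actual llama.cpp conversion code) of pulling the MoE-specific sizes out of the model's config.json; the two field names come from the comment above, everything else (file path, variable names) is assumed:

```python
import json

# Hedged sketch: not the real convert-hf-to-gguf.py logic, just an illustration
# of where the MoE FFN sizes live in this model's config.json.
with open("config.json") as f:
    cfg = json.load(f)

# The per-expert and shared-expert FFN widths are separate fields; the dense
# "intermediate_size" alone does not describe the MoE layers.
moe_ffn = cfg["moe_intermediate_size"]
shared_ffn = cfg["shared_expert_intermediate_size"]
print(moe_ffn, shared_ffn)
```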
CISCai changed pull request status to closed