Enable flash_attention_2 support since the underlying Mistral model supports it (#3)

No description provided.
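Since the PR carries no description, here is a minimal sketch of what the change enables: once a checkpoint advertises FlashAttention-2 support, a recent `transformers` release can load it with the `attn_implementation` flag. This assumes a CUDA GPU with the `flash-attn` package installed; the repo id below is a placeholder, not the actual model this PR targets.

```python
# Sketch: loading a Mistral-based model with FlashAttention-2 enabled.
# Assumes a CUDA GPU, flash-attn installed, and transformers >= 4.36.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-mistral-model"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,              # flash-attn requires fp16/bf16
    attn_implementation="flash_attention_2",  # the option this PR enables
    device_map="auto",
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```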
lievan changed pull request status to merged
