This is a variant of the Google Gemma-7B model scaled up to roughly 14B parameters. The number of attention heads has been doubled and the number of hidden layers has been increased to 42.
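A minimal configuration sketch of that scaling, assuming the model keeps Gemma-7B's hidden width, MLP width, and head dimension (those values are assumptions here, not taken from this model's released config.json):

```python
from transformers import GemmaConfig

# Gemma-7B baseline values below are assumed; only the head count and
# layer count are changed as described above.
config = GemmaConfig(
    hidden_size=3072,          # assumed: unchanged from Gemma-7B
    intermediate_size=24576,   # assumed: unchanged from Gemma-7B
    head_dim=256,              # assumed: unchanged from Gemma-7B
    num_attention_heads=32,    # doubled from Gemma-7B's 16
    num_key_value_heads=32,    # assumed to match the doubled query heads
    num_hidden_layers=42,      # increased from Gemma-7B's 28
)
print(config)
```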
Chat template
system: system message...
B: user message...
A: assistant message...
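A hedged sketch of prompt construction under this template. The exact newline layout and any BOS/EOS handling are assumptions; check the tokenizer's built-in chat template for the authoritative format.

```python
def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """Format a conversation with the system/B/A role prefixes shown above.

    `turns` is a list of (user_message, assistant_message) pairs; leave the
    final assistant message empty to prompt the model for a completion.
    """
    parts = [f"system: {system}"]
    for user_msg, assistant_msg in turns:
        parts.append(f"B: {user_msg}")
        parts.append(f"A: {assistant_msg}")
    return "\n".join(parts)

# Example usage
print(build_prompt("You are a helpful assistant.", [("Hello!", "")]))
```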