---
license: apache-2.0
language:
  - en
---

# StripedHyena-Hessian-7B (SH-7B)

## Model Architecture

StripedHyena is a hybrid architecture that combines multi-head, grouped-query attention with gated convolutions arranged in Hyena blocks, departing from traditional decoder-only Transformers.
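A minimal PyTorch sketch of such a hybrid layout, assuming a simple alternation of gated-convolution blocks and attention blocks with pre-norm residuals. The block ratio, gating, and dimensions here are illustrative simplifications, not the released StripedHyena implementation:

```python
import torch
import torch.nn as nn

class GatedConvBlock(nn.Module):
    """Illustrative gated-convolution (Hyena-style) block: a depthwise causal
    convolution modulated by an elementwise sigmoid gate. Hypothetical
    simplification of the real Hyena operator."""
    def __init__(self, dim, kernel_size=4):
        super().__init__()
        self.in_proj = nn.Linear(dim, 2 * dim)
        # Left-pad then trim to keep the convolution causal.
        self.conv = nn.Conv1d(dim, dim, kernel_size, groups=dim,
                              padding=kernel_size - 1)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, seq, dim)
        u, gate = self.in_proj(x).chunk(2, dim=-1)
        u = self.conv(u.transpose(1, 2))[..., : x.shape[1]].transpose(1, 2)
        return self.out_proj(torch.sigmoid(gate) * u)

class HybridModel(nn.Module):
    """Alternate gated-convolution and attention blocks; the 1:1 interleaving
    is an assumption for illustration."""
    def __init__(self, dim=64, n_heads=4, n_pairs=2):
        super().__init__()
        layers = []
        for _ in range(n_pairs):
            layers.append(GatedConvBlock(dim))
            layers.append(nn.MultiheadAttention(dim, n_heads, batch_first=True))
        self.layers = nn.ModuleList(layers)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        for layer in self.layers:
            h = self.norm(x)
            if isinstance(layer, nn.MultiheadAttention):
                h, _ = layer(h, h, h, need_weights=False)
            else:
                h = layer(h)
            x = x + h  # residual connection
        return x

x = torch.randn(2, 16, 64)
y = HybridModel()(x)
print(y.shape)  # torch.Size([2, 16, 64])
```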

  • Constant-memory decoding in Hyena blocks via representation of convolutions as state-space models (modal or canonical form), or as truncated filters.
  • Lower latency when preprocessing long prompts.
  • Improvements to training and inference compute-optimal scaling laws, compared to Transformers.
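The constant-memory decoding property comes from the fact that a long convolution whose filter is generated by a state-space model can be evaluated as a recurrence: stepping a fixed-size hidden state per token instead of re-convolving over the whole history. A small NumPy sketch, using a toy diagonal ("modal form") SSM with made-up parameters, shows the two views agree:

```python
import numpy as np

rng = np.random.default_rng(0)
d_state, seq_len = 4, 32

# Toy diagonal state-space parameters (hypothetical values; in the model
# these define learned convolution filters).
A = rng.uniform(0.1, 0.9, d_state)   # diagonal state-transition matrix
B = rng.standard_normal(d_state)
C = rng.standard_normal(d_state)
x = rng.standard_normal(seq_len)

# View 1: the SSM defines a convolution filter k[t] = C @ diag(A)^t @ B,
# applied causally over the whole input.
k = np.array([(C * A**t * B).sum() for t in range(seq_len)])
y_conv = np.array([sum(k[j] * x[t - j] for j in range(t + 1))
                   for t in range(seq_len)])

# View 2: step the recurrence h_t = A*h_{t-1} + B*x_t, y_t = C @ h_t.
# Memory is O(d_state) per step, independent of how many tokens were decoded.
h = np.zeros(d_state)
y_rec = np.empty(seq_len)
for t in range(seq_len):
    h = A * h + B * x[t]
    y_rec[t] = C @ h

print(np.allclose(y_conv, y_rec))  # True
```

Because the recurrent view carries only the fixed-size state `h`, decoding cost per token does not grow with sequence length, in contrast to the growing key/value cache of attention.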