---
language:
  - en
license: mit
datasets:
  - Salesforce/wikitext
---

# custom_gpt2

This is a custom implementation of GPT-2 in which the attention module is replaced with our own implementation. The softmax is not yet replaced; in future updates we plan to swap the softmax function inside attention for other softmax variants.
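A minimal sketch of how an attention module can be swapped out in the Hugging Face GPT-2 implementation (this is an illustration, not this repo's actual code; `CustomAttention` is a hypothetical placeholder):

```python
from transformers import GPT2Config, GPT2Model
from transformers.models.gpt2.modeling_gpt2 import GPT2Attention
import torch


class CustomAttention(GPT2Attention):
    """Hypothetical drop-in replacement: override methods here to
    implement a custom attention (or softmax) variant. Inheriting from
    GPT2Attention keeps the forward interface compatible."""
    pass


# Tiny config so the example runs without downloading pretrained weights;
# with the real model you would use GPT2Model.from_pretrained(...) instead.
config = GPT2Config(n_layer=2, n_head=2, n_embd=64)
model = GPT2Model(config)

# Replace the attention module in every transformer block.
for block in model.h:
    block.attn = CustomAttention(config)

# The patched model still runs a normal forward pass.
input_ids = torch.randint(0, config.vocab_size, (1, 8))
hidden = model(input_ids).last_hidden_state  # shape: (1, 8, 64)
```

The same loop works on a pretrained checkpoint; newly constructed attention modules would then need their weights copied or retrained, which is where fine-tuning comes in.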

We build directly on the Hugging Face GPT-2 model: https://huggingface.co/openai-community/gpt2

This model was fine-tuned on the WikiText-2 dataset: https://paperswithcode.com/dataset/wikitext-2

Base model: Hugging Face GPT-2 (openai-community/gpt2)
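As one concrete illustration of the kind of softmax variant mentioned above (not necessarily the variant this repo will adopt), here is a sketch of "softmax-one" (sometimes called quiet softmax), which adds 1 to the denominator so an attention head can assign near-zero weight everywhere:

```python
import torch


def softmax1(scores: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """Softmax variant with an extra +1 in the denominator:
    softmax1(x)_i = exp(x_i) / (1 + sum_j exp(x_j)).
    Equivalent to standard softmax over the scores padded with one
    implicit zero logit."""
    # Subtract the max for numerical stability; clamp at 0 so the
    # implicit zero logit is covered by the same shift.
    m = scores.amax(dim=dim, keepdim=True).clamp(min=0)
    e = torch.exp(scores - m)
    return e / (e.sum(dim=dim, keepdim=True) + torch.exp(-m))


# All logits strongly negative: the head attends to (almost) nothing,
# whereas standard softmax would be forced to sum to 1.
weak = softmax1(torch.tensor([[-10.0, -10.0, -10.0]]))

# With a dominant logit, softmax1 behaves like standard softmax.
strong = softmax1(torch.tensor([[5.0, 0.0]]))
```

Dropping such a function into the attention forward pass in place of `torch.softmax` is the kind of swap the custom attention module is intended to make easy.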