Distil GPT
This is a small version of Generative Pre-trained Transformer 2 (GPT-2), pretrained on a 10 GB Pakistani legal corpus to generate legal text. It was developed by AI Systems using causal language modelling.
Reference:
This model was originally derived from DistilGPT2, developed by Hugging Face (https://huggingface.co/distilgpt2).
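As with other causal language models on the Hugging Face Hub, this model can be loaded and sampled with the transformers library. The sketch below is a minimal, illustrative example: the repository id of this fine-tuned model is not stated on the card, so the base model id `distilgpt2` is used as a stand-in, and the prompt is a hypothetical legal phrase.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Stand-in id: substitute the actual repository id of this fine-tuned model.
model_id = "distilgpt2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative legal-style prompt (not from the card).
prompt = "The petitioner submits that"
inputs = tokenizer(prompt, return_tensors="pt")

# Causal language modelling: each next token is predicted from the tokens
# to its left, so generation proceeds left to right from the prompt.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```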