---
library_name: transformers
tags:
  - factual
  - consistency
  - hallucination
license: apache-2.0
datasets:
  - nkwbtb/SummaCoz
language:
  - en
base_model:
  - nkwbtb/flan-t5-xxl-bf16
pipeline_tag: text2text-generation
---

# SummaCoz LoRA Adapter for flan-t5-xxl

The model classifies the factual consistency of a summary against its source article and generates an explanation for its decision.

## Model Details

This repository provides a LoRA adapter for flan-t5-xxl (base weights: nkwbtb/flan-t5-xxl-bf16), fine-tuned with the nkwbtb/SummaCoz dataset listed in the metadata above.

## Model Usage

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# Load the flan-t5-xxl tokenizer and the fine-tuned SummaCoz model.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")
model = AutoModelForSeq2SeqLM.from_pretrained("nkwbtb/flan-t5-11b-SummaCoz",
                                              torch_dtype="auto",
                                              device_map="auto")
pipe = pipeline("text2text-generation",
                model=model,
                tokenizer=tokenizer)

# NLI-style prompt: the source article is the premise, the summary is the hypothesis.
PROMPT = """Is the hypothesis true based on the premise? Give your explanation afterwards.

Premise: 
{article}

Hypothesis:
{summary}
"""

article = "Goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length."
summary = "Goldfish are being caught weighing up to 8kg and one metre in length."

# Greedy decoding; the model answers Yes/No and then explains its decision.
print(pipe(PROMPT.format(article=article, summary=summary),
           do_sample=False,
           max_new_tokens=512))
# Expected output:
# [{'generated_text': 'No, the hypothesis is not true.
#   - The hypothesis states that goldfish are being caught weighing up to 8kg and one metre in length.
#   - However, the premise states that goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length.
#   - The difference between the two is that the koi carp is weighing 8kg and the goldfish is weighing 2kg.'}]
```
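If you only need a binary consistency label downstream, one simple heuristic is to check whether the generated explanation starts with "Yes" or "No". This is an assumption based on the output format shown above, not an official post-processing rule; `label_from_output` is a hypothetical helper that reuses `pipe`, `PROMPT`, `article`, and `summary` from the snippet above.

```python
def label_from_output(generated_text: str) -> bool:
    """Map the model's free-form answer to a consistency label.

    Heuristic (assumption): explanations begin with "Yes"/"No" as in the
    example above, so a leading "Yes" is treated as factually consistent.
    """
    return generated_text.strip().lower().startswith("yes")

result = pipe(PROMPT.format(article=article, summary=summary),
              do_sample=False,
              max_new_tokens=512)
print(label_from_output(result[0]["generated_text"]))  # False for the example above
```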
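Because this repository is described as a LoRA adapter, it may also be possible to load it on top of the bf16 base model with PEFT. The sketch below assumes the repo ships PEFT-format adapter weights; if it only contains merged weights, use the `AutoModelForSeq2SeqLM` snippet above instead.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel  # assumes PEFT-format adapter files are present in the repo

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")
base = AutoModelForSeq2SeqLM.from_pretrained("nkwbtb/flan-t5-xxl-bf16",
                                             torch_dtype="auto",
                                             device_map="auto")
# Attach the SummaCoz LoRA adapter to the base model.
model = PeftModel.from_pretrained(base, "nkwbtb/flan-t5-11b-SummaCoz")
```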

## Citation

**BibTeX:**

[More Information Needed]