---
base_model:
  - Writer/Palmyra-X-004
model-index:
  - name: Palmyra-X-4.3
    results: []
license: other
license_name: writer-open-model-license
license_link: https://writer.com/legal/open-model-license/
extra_gated_prompt: >-
  By clicking "Agree", you agree to the [License
  Agreement](https://writer.com/legal/open-model-license/) and acknowledge
  Writer's [Privacy Policy](https://writer.com/legal/acceptable-use/).
extra_gated_fields:
  Name: text
  Email: text
  Organization or Affiliation: text
  Receive email updates and promotions on Writer products, services, and research?:
    type: select
    options:
      - 'Yes'
      - 'No'
  I acknowledge that this model is for non-commercial use only unless I acquire a separate license from Writer: checkbox
language:
  - en
---

# Palmyra-X-4.3

## Use with transformers

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Writer/Palmyra-X-4.3-73B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    attn_implementation="flash_attention_2",
    trust_remote_code=True,
)

messages = [
    {
        "role": "user",
        "content": "Write a blog post about strangelets",
    },
]

# Build the prompt with the model's chat template and move it to the
# device holding the first model shard.
input_ids = tokenizer.apply_chat_template(
    messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

gen_conf = {
    "max_new_tokens": 256,
    "eos_token_id": tokenizer.eos_token_id,
    "do_sample": True,  # required for temperature/top_p to take effect
    "temperature": 0.7,
    "top_p": 0.9,
}

with torch.inference_mode():
    output_id = model.generate(input_ids, **gen_conf)

# Decode only the newly generated tokens, skipping the prompt.
output_text = tokenizer.decode(
    output_id[0][input_ids.shape[1]:], skip_special_tokens=True
)

print(output_text)
```
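Note that `device_map="auto"` is doing real work here: in float16 the weights alone take roughly 2 bytes per parameter, so a model of this size does not fit on a single common GPU and is sharded across available devices. A rough back-of-the-envelope check (the 73B figure is taken from the model name; the exact parameter count may differ slightly):

```python
# Approximate fp16 memory footprint of the weights alone
# (excludes activations and the KV cache, which add more at inference time).
# NOTE: 73e9 is inferred from the "73B" in the model name, not an exact count.
PARAMS = 73e9
BYTES_PER_PARAM = 2  # float16

weight_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"~{weight_gb:.0f} GB for weights alone")  # ~146 GB
```

This is why multi-GPU setups (or quantized loading) are typically needed to run the model locally.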