---
tags:
  - Generative AI
  - text-generation-inference
  - text-generation
  - peft
library_name: transformers
license: apache-2.0
language:
  - tr
  - en
  - es
---

# Aixr

This model was trained by Meforgers for futuristic projects. It is a text-generation model that supports Turkish, English, and Spanish.

- **Installation**: first install the required packages, e.g. `pip install unsloth` (assumed here; it should also pull in `torch` and `transformers`, but adjust to your environment).
- **Usage**: load the model with Unsloth and generate text as shown below.

    from unsloth import FastLanguageModel
    import torch
    
    # Configuration
    max_seq_length = 512
    dtype = torch.float16
    load_in_4bit = True
    
    # Alpaca prompt
    alpaca_prompt = """### Instruction:
    {0}
    
    ### Input:
    {1}
    
    ### Response:
    {2}
    """
    
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="Meforgers/Aixr",
        max_seq_length=max_seq_length,
        dtype=dtype,
        load_in_4bit=load_in_4bit,
    )
    
    FastLanguageModel.for_inference(model)
    
    inputs = tokenizer(
        [
            alpaca_prompt.format(
                "Can you write a basic Python example?",  # instruction (replace with your own prompt)
                "",  # input
                "",  # output - leave this blank for generation!
            )
        ],
        return_tensors="pt"
    ).to("cuda")
    
    outputs = model.generate(**inputs, max_new_tokens=128, use_cache=True)
    print(tokenizer.batch_decode(outputs))
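
For interactive use, you can also stream tokens as they are generated instead of decoding everything at the end. The sketch below is one way to do this, assuming the standard `TextStreamer` helper from `transformers`; it reuses the `model`, `tokenizer`, and `inputs` defined above.

    from transformers import TextStreamer

    # Print tokens to stdout as they are generated, omitting the prompt text
    streamer = TextStreamer(tokenizer, skip_prompt=True)
    _ = model.generate(**inputs, streamer=streamer, max_new_tokens=128, use_cache=True)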