Model Details
Model Description
- Uses shenzhi-wang/Mistral-7B-v0.3-Chinese-Chat as the base model and fine-tunes it via unsloth on the dataset referenced below, which makes the model uncensored.
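A minimal sketch of how such an unsloth fine-tune is typically set up (4-bit loading plus LoRA adapters); the hyperparameters below are illustrative assumptions, and the exact configuration is in the training notebook linked under Training Procedure Raw Files.

from unsloth import FastLanguageModel

# Load the Chinese-Chat base model in 4-bit for memory-efficient fine-tuning
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="shenzhi-wang/Mistral-7B-v0.3-Chinese-Chat",
    max_seq_length=2048,      # assumed training context length
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained
model = FastLanguageModel.get_peft_model(
    model,
    r=16,                     # assumed LoRA rank
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

Training itself would then typically run through trl's SFTTrainer on the dataset, as in standard unsloth notebooks; see the notebook for the actual steps.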
Training Code
Training Procedure Raw Files
All training was performed on Vast.ai.
Hardware on Vast.ai:
GPU: 1x A100 SXM4 80GB
CPU: AMD EPYC 7513 32-Core Processor
RAM: 129 GB
Disk Space to Allocate: > 150 GB
Docker Image: pytorch/pytorch:2.2.0-cuda12.1-cudnn8-devel
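A quick check, once inside the Docker image, that the instance exposes the GPU as expected (the expected outputs are assumptions based on the specs above):

import torch

print(torch.__version__)              # 2.2.0 per the Docker image tag
print(torch.cuda.is_available())      # should be True on this instance
print(torch.cuda.get_device_name(0))  # should report the A100 SXM4 80GB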
Download the ipynb file.
Training Data
Base Model
Dataset
Usage
from transformers import pipeline

# Load the model as a chat (text-generation) pipeline
chat = pipeline("text-generation", model="stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored")
messages = [{"role": "user", "content": "How to make girlfriend laugh? Please answer in Chinese."}]
print(chat(messages, max_new_tokens=256)[0]["generated_text"][-1]["content"])
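Alternatively, a sketch that calls the model directly through the tokenizer's chat template; the generation parameters (max_new_tokens, temperature) are illustrative assumptions:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "How to make girlfriend laugh? Please answer in Chinese."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens (the assistant reply)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))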
Model tree for stephenlzc/Mistral-7B-v0.3-Chinese-Chat-uncensored
- Base model: mistralai/Mistral-7B-v0.3
- Finetuned: mistralai/Mistral-7B-Instruct-v0.3