---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: unsloth/mistral-7b-bnb-4bit
---

# Uploaded model

- **Developed by:** priamai
- **License:** apache-2.0
- **Finetuned from model:** unsloth/mistral-7b-bnb-4bit

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

Training run summary:

- Num GPUs = 1
- Num epochs = 2
- Batch size per device = 2
- Gradient accumulation steps = 2
- Total batch size = 8
- Total steps = 2
- Number of trainable parameters = 41,943,040
- Total samples = 800
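As a plausibility check (the card does not state the adapter configuration), the trainable-parameter count above is exactly what a rank-16 LoRA adapter applied to all seven linear projections of Mistral-7B (q/k/v/o plus gate/up/down) would produce; the rank and target modules here are assumptions, not values taken from this card:

```python
# Sketch: reproducing the "41,943,040 trainable parameters" figure under
# an ASSUMED config: rank-16 LoRA on all seven Mistral-7B linear projections.

HIDDEN = 4096   # Mistral-7B hidden size
KV_DIM = 1024   # 8 KV heads x 128 head dim (grouped-query attention)
FFN = 14336     # Mistral-7B MLP intermediate size
LAYERS = 32
RANK = 16       # assumed LoRA rank

# A LoRA adapter on a (d_in x d_out) linear layer adds r * (d_in + d_out) params.
projections = [
    (HIDDEN, HIDDEN),  # q_proj
    (HIDDEN, KV_DIM),  # k_proj
    (HIDDEN, KV_DIM),  # v_proj
    (HIDDEN, HIDDEN),  # o_proj
    (HIDDEN, FFN),     # gate_proj
    (HIDDEN, FFN),     # up_proj
    (FFN, HIDDEN),     # down_proj
]

trainable = LAYERS * sum(RANK * (d_in + d_out) for d_in, d_out in projections)
print(trainable)  # 41943040, matching the count reported above
```

This is only a consistency check on the reported number, not a description of how the model was actually configured.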

Source of reports: [ORKL](https://orkl.eu/)

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)