---
license: apache-2.0
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65ba55ce95228fc8439c9bd0/qraOMjTwWXjB0nQeIOJlp.png)

LoRA fine-tuning of Poro-34B on publicly available S-group data. The dataset contains 185 question-and-answer pairs. Training was done on Amazon SageMaker, but the same setup works on any other public cloud.

More detailed instructions are available here: https://medium.com/@timo.au.laine/poro-34bs-lora-fine-tuning-with-publicly-available-s-group-data-70af60e0b21c

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65ba55ce95228fc8439c9bd0/VgK3MdI1jsAvQGgXFSaW2.png)
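The fine-tune uses LoRA, which freezes the pretrained weights and trains only a small low-rank additive update. A minimal numpy sketch of that mechanism (toy dimensions and names, not the actual Poro-34B configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for illustration only.
d_out, d_in, r, alpha = 64, 128, 8, 16

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized

def lora_forward(x):
    """Forward pass: frozen base plus the scaled low-rank update B @ A."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# Because B starts at zero, the LoRA branch contributes nothing initially,
# so the adapted model reproduces the base model's outputs exactly.
assert np.allclose(lora_forward(x), W @ x)
```

During training only `A` and `B` receive gradients, which is why a 34B-parameter model can be adapted on modest hardware such as a single SageMaker GPU instance.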