{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "o6RxqIirisTj" }, "source": [ "To run this, press \"Runtime\" and press \"Run all\" on a **free** Tesla T4 Google Colab instance!\n", "
\n", "\n", "To install Unsloth on your own computer, follow the installation instructions on our Github page [here](https://github.com/unslothai/unsloth#installation-instructions---conda).\n", "\n", "You will learn how to do [DPO data prep](#Data), and how to [train via `DPOTrainer`](#Train).\n", "To learn more about DPO, read TRL's [blog post](https://huggingface.co/blog/dpo-trl). We follow [Huggingface's Alignment Handbook](https://github.com/huggingface/alignment-handbook) to replicate [Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta)." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "2eSvM9zX_2d3" }, "outputs": [], "source": [ "%%capture\n", "import torch\n", "major_version, minor_version = torch.cuda.get_device_capability()\n", "if major_version >= 8:\n", " # Use this for new GPUs like Ampere, Hopper GPUs (RTX 30xx, RTX 40xx, A100, H100, L40)\n", " !pip install \"unsloth[colab_ampere] @ git+https://github.com/unslothai/unsloth.git\"\n", "else:\n", " # Use this for older GPUs (V100, Tesla T4, RTX 20xx)\n", " !pip install \"unsloth[colab] @ git+https://github.com/unslothai/unsloth.git\"\n", "pass\n", "\n", "!pip install \"git+https://github.com/huggingface/transformers.git\" # Native 4bit loading works!" ] }, { "cell_type": "markdown", "metadata": { "id": "QYds3fcii6gC" }, "source": [ "* We support Llama, Mistral, CodeLlama, TinyLlama, Vicuna, Open Hermes etc\n", "* And Yi, Qwen ([llamafied](https://huggingface.co/models?sort=trending&search=qwen+llama)), Deepseek, all Llama, Mistral derived archs.\n", "* We support 16bit LoRA or 4bit QLoRA. Both 2x faster.\n", "* `max_seq_length` can be set to anything, since we do automatic RoPE Scaling via [kaiokendev's](https://kaiokendev.github.io/til) method.\n", "* [**NEW**] With [PR 26037](https://github.com/huggingface/transformers/pull/26037), we support downloading 4bit models **4x faster**! [Our repo](https://huggingface.co/unsloth) has Llama, Mistral 4bit models.\n", "* DPO requires a model already trained by SFT on a similar dataset that is used for DPO. We use `HuggingFaceH4/mistral-7b-sft-beta` as the SFT model. Use this [notebook](https://colab.research.google.com/drive/1Dyauq4kTZoLewQ1cApceUQVNcnnNTzg_?usp=sharing) first to train a SFT model." 
] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "E8-BWi7MzkRz", "outputId": "62b1cc0c-a494-4816-aff8-d2cf340a7691" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.10/dist-packages/unsloth/__init__.py:67: UserWarning: CUDA is not linked properly.\n", "We shall run `ldconfig /usr/lib64-nvidia` to try to fix it.\n", " warnings.warn(\n" ] } ], "source": [ "# One must patch the DPO Trainer first!\n", "from unsloth import PatchDPOTrainer\n", "PatchDPOTrainer()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 385, "referenced_widgets": [ "e874eb7fb86d43a5b77a5dfacfa9d8a0", "da4e510666be477c841212f5017b47e0", "25a23224fd504c48bc4a20c6b47acb10", "8cf44fb3cbb745bea86d61189c5665a0", "08eeb449819d47409de21b90da03c70d", "6283d5ae17b544e8b49de72af49c171f", "78c999dc45a44436975ae35a6ba602fc", "c41bb81611974c11aa5bc31d6e649c65", "e2c7f0b1f20b4bda8269dcd30edcf4ac", "768a9a1adf9843c8afadc861016e9dd7", "9dfc6867aba4448fa534b8f74d2da3a1", "9e6b808ad9d243c69b099fffef673dec", "5e26903f37f44ccea54b031989bdee07", "78c354a390c54b0a990f052db95a70e3", "38e738f0e66c46aba21dba8f7eeed727", "ac130598ebcc4ae2a426b62a011775e8", "6b4d1d950cb14c1484ebede77c610e75", "b29322681da7490291118b138ebf28ba", "5dcc7b2a47e6495297d36855724940ed", "2c8c2043ab4c48e997c498fcb3169676", "158c8e46a9b34a7a89154ef484187eb2", "5aba62774bae4e15821623ac71700b70", "5ca45f817d784a8aa6c89969136ee53d", "f3b6a130e10e416ea8b95b80c264028d", "00273f529fc4482b83d90892421b2f38", "23888a5e3a9441f09f6e96943587c92f", "9176ce2d816c4faeb792aa2e95d149ff", "0b48053354ed49788897a6fff135cef6", "ebb8f01d7e834380ada8008f296207c3", "95abfe070a1042ceb459ddd9a719b4ed", "41a7293a45174bf69178d0e88aeb9f89", "42628252c650442d9824b684d8fac22f", "17ef306e2a27469da9086652b3acea14", "ecec31f77eab469aa61ead705557cdcb", "ee4c1dcd246c4818bf60bc6164bfaf3f", "db41489806bc426aa2259070d036e87f", "abf877d764b148d1a539be05a5ffa61a", "79fbf74966144b89b58c53329cf9bc55", "a2f1f703a0994bbe949948de631e8f9e", "aebf71c536714962827e11f17c6a6e16", "bda940f600d34640bb781db24841f944", "4a5a796382234e00a701ac3dcb5009ef", "e6cd6cca3a3e4378b8c28cb2c201ce65", "058a920f8b23400896363b9f799e5428", "abf769643beb4576be0ec06b40c3a39c", "d49c4bbd79764be6a367a529430bb21c", "66fda39ee94640f9bfbf95b277029f5f", "a91b988eba704f3d83b2427008360214", "c6984f94ef2946f8bf97184bb3f6569c", "e8402dfc1a8b41429e8015d0569fd3d2", "3d6b10bc206a4735b36e64d25ef8045e", "33585dca549944f3bae76a720aa2d0e2", "749f2a828c024012bc404f62b3110317", "2a5c7b2997744dafad3cfdd16fe931cf", "938ae07a3b014287a6b0f9d903c61b6b", "e29361925b634f968acb57d488fc849d", "86b2a0cba0f1497897d0cc8a4279cbfb", "cac455d66c424233bfcd8aef411d1180", "fb6a3fcef0e84843bc33bc9b929afbf1", "66859785d81d455e91032048226a01d4", "6e99c507e18d4d23adca6e9a23cf0004", "cdf03b41c0f94373a99fe372a5f13dc3", "eb797a1a19db4e898634d18f19b2056d", "fab1a0ae36644433b756857d36dd52f2", "928e0ff3f174436b817262603cc78b77", "7234655b163a4c3c80c28388a1f072fa", "788d1b6eb4904b94bcfd9e0929720d1f", "5b8655d89c63464c968da114b0077829", "7595ab61957945e192a988916b0fcec9", "5ad12f2b58204d3f8f77d12a4a2c04ec", "ed1bcf79e7f84a499f41b36cbe4f3a69", "319a5cd1e1f144c7b090eb78bc72a969", "71a02ae95f6745bcb68e25291a59cb4d", "9f25fb5cf7de47de8b6dd8c9213ec9e3", "ec2ab7b05c014f3bac6b15c12bde34d8", "ba337b31b13347b9be6e1a43fd0cd8aa", "2995caed8cd7422b9c3fbeacd11839d8" ] }, "id": "QmUBVEnvCDJv", "outputId": 
"0680b8c6-0dcb-447a-b5ce-557cb1b6a85b" }, "outputs": [ { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "e874eb7fb86d43a5b77a5dfacfa9d8a0", "version_major": 2, "version_minor": 0 }, "text/plain": [ "config.json: 0%| | 0.00/1.04k [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stderr", "output_type": "stream", "text": [ "==((====))== Unsloth: Fast Mistral patching release 2024.1\n", " \\\\ /| GPU: Tesla T4. Max memory: 14.748 GB\n", "O^O/ \\_/ \\ CUDA capability = 7.5. Xformers = 0.0.22.post7. FA = False.\n", "\\ / Pytorch version: 2.1.0+cu121. CUDA Toolkit = 12.1\n", " \"-____-\" bfloat16 = FALSE. Platform = Linux\n", "\n", "You passed `quantization_config` to `from_pretrained` but the model you're loading already has a `quantization_config` attribute. The `quantization_config` attribute will be overwritten with the one you passed to `from_pretrained`.\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "9e6b808ad9d243c69b099fffef673dec", "version_major": 2, "version_minor": 0 }, "text/plain": [ "model.safetensors: 0%| | 0.00/4.13G [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "5ca45f817d784a8aa6c89969136ee53d", "version_major": 2, "version_minor": 0 }, "text/plain": [ "generation_config.json: 0%| | 0.00/116 [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ecec31f77eab469aa61ead705557cdcb", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer_config.json: 0%| | 0.00/1.48k [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "abf769643beb4576be0ec06b40c3a39c", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer.model: 0%| | 0.00/493k [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "e29361925b634f968acb57d488fc849d", "version_major": 2, "version_minor": 0 }, "text/plain": [ "tokenizer.json: 0%| | 0.00/1.80M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "788d1b6eb4904b94bcfd9e0929720d1f", "version_major": 2, "version_minor": 0 }, "text/plain": [ "special_tokens_map.json: 0%| | 0.00/624 [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from unsloth import FastLanguageModel\n", "import torch\n", "max_seq_length = 4096 # Choose any! We auto support RoPE Scaling internally!\n", "dtype = None # None for auto detection. Float16 for Tesla T4, V100, Bfloat16 for Ampere+\n", "load_in_4bit = True # Use 4bit quantization to reduce memory usage. 
Can be False.\n", "\n", "model, tokenizer = FastLanguageModel.from_pretrained(\n", " model_name = \"unsloth/zephyr-sft-bnb-4bit\",\n", " max_seq_length = max_seq_length,\n", " dtype = dtype,\n", " load_in_4bit = load_in_4bit,\n", " # token = \"hf_...\", # use one if using gated models like meta-llama/Llama-2-7b-hf\n", ")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "cellView": "form", "id": "AqkY_wHdKyOl" }, "outputs": [], "source": [ "#@title Alignment Handbook utils\n", "import os\n", "import re\n", "from typing import List, Literal, Optional\n", "\n", "from datasets import DatasetDict, concatenate_datasets, load_dataset, load_from_disk\n", "from datasets.builder import DatasetGenerationError\n", "\n", "\n", "DEFAULT_CHAT_TEMPLATE = \"{% for message in messages %}\\n{% if message['role'] == 'user' %}\\n{{ '<|user|>\\n' + message['content'] + eos_token }}\\n{% elif message['role'] == 'system' %}\\n{{ '<|system|>\\n' + message['content'] + eos_token }}\\n{% elif message['role'] == 'assistant' %}\\n{{ '<|assistant|>\\n' + message['content'] + eos_token }}\\n{% endif %}\\n{% if loop.last and add_generation_prompt %}\\n{{ '<|assistant|>' }}\\n{% endif %}\\n{% endfor %}\"\n", "\n", "\n", "def apply_chat_template(\n", " example, tokenizer, task: Literal[\"sft\", \"generation\", \"rm\", \"dpo\"] = \"sft\", assistant_prefix=\"<|assistant|>\\n\"\n", "):\n", " def _strip_prefix(s, pattern):\n", " # Use re.escape to escape any special characters in the pattern\n", " return re.sub(f\"^{re.escape(pattern)}\", \"\", s)\n", "\n", " if task in [\"sft\", \"generation\"]:\n", " messages = example[\"messages\"]\n", " # We add an empty system message if there is none\n", " if messages[0][\"role\"] != \"system\":\n", " messages.insert(0, {\"role\": \"system\", \"content\": \"\"})\n", " example[\"text\"] = tokenizer.apply_chat_template(\n", " messages, tokenize=False, add_generation_prompt=True if task == \"generation\" else False\n", " )\n", " elif task == \"rm\":\n", " if all(k in example.keys() for k in (\"chosen\", \"rejected\")):\n", " chosen_messages = example[\"chosen\"]\n", " rejected_messages = example[\"rejected\"]\n", " # We add an empty system message if there is none\n", " if chosen_messages[0][\"role\"] != \"system\":\n", " chosen_messages.insert(0, {\"role\": \"system\", \"content\": \"\"})\n", " if rejected_messages[0][\"role\"] != \"system\":\n", " rejected_messages.insert(0, {\"role\": \"system\", \"content\": \"\"})\n", " example[\"text_chosen\"] = tokenizer.apply_chat_template(chosen_messages, tokenize=False)\n", " example[\"text_rejected\"] = tokenizer.apply_chat_template(rejected_messages, tokenize=False)\n", " else:\n", " raise ValueError(\n", " f\"Could not format example as dialogue for `rm` task! 
Require `[chosen, rejected]` keys but found {list(example.keys())}\"\n", " )\n", " elif task == \"dpo\":\n", " if all(k in example.keys() for k in (\"chosen\", \"rejected\")):\n", " # Compared to reward modeling, we filter out the prompt, so the text is everything after the last assistant token\n", " prompt_messages = [[msg for msg in example[\"chosen\"] if msg[\"role\"] == \"user\"][0]]\n", " # Insert system message\n", " if example[\"chosen\"][0][\"role\"] != \"system\":\n", " prompt_messages.insert(0, {\"role\": \"system\", \"content\": \"\"})\n", " else:\n", " prompt_messages.insert(0, example[\"chosen\"][0])\n", " # TODO: handle case where chosen/rejected also have system messages\n", " chosen_messages = example[\"chosen\"][1:]\n", " rejected_messages = example[\"rejected\"][1:]\n", " example[\"text_chosen\"] = tokenizer.apply_chat_template(chosen_messages, tokenize=False)\n", " example[\"text_rejected\"] = tokenizer.apply_chat_template(rejected_messages, tokenize=False)\n", " example[\"text_prompt\"] = tokenizer.apply_chat_template(\n", " prompt_messages, tokenize=False, add_generation_prompt=True\n", " )\n", " example[\"text_chosen\"] = _strip_prefix(example[\"text_chosen\"], assistant_prefix)\n", " example[\"text_rejected\"] = _strip_prefix(example[\"text_rejected\"], assistant_prefix)\n", " else:\n", " raise ValueError(\n", " f\"Could not format example as dialogue for `dpo` task! Require `[chosen, rejected]` keys but found {list(example.keys())}\"\n", " )\n", " else:\n", " raise ValueError(\n", " f\"Task {task} not supported, please ensure that the provided task is one of {['sft', 'generation', 'rm', 'dpo']}\"\n", " )\n", " return example\n", "\n", "\n", "def get_datasets(\n", " data_config: dict,\n", " splits: List[str] = [\"train\", \"test\"],\n", " shuffle: bool = True,\n", ") -> DatasetDict:\n", " \"\"\"\n", " Loads one or more datasets with varying training set proportions.\n", "\n", " Args:\n", " data_config (`DataArguments` or `dict`):\n", " Dataset configuration and split proportions.\n", " splits (`List[str]`, *optional*, defaults to `['train', 'test']`):\n", " Dataset splits to load and mix. Assumes the splits exist in all datasets and have a `train_` or `test_` prefix.\n", " shuffle (`bool`, *optional*, defaults to `True`):\n", " Whether to shuffle the training and testing/validation data.\n", "\n", " Returns\n", " [`DatasetDict`]: The dataset dictionary containing the loaded datasets.\n", " \"\"\"\n", "\n", " if type(data_config) is dict:\n", " # Structure of the input is:\n", " # dataset_mixer = {\n", " # \"dataset1\": 0.5,\n", " # \"dataset1\": 0.3,\n", " # \"dataset1\": 0.2,\n", " # }\n", " dataset_mixer = data_config\n", " else:\n", " raise ValueError(f\"Data config {data_config} not recognized.\")\n", "\n", " raw_datasets = mix_datasets(dataset_mixer, splits=splits, shuffle=shuffle)\n", " return raw_datasets\n", "\n", "\n", "def mix_datasets(dataset_mixer: dict, splits: Optional[List[str]] = None, shuffle=True) -> DatasetDict:\n", " \"\"\"\n", " Loads and mixes datasets according to proportions specified in `dataset_mixer`.\n", "\n", " Args:\n", " dataset_mixer (`dict`):\n", " Dictionary containing the dataset names and their training proportions. By default, all test proportions are 1.\n", " splits (Optional[List[str]], *optional*, defaults to `None`):\n", " Dataset splits to load and mix. 
Assumes the splits exist in all datasets and have a `train_` or `test_` prefix.\n", " shuffle (`bool`, *optional*, defaults to `True`):\n", " Whether to shuffle the training and testing/validation data.\n", " \"\"\"\n", " raw_datasets = DatasetDict()\n", " raw_train_datasets = []\n", " raw_val_datasets = []\n", " fracs = []\n", " for ds, frac in dataset_mixer.items():\n", " fracs.append(frac)\n", " for split in splits:\n", " try:\n", " # Try first if dataset on a Hub repo\n", " dataset = load_dataset(ds, split=split)\n", " except DatasetGenerationError:\n", " # If not, check local dataset\n", " dataset = load_from_disk(os.path.join(ds, split))\n", "\n", " if \"train\" in split:\n", " raw_train_datasets.append(dataset)\n", " elif \"test\" in split:\n", " raw_val_datasets.append(dataset)\n", " else:\n", " raise ValueError(f\"Split type {split} not recognized as one of test or train.\")\n", "\n", " if any(frac < 0 for frac in fracs):\n", " raise ValueError(\"Dataset fractions cannot be negative.\")\n", "\n", " if len(raw_train_datasets) > 0:\n", " train_subsets = []\n", " for dataset, frac in zip(raw_train_datasets, fracs):\n", " train_subset = dataset.select(range(int(frac * len(dataset))))\n", " train_subsets.append(train_subset)\n", " if shuffle:\n", " raw_datasets[\"train\"] = concatenate_datasets(train_subsets).shuffle(seed=42)\n", " else:\n", " raw_datasets[\"train\"] = concatenate_datasets(train_subsets)\n", " # No subsampling for test datasets to enable fair comparison across models\n", " if len(raw_val_datasets) > 0:\n", " if shuffle:\n", " raw_datasets[\"test\"] = concatenate_datasets(raw_val_datasets).shuffle(seed=42)\n", " else:\n", " raw_datasets[\"test\"] = concatenate_datasets(raw_val_datasets)\n", "\n", " if len(raw_datasets) == 0:\n", " raise ValueError(\n", " f\"Dataset {dataset_mixer} not recognized with split {split}. Check the dataset has been correctly formatted.\"\n", " )\n", "\n", " return raw_datasets" ] }, { "cell_type": "markdown", "metadata": { "id": "EQ-Cp2V6kDcr" }, "source": [ "\n", "### Data Prep\n", "We follow Huggingface's [Alignment Handbook](https://github.com/huggingface/alignment-handbook) for [Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) and use the [Ultra Feedback dataset](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized), and sample 0.5% of it to speed things up. You can sample the full dataset for a full run." 
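, "\n",
"After `apply_chat_template` with `task = \"dpo\"` (plus the column renaming in the next cell), every example is reduced to three plain-text fields, which is the schema TRL's `DPOTrainer` expects. A shortened sketch of one row, with illustrative placeholders in angle brackets (a real row is printed a few cells further down):\n",
"```python\n",
"{\n",
"    'prompt'  : '<|system|>\\n\\n<|user|>\\n<the question>\\n<|assistant|>\\n',\n",
"    'chosen'  : '<the preferred answer>\\n',\n",
"    'rejected': '<the dispreferred answer>\\n',\n",
"}\n",
"```"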
] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 553, "referenced_widgets": [ "2b6fc1098a5944b0a24b217ae0afbde4", "6001cd0f400f41139dea8da6143e923c", "1df925fce5c84167b621ff0de7d3f5d9", "c168e892023e4bfa9817bb3e1d0ebd26", "922c78ac68b04f41a1994033d56af782", "0c4cb2aff30144229987978b34c2aca0", "9f30c0b8c55940f28157c09c86834504", "4b2e98b797c648089e72abd871e6b579", "33fa33cdcde342ebb5aab39d1831bf92", "21ae70b774094988b25768eb7a7db9c2", "c2d60fb0bb2c4a39a91b259c8caf88d2", "edca49569f0b45a3bda8327171d93d5d", "51f18d9993214dbea5c99b42cbd816bd", "fb834046447043518bbe944173f104de", "a536ba8737f7444d8ddef16ee2f56698", "ff90b7c5478d40c6a36d6021ab0c5708", "714365febd4b4c778cf3a300fd3ee8bc", "960e12d25c4a4e5c8d2f31ccc4d3d041", "e2f2ff4f89664ac4b52609e7ea90f3fd", "2232df7258c940efa25a0f789dc1721e", "6d9a36489c7a453abf1d90822649dacc", "2dfb2e2e7eb142058d6b0ef81b3dab64", "61d8de524f7a4070a5ae811b7c8cc92d", "d9527ff9d69f4268a2195524060a9dfa", "5f40887b20cd444b8fc646da40898c41", "26ac6197edea4476abc948042a9cb058", "8811335900fb4238b06abd62dffe70d7", "e627ac8f64b04b2493fec7a13c7b6eb6", "77560ba252fa4bf5961e6c8dc99bf88f", "3b6cdbd423554b99b27a59d55ae75e8d", "c6ccd287b7ef4aaabb25fe43d445fa85", "85af4c279f4e4efaa54ecc97bbe72e8e", "ca30d3124745449191e711dadce629f3", "68d33530886d4996ab8840db7329969e", "aaafd0ed672241df9406abe0a478ff1f", "192b08ba99484b38b301426837b0dea9", "888f40d51e044d429d2f363c72e34e1b", "366ec8fa7e0840daa0d6c24ac3f7a745", "3d2ad491df294f1581aedaf1aaf665f9", "7c1e77648fbe4807b7249067a0645658", "4e17bbdc4bdd4b34abc83c141607998e", "7831284a678340ca9ce6794ec0fdc1bc", "0010cc35659944efa5cccc638e0aaa0d", "0a578ffc2d8543c7a49fafbf23aff60c", "ca5724ddc9ad4a529aab73c11ee9c878", "1ee9c355b65e4c059ce870557f675e6c", "420dcbfb1c2846f48ace8a2b06db6aa6", "04ba536ded2e407e9d698df696d819d8", "8c0207d0b50740b1b6d8e14e882ff013", "f374fa0c19554feba8ed2c5c62597895", "173d24c86f43418aa81b4ae38531b40b", "e33403356d724022a5e7ffca38b41561", "381f92fc57834262b022b12db52a484d", "7072458c520643a88e827ddc11d933f7", "69c30e7c1ba5498e82b854f9968887d6", "c20e89c756a74c21b3a27c5f5e031639", "877456268c7d435db6d60946f1516cf3", "bbd8659eefb54a32b0c0d0a6b5bb1888", "be2e5c91a80d426cbf88d3eab41a57ba", "329b73fb10564959821b60dc593f1865", "8f392cfe83f847708eb63e3693fe7e18", "402b006eace54404ab744f913befa409", "2b9a6556e59b4f5486074928693b0d70", "940c965958794b3498415def687b7d56", "3617457ccb2b4b359421691d4ed2297d", "6132e04a763449ed9e3710ede1763746", "d00c194a20ac46d49fb48ba9e4d9d9b9", "46e5ba6c322d4a668df5ea498dc29a86", "2de4a1ad087849b6baed62b7456ebcc0", "f85d5e88f5df4870936c5116faac36f4", "71bb73910ab04ab0848f1fc867f67188", "c64378457ce04fb8a4938a42721319d3", "75c0b780a2b24ccc819591e32f31e8e5", "f79ec936179a43f9986e60a7c8d0465f", "70f0c4edf513401c8cf35e0e99f758e1", "0cf684e01fd241b4a5bc419455e89c28", "dc44909c43f24108ab45f2f8a88ecd06", "ac4aff19efa14bed8eddf6a154a58608", "db2c32aae3a6489499986c9f3e5c38aa", "7bcc31d5fc7841108e802379b3cfa493", "ae64a24183d54f07a9a4ed4af89a6b07", "de58d3b4db0b4c8984066acb3a5125e6", "1c22d65275114536b56032173a2b7930", "4568f636fa474d50b9176afe278a3187", "ed109379d3aa40c0ac671100fa431f50", "c478bfd3a39f4a9a80e2c956394f86a4", "51d70c79c11e4df1aa637ae49e75c72f", "09815bac90324b9a9f042997b25370a0", "aab86efcdef44b67b6ec7ef05e325957", "80860cdb41994af6b0cf01e086fab528", "67f03cbaad374162af1ac190c3d5e328", "51b5d31b3e584723afcbab95c0f14bba", "de083abd1c834a819c786d22d2ca8bd0", "61055caa570f4ba9bc10e4e30579236b", 
"c48be899ccea47bd97252fc8292d44ea", "ee0cab232f21483791b3092420cc7eeb", "f26bf44381394b4596dacca76b168626", "7e1e8fda631f4736a97b020582ca0805", "1b4901e1c7c748fc9775af562a802130", "1480acd3b54a4a0597c586c844fc8e26", "737ede300f56426c921c40a1006e9be7", "6015c7e27d344181a4b7ab6034276abf", "0319a89ed480439eb7eb3054ea998fd1", "600bbbbc4c09408090bc816a2bbc353e", "0bc85988e0b0491288c284b63b0f7bb3", "e3c1e82cbb8c4eafbb1e7316d5680305", "097a15dc560644a894880d90c0774997", "2b85e06b09694c63b85bce3b6e21b786", "a561436a825d44eda5fb6beab0124098", "43575bbb406a4b14a29c394b266fc877", "ee3bf2f2a06749fe80eeca6a843231be", "b7f58ec86d01413da208e76d1ffbd0fb", "eb05aaa23a7648709a5b2f346bd57a65", "6a0b0edabcbb454eadba52b2dc2064fd", "05e9c12c3f674e3c82fdac439ee28281", "c240c37be08645b98ba3443ab5d81d1a", "ed6c9658c544462bbc2606040de273c0", "ef5e85eea25f4e0a937703f6b66d2e8c", "fd0138e5fe974d4eb9d49e5fa4016afb", "40d7f6cd4297459f9b403f08b5e8f23a", "d6bdd8d56daa4f7081c4c06f56b19b0e", "4e3e75c8c4e74afd89fc277b2666be89", "e92a58712f504c389f4aac8c2cdf88bf", "f17574eef38b4fdea6fed40892cd6005", "4e66401b894f4c008987b3ac548fa68c", "e0357e4c7836402fa578d5cb22fedb62", "bfda236c86ff4719bbd730e86f7048e4", "ce84943f646b4e2c8ae37ec192e6c6d9", "470b235834304ff785ba0c2732fa20e5", "984f0e2432a8496cb5105a1ffcb69fbd", "b2a4a11ed4234ba7a241dff1b08305af", "9ec2db02b8fa480f9cb6af3e27e188ad", "6aedbc6a2ed34d60978f15d23d7652be", "fba6a9e064494e67876e514f9a69e143", "49eb5549fcd4422987a0c4ceb5b4fc87", "b33adaf1c67840529d95e6dd5ad7208f", "6179d85d36f34b51bbac2fc23925a75c", "e4946c0eae6d4d59bff102f785da0fb5", "d4337b226f6b413d9dc5cb5b62303b03", "b21acebd395c4a0dabb14f205cd08bda", "ed93bfd395cd455bbf18b5eedd3a54c1", "c7db2011fb1747138b9ad321d70b890f", "c63db7d7d51c41c7b6738e1db418a3a0", "08f54e8d74bc49d5bb8992ad1db13376", "68ca7850e8324b378fd1920814b286e4", "73fc005626674235b96315fe02f83dcf", "bbea9fee8c404f35b15d6b8f3c36c2b0", "49a49f4612fa42f1a1b2d116da44b0a1", "c04fa685822d47b4968f14911b3f6ba4", "e7807fad01994997bdf00d4ee3ccdfa8", "9ebb83fdbfa948d5972e261dce47a583", "361d8af4dabe4c56aa259cc87179f885", "a6d3136cc29d40f09ea868f2eea16e3f", "c12a6a9fd8a9441f8b36501b08f4fdb5", "ae22d8d7fd234586b71ebe5310a921ed", "0ecd670fe54c4f6fa73a584505262c60", "c703b47443544e1fb65d9bde2a3e95b5", "8803985e095f45a9b0b66b4ab94883bb", "f3d39586bdb04cf9b4332bae94322f45", "db82b1debf674087b8675cc44c40f83a", "7692d345715b4c0282c04044ef5ec9a1", "f4a731b502714d5e87f5a9e8a4c4a260", "7230a076728248fc92bc9ef99ee7f9b4", "9a8b0c94fc6b44bfa90486816410bda7", "8546e894584949129f5c20c13d7d115f" ] }, "id": "r6bUnxe6N3pf", "outputId": "978ac892-91b0-4131-cab9-8a5d99af7edb" }, "outputs": [ { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "2b6fc1098a5944b0a24b217ae0afbde4", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading readme: 0%| | 0.00/5.98k [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "edca49569f0b45a3bda8327171d93d5d", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/222M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "61d8de524f7a4070a5ae811b7c8cc92d", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/3.50M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": 
"68d33530886d4996ab8840db7329969e", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/180M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ca5724ddc9ad4a529aab73c11ee9c878", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/2.84M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "c20e89c756a74c21b3a27c5f5e031639", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/222M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "d00c194a20ac46d49fb48ba9e4d9d9b9", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/7.12M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ac4aff19efa14bed8eddf6a154a58608", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train_sft split: 0%| | 0/61966 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "aab86efcdef44b67b6ec7ef05e325957", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating test_sft split: 0%| | 0/1000 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "1480acd3b54a4a0597c586c844fc8e26", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train_gen split: 0%| | 0/61966 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ee3bf2f2a06749fe80eeca6a843231be", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating test_gen split: 0%| | 0/1000 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "4e3e75c8c4e74afd89fc277b2666be89", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train_prefs split: 0%| | 0/61966 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "6aedbc6a2ed34d60978f15d23d7652be", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating test_prefs split: 0%| | 0/2000 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "08f54e8d74bc49d5bb8992ad1db13376", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Formatting comparisons with prompt template (num_proc=12): 0%| | 0/309 [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ae22d8d7fd234586b71ebe5310a921ed", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Formatting comparisons with prompt template (num_proc=12): 0%| | 0/2000 [00:00, ? 
examples/s]" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "raw_datasets = get_datasets(\n", " {\"HuggingFaceH4/ultrafeedback_binarized\" : 0.5}, # 0.5% sampled\n", " splits = [\"train_prefs\", \"test_prefs\"],\n", ")\n", "column_names = list(raw_datasets[\"train\"].features)\n", "\n", "raw_datasets = raw_datasets.map(\n", " apply_chat_template,\n", " fn_kwargs = {\"tokenizer\": tokenizer, \"task\": \"dpo\"},\n", " num_proc = 12,\n", " remove_columns = column_names,\n", " desc = \"Formatting comparisons with prompt template\",\n", ")\n", "\n", "# Replace column names with what TRL needs, text_chosen -> chosen and text_rejected -> rejected\n", "for split in [\"train\", \"test\"]:\n", " raw_datasets[split] = raw_datasets[split].rename_columns(\n", " {\"text_prompt\": \"prompt\", \"text_chosen\": \"chosen\", \"text_rejected\": \"rejected\"}\n", " )" ] }, { "cell_type": "markdown", "metadata": { "id": "7AxUmeAGkjDd" }, "source": [ "We shall print a random item from the dataset" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "oF63zQqNlNJC", "outputId": "928a60d3-1b8f-4b14-a32c-2675d46984e1" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "('<|system|>\\n'\n", " '\\n'\n", " '<|user|>\\n'\n", " 'List two natural resources which was made in the factory.\\n'\n", " '<|assistant|>\\n')\n", "('Natural resources are not made in factories. Natural resources are materials '\n", " 'and substances that occur naturally on Earth, such as water, minerals, '\n", " 'forests, and fossil fuels. Factories typically produce man-made materials or '\n", " 'process natural resources into finished products.\\n')\n", "(\"I'm sorry, but it seems there might be some confusion in your question as \"\n", " 'natural resources are typically sourced from the earth or sea, and not made '\n", " 'in a factory. However, factories often use natural resources to create '\n", " 'various products. Two examples of natural resources that factories may use '\n", " 'are crude oil and iron ore. Crude oil is refined to produce various '\n", " 'petroleum products, such as gasoline and plastics, while iron ore is refined '\n", " 'to create steel, which is used in the construction industry, vehicle '\n", " 'manufacturing, and more. Does this help clarify things?\\n')\n" ] } ], "source": [ "import pprint\n", "row = raw_datasets[\"train\"][8]\n", "pprint.pprint(row[\"prompt\"])\n", "pprint.pprint(row[\"chosen\"])\n", "pprint.pprint(row[\"rejected\"])" ] }, { "cell_type": "markdown", "metadata": { "id": "86wyNoeMj-Ph" }, "source": [ "We now add LoRA adapters so we only need to update 1 to 10% of all parameters!" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "6bZsfBuZDeCL", "outputId": "45c7035e-f016-46ab-9ad4-b0ac389fcc40" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Unsloth 2024.1 patched 32 layers with 32 QKV layers, 32 O layers and 32 MLP layers.\n" ] } ], "source": [ "model = FastLanguageModel.get_peft_model(\n", " model,\n", " r = 64, # Choose any number > 0 ! 
Suggested 8, 16, 32, 64, 128\n",
"    target_modules = [\"q_proj\", \"k_proj\", \"v_proj\", \"o_proj\",\n",
"                      \"gate_proj\", \"up_proj\", \"down_proj\",],\n",
"    lora_alpha = 64,\n",
"    lora_dropout = 0, # Currently only supports dropout = 0\n",
"    bias = \"none\",    # Currently only supports bias = \"none\"\n",
"    use_gradient_checkpointing = True,\n",
"    random_state = 3407,\n",
"    max_seq_length = max_seq_length,\n",
")" ] }, { "cell_type": "markdown", "metadata": { "id": "-kyd_iyz7DUM" }, "source": [ "<a name=\"Train\"></a>\n",
"\n",
"### Train the DPO model\n",
"Now let's use Huggingface TRL's `DPOTrainer`! More docs here: [TRL DPO docs](https://huggingface.co/docs/trl/dpo_trainer). We do 3 epochs on 0.5% of the dataset to speed things up." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "v-2BFpDWzo1K" }, "outputs": [], "source": [ "# One must patch the DPO Trainer first!\n",
"from unsloth import PatchDPOTrainer\n",
"PatchDPOTrainer()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 104 }, "id": "QtoqUw80QDV0", "outputId": "4afaedc7-7063-4160-fca2-ecb71bad2e66" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.10/dist-packages/trl/trainer/dpo_trainer.py:294: UserWarning: When using DPODataCollatorWithPadding, you should set `remove_unused_columns=False` in your TrainingArguments we have set it for you, but you should do it yourself in the future.\n", "  warnings.warn(\n" ] } ], "source": [ "from transformers import TrainingArguments\n",
"from trl import DPOTrainer\n",
"\n",
"dpo_trainer = DPOTrainer(\n",
"    model = model,\n",
"    ref_model = None,\n",
"    args = TrainingArguments(\n",
"        per_device_train_batch_size = 2,\n",
"        gradient_accumulation_steps = 4,\n",
"        # warmup_ratio = 0.1,\n",
"        warmup_steps = 10,\n",
"        num_train_epochs = 3, # 3 epochs, matching the text above and the logged run below\n",
"        # max_steps = 30, # Uncomment (and comment out num_train_epochs) for a quick test run\n",
"        learning_rate = 5e-7,\n",
"        fp16 = not torch.cuda.is_bf16_supported(),\n",
"        bf16 = torch.cuda.is_bf16_supported(),\n",
"        logging_steps = 1,\n",
"        optim = \"adamw_8bit\",\n",
"        weight_decay = 0.0,\n",
"        lr_scheduler_type = \"linear\",\n",
"        seed = 42,\n",
"        output_dir = \"outputs\",\n",
"    ),\n",
"    beta = 0.1,\n",
"    train_dataset = raw_datasets[\"train\"],\n",
"    # eval_dataset = raw_datasets[\"test\"],\n",
"    tokenizer = tokenizer,\n",
"    max_length = 1024,\n",
"    max_prompt_length = 512,\n",
")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "background_save": true, "base_uri": "https://localhost:8080/", "height": 1000 }, "id": "EWGFqAo5Q2me", "outputId": "a943ab8b-3259-4f02-f7b6-a203f8aa9108" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Unsloth: `use_cache=True` is incompatible with gradient checkpointing.
Setting `use_cache=False`\n", "Could not estimate the number of tokens of the input, floating-point operations will not be computed\n" ] }, { "data": { "text/html": [ "\n",
"Step | Training Loss | rewards / chosen | rewards / rejected | rewards / accuracies | rewards / margins | logps / rejected | logps / chosen | logits / rejected | logits / chosen\n",
"---|---|---|---|---|---|---|---|---|---\n",
"1 | 0.693100 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | -297.338806 | -218.968842 | -2.758142 | -2.924523\n",
"2 | 0.693100 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | -237.602417 | -217.613892 | -2.731790 | -2.913610\n",
"3 | 0.691600 | 0.003199 | 0.000052 | 0.875000 | 0.003147 | -172.792282 | -202.696640 | -2.464586 | -2.728166\n",
"4 | 0.695100 | -0.001557 | 0.002410 | 0.125000 | -0.003966 | -117.720329 | -170.735352 | -2.592084 | -2.805901\n",
"5 | 0.693600 | -0.001848 | -0.000951 | 0.500000 | -0.000897 | -197.054352 | -338.753784 | -2.541833 | -2.483452\n",
"6 | 0.693500 | 0.001733 | 0.002373 | 0.375000 | -0.000640 | -279.668030 | -188.664688 | -2.870125 | -2.633015\n",
"7 | 0.694400 | -0.001485 | 0.000983 | 0.375000 | -0.002468 | -157.641891 | -236.214722 | -2.510456 | -2.838329\n",
"8 | 0.695900 | -0.001233 | 0.004172 | 0.250000 | -0.005405 | -451.754211 | -493.157379 | -2.676884 | -2.677660\n",
"9 | 0.691900 | 0.002781 | 0.000274 | 0.500000 | 0.002507 | -198.049500 | -405.621704 | -2.366738 | -2.639533\n",
"10 | 0.691900 | 0.001583 | -0.000867 | 0.750000 | 0.002450 | -192.027969 | -238.243286 | -2.734776 | -2.765939\n",
"11 | 0.692900 | 0.000210 | -0.000282 | 0.500000 | 0.000492 | -279.060547 | -242.698013 | -2.741585 | -2.707133\n",
"12 | 0.690700 | 0.002133 | -0.002865 | 0.875000 | 0.004998 | -293.286072 | -297.428986 | -2.636093 | -2.797532\n",
"13 | 0.693300 | 0.000793 | 0.001140 | 0.375000 | -0.000347 | -422.437988 | -440.254791 | -2.782841 | -2.722314\n",
"14 | 0.693800 | -0.001575 | -0.000311 | 0.375000 | -0.001263 | -229.166489 | -517.634277 | -2.761792 | -2.876637\n",
"15 | 0.693200 | 0.001183 | 0.001257 | 0.250000 | -0.000073 | -229.948364 | -368.679657 | -2.687079 | -2.848794\n",
"16 | 0.692600 | 0.002548 | 0.001384 | 0.625000 | 0.001164 | -153.684021 | -214.476501 | -2.553330 | -2.669835\n",
"17 | 0.691300 | 0.002753 | -0.000908 | 0.875000 | 0.003661 | -145.049057 | -178.315857 | -2.551767 | -2.745412\n",
"18 | 0.693300 | -0.001510 | -0.001290 | 0.375000 | -0.000220 | -193.493317 | -307.417603 | -2.546383 | -2.794016\n",
"19 | 0.691700 | 0.005141 | 0.002240 | 0.750000 | 0.002900 | -248.701904 | -226.629242 | -2.847236 | -2.759187\n",
"20 | 0.692500 | 0.005021 | 0.003700 | 0.625000 | 0.001321 | -265.856720 | -421.315674 | -2.905803 | -2.947715\n",
"21 | 0.694900 | -0.000442 | 0.002956 | 0.375000 | -0.003398 | -156.539444 | -202.226196 | -2.447355 | -2.538493\n",
"22 | 0.689800 | 0.002207 | -0.004426 | 0.625000 | 0.006633 | -240.908142 | -258.819641 | -2.661672 | -2.760096\n",
"23 | 0.687300 | 0.007077 | -0.004723 | 0.875000 | 0.011800 | -172.381485 | -306.969055 | -2.421829 | -2.795570\n",
"24 | 0.693600 | 0.003591 | 0.004461 | 0.500000 | -0.000870 | -180.786728 | -225.256119 | -2.247471 | -2.371035\n",
"25 | 0.688700 | 0.002078 | -0.006781 | 0.875000 | 0.008858 | -109.244751 | -135.704803 | -2.537647 | -2.694038\n",
"26 | 0.692300 | 0.004020 | 0.002325 | 0.625000 | 0.001695 | -218.140793 | -216.836838 | -2.350788 | -2.309595\n",
"27 | 0.688500 | 0.007711 | -0.001727 | 0.875000 | 0.009437 | -253.585098 | -276.549164 | -2.410436 | -2.620286\n",
"28 | 0.693400 | 0.001674 | 0.002127 | 0.375000 | -0.000453 | -241.141602 | -239.870850 | -2.711540 | -2.787073\n",
"29 | 0.688400 | 0.001827 | -0.007794 | 0.750000 | 0.009621 | -176.143845 | -202.277985 | -2.833032 | -2.738228\n",
"30 | 0.682300 | 0.017523 | -0.004449 | 1.000000 | 0.021972 | -289.187653 | -317.038971 | -2.557965 | -2.777354\n",
"31 | 0.692000 | 0.002931 | 0.000476 | 0.875000 | 0.002455 | -250.866531 | -230.197632 | -2.658638 | -2.652516\n",
"32 | 0.684700 | 0.006979 | -0.010172 | 0.750000 | 0.017150 | -182.629639 | -287.150208 | -2.427922 | -2.839752\n",
"33 | 0.688500 | 0.010015 | 0.000286 | 0.750000 | 0.009729 | -394.385437 | -513.798523 | -2.927996 | -2.907134\n",
"34 | 0.687600 | 0.000746 | -0.010427 | 0.875000 | 0.011174 | -213.292419 | -186.554916 | -2.781289 | -2.641831\n",
"35 | 0.694400 | -0.000188 | 0.002397 | 0.500000 | -0.002585 | -210.052170 | -83.598724 | -2.734563 | -2.323755\n",
"36 | 0.676700 | 0.020572 | -0.012979 | 0.750000 | 0.033551 | -278.073486 | -431.871216 | -2.437477 | -2.656088\n",
"37 | 0.690700 | 0.004792 | -0.000168 | 0.750000 | 0.004959 | -141.254837 | -138.760544 | -2.480301 | -2.554194\n",
"38 | 0.687100 | 0.010404 | -0.001865 | 0.875000 | 0.012269 | -199.933395 | -166.619339 | -2.244387 | -2.343439\n",
"39 | 0.685500 | 0.004061 | -0.011404 | 0.875000 | 0.015465 | -230.984711 | -224.525558 | -2.661340 | -2.720686\n",
"40 | 0.673000 | 0.014156 | -0.026998 | 1.000000 | 0.041154 | -260.884521 | -176.705002 | -2.687092 | -2.736533\n",
"41 | 0.681200 | 0.013829 | -0.010190 | 0.875000 | 0.024019 | -183.548355 | -199.252533 | -2.620351 | -2.470730\n",
"42 | 0.669200 | 0.031616 | -0.017344 | 0.875000 | 0.048961 | -180.883789 | -351.244385 | -2.498347 | -2.774542\n",
"43 | 0.685400 | 0.003653 | -0.011942 | 1.000000 | 0.015595 | -128.027740 | -116.247787 | -2.753370 | -2.723312\n",
"44 | 0.673900 | 0.028433 | -0.011328 | 0.875000 | 0.039760 | -225.523666 | -322.012604 | -2.543076 | -2.699420\n",
"45 | 0.662200 | 0.034732 | -0.028547 | 1.000000 | 0.063279 | -237.588928 | -289.570129 | -2.249139 | -2.591806\n",
"46 | 0.659000 | 0.036781 | -0.032790 | 1.000000 | 0.069571 | -196.792236 | -265.052399 | -2.553383 | -2.715359\n",
"47 | 0.672500 | 0.029729 | -0.012267 | 1.000000 | 0.041996 | -139.111206 | -245.212021 | -2.533509 | -2.613470\n",
"48 | 0.671100 | 0.025400 | -0.019437 | 1.000000 | 0.044837 | -156.875183 | -217.185425 | -2.432833 | -2.575721\n",
"49 | 0.661100 | 0.035756 | -0.030402 | 1.000000 | 0.066158 | -213.846985 | -212.789627 | -2.647654 | -2.714958\n",
"50 | 0.663200 | 0.021139 | -0.040136 | 1.000000 | 0.061275 | -348.077393 | -312.627441 | -2.757344 | -2.675975\n",
"51 | 0.653300 | 0.038236 | -0.043319 | 1.000000 | 0.081555 | -395.500549 | -342.811859 | -2.567836 | -2.708135\n",
"52 | 0.674300 | 0.021404 | -0.016969 | 1.000000 | 0.038373 | -226.740570 | -203.308228 | -2.689174 | -2.713107\n",
"53 | 0.675700 | 0.022191 | -0.013347 | 1.000000 | 0.035538 | -127.214119 | -299.697296 | -2.689854 | -2.747266\n",
"54 | 0.658000 | 0.041239 | -0.030804 | 1.000000 | 0.072043 | -234.094452 | -341.478027 | -2.658128 | -2.845522\n",
"55 | 0.678700 | -0.002169 | -0.031371 | 1.000000 | 0.029202 | -182.520050 | -130.642044 | -2.751769 | -2.653854\n",
"56 | 0.659100 | 0.035994 | -0.033855 | 1.000000 | 0.069849 | -206.904007 | -332.754486 | -2.633317 | -2.746063\n",
"57 | 0.640800 | 0.072543 | -0.035949 | 0.875000 | 0.108491 | -347.537323 | -594.679443 | -2.610380 | -2.938568\n",
"58 | 0.656800 | 0.031076 | -0.043405 | 1.000000 | 0.074480 | -242.280487 | -208.374329 | -2.675403 | -2.823531\n",
"59 | 0.646500 | 0.053982 | -0.043562 | 1.000000 | 0.097543 | -221.657608 | -250.691666 | -2.548671 | -2.671937\n",
"60 | 0.651700 | 0.044328 | -0.040983 | 1.000000 | 0.085311 | -237.457428 | -243.843521 | -2.340865 | -2.417122\n",
"61 | 0.668700 | 0.016157 | -0.033756 | 0.875000 | 0.049913 | -181.521423 | -230.741577 | -2.583697 | -2.671347\n",
"62 | 0.650500 | 0.042586 | -0.045736 | 1.000000 | 0.088322 | -227.009842 | -248.078140 | -2.601808 | -2.666926\n",
"63 | 0.666300 | 0.039459 | -0.015492 | 0.875000 | 0.054951 | -239.231415 | -254.349426 | -2.788095 | -2.922115\n",
"64 | 0.660100 | 0.036744 | -0.031429 | 0.875000 | 0.068173 | -255.638794 | -338.323608 | -2.834287 | -2.895520\n",
"65 | 0.655800 | 0.030527 | -0.046443 | 1.000000 | 0.076970 | -236.972427 | -329.191803 | -2.574853 | -2.719908\n",
"66 | 0.669700 | 0.016352 | -0.032069 | 0.875000 | 0.048421 | -113.009361 | -182.456314 | -2.535565 | -2.650409\n",
"67 | 0.656500 | 0.035327 | -0.039927 | 1.000000 | 0.075254 | -296.490997 | -225.676331 | -2.226444 | -2.276212\n",
"68 | 0.650100 | 0.041216 | -0.048575 | 1.000000 | 0.089791 | -187.233826 | -246.121643 | -2.574435 | -2.739054\n",
"69 | 0.654900 | 0.051283 | -0.028192 | 0.875000 | 0.079475 | -308.786011 | -250.258072 | -2.803828 | -2.665385\n",
"70 | 0.656100 | 0.053900 | -0.022747 | 0.875000 | 0.076647 | -225.490479 | -310.644623 | -2.522727 | -2.745466\n",
"71 | 0.662800 | 0.037783 | -0.024222 | 1.000000 | 0.062006 | -209.903839 | -270.441864 | -2.677671 | -2.789300\n",
"72 | 0.660200 | 0.043013 | -0.024264 | 1.000000 | 0.067276 | -260.596680 | -256.450012 | -2.888245 | -2.899030\n",
"73 | 0.657100 | 0.063199 | -0.011027 | 0.875000 | 0.074227 | -369.544830 | -382.099243 | -2.595025 | -2.764428\n",
"74 | 0.637500 | 0.069101 | -0.047402 | 0.875000 | 0.116503 | -141.736923 | -422.295044 | -2.328428 | -2.619377\n",
"75 | 0.646800 | 0.034380 | -0.061420 | 1.000000 | 0.095800 | -338.911743 | -257.112823 | -2.674836 | -2.631948\n",
"76 | 0.680100 | 0.016118 | -0.010262 | 0.750000 | 0.026380 | -153.983215 | -192.856201 | -2.583967 | -2.739872\n",
"77 | 0.653500 | 0.037388 | -0.044784 | 0.875000 | 0.082172 | -237.917175 | -373.211090 | -2.670840 | -2.840366\n",
"78 | 0.682200 | 0.005111 | -0.017031 | 1.000000 | 0.022142 | -90.056511 | -75.676575 | -2.589334 | -2.678804\n",
"79 | 0.649200 | 0.048614 | -0.042461 | 1.000000 | 0.091075 | -197.482025 | -216.437912 | -2.580309 | -2.688187\n",
"80 | 0.638100 | 0.067506 | -0.047359 | 1.000000 | 0.114865 | -306.641571 | -243.717010 | -2.727369 | -2.713167\n",
"81 | 0.630400 | 0.046541 | -0.085917 | 1.000000 | 0.132458 | -205.164703 | -226.345978 | -2.571737 | -2.649376\n",
"82 | 0.639900 | 0.046804 | -0.063667 | 1.000000 | 0.110471 | -255.351227 | -232.954865 | -2.444653 | -2.546993\n",
"83 | 0.647900 | 0.043133 | -0.050545 | 1.000000 | 0.093678 | -143.658051 | -267.885437 | -2.450764 | -2.622156\n",
"84 | 0.623400 | 0.066124 | -0.081565 | 1.000000 | 0.147689 | -269.404785 | -331.960785 | -2.679103 | -2.777452\n",
"85 | 0.625300 | 0.093795 | -0.048601 | 1.000000 | 0.142396 | -250.066422 | -298.300201 | -2.747368 | -2.852850\n",
"86 | 0.652200 | 0.047347 | -0.037443 | 1.000000 | 0.084790 | -102.290405 | -187.495880 | -2.346936 | -2.561124\n",
"87 | 0.597100 | 0.108838 | -0.094947 | 1.000000 | 0.203784 | -415.374237 | -581.638367 | -2.975214 | -2.911619\n",
"88 | 0.609600 | 0.089707 | -0.088418 | 1.000000 | 0.178125 | -398.111847 | -336.481079 | -2.793868 | -2.568107\n",
"89 | 0.635900 | 0.066105 | -0.053545 | 1.000000 | 0.119650 | -250.390030 | -384.719116 | -2.739915 | -2.677882\n",
"90 | 0.611000 | 0.094179 | -0.079789 | 1.000000 | 0.173968 | -382.315125 | -394.539276 | -2.684922 | -2.875659\n",
"91 | 0.625400 | 0.085292 | -0.056867 | 1.000000 | 0.142159 | -205.984009 | -335.645020 | -2.549550 | -2.717720\n",
"92 | 0.621800 | 0.068006 | -0.081611 | 1.000000 | 0.149617 | -339.634094 | -360.694153 | -2.709456 | -2.903818\n",
"93 | 0.655700 | 0.047537 | -0.029553 | 1.000000 | 0.077090 | -115.142799 | -260.806763 | -2.594357 | -2.715160\n",
"94 | 0.639600 | 0.055383 | -0.057890 | 1.000000 | 0.113273 | -209.325363 | -226.800201 | -2.263216 | -2.443685\n",
"95 | 0.644600 | 0.045497 | -0.055599 | 1.000000 | 0.101096 | -227.701065 | -268.792511 | -2.571428 | -2.790503\n",
"96 | 0.662700 | 0.024437 | -0.037891 | 0.875000 | 0.062327 | -188.535080 | -184.081863 | -2.675784 | -2.853757\n",
"97 | 0.634000 | 0.036011 | -0.089001 | 1.000000 | 0.125011 | -203.472717 | -158.047913 | -2.858229 | -2.802696\n",
"98 | 0.645200 | 0.041265 | -0.058630 | 0.875000 | 0.099896 | -147.259262 | -241.107040 | -2.673885 | -2.728743\n",
"99 | 0.657400 | 0.027796 | -0.045903 | 0.875000 | 0.073699 | -152.992294 | -178.524979 | -2.415729 | -2.726150\n",
"100 | 0.631500 | 0.043453 | -0.085162 | 1.000000 | 0.128614 | -193.779343 | -156.529266 | -2.446479 | -2.537483\n",
"101 | 0.644700 | 0.059261 | -0.042216 | 1.000000 | 0.101477 | -239.048523 | -249.522369 | -2.556858 | -2.802905\n",
"102 | 0.635600 | 0.072766 | -0.048541 | 0.875000 | 0.121307 | -247.841553 | -236.061722 | -2.710063 | -2.557176\n",
"103 | 0.656500 | 0.035755 | -0.039488 | 1.000000 | 0.075243 | -145.012451 | -165.547180 | -2.728357 | -2.772579\n",
"104 | 0.652900 | 0.039176 | -0.043977 | 0.875000 | 0.083153 | -198.680023 | -231.717743 | -2.430846 | -2.528418\n",
"105 | 0.616600 | 0.104496 | -0.057214 | 1.000000 | 0.161710 | -248.830582 | -398.110840 | -2.472265 | -2.786351\n",
"106 | 0.640800 | 0.069526 | -0.039345 | 1.000000 | 0.108871 | -230.877518 | -378.959503 | -2.584360 | -2.771723\n",
"107 | 0.627900 | 0.077549 | -0.060203 | 1.000000 | 0.137752 | -184.436127 | -331.978729 | -2.613068 | -2.810034\n",
"108 | 0.641100 | 0.062340 | -0.045482 | 1.000000 | 0.107821 | -193.510544 | -236.701370 | -2.472780 | -2.615651\n",
"109 | 0.639600 | 0.063451 | -0.050169 | 0.875000 | 0.113619 | -174.547577 | -269.642944 | -2.326431 | -2.456072\n",
"110 | 0.650800 | 0.047151 | -0.041271 | 1.000000 | 0.088421 | -173.169586 | -174.600586 | -2.659404 | -2.637673\n",
"111 | 0.628900 | 0.089443 | -0.045357 | 0.875000 | 0.134800 | -292.942871 | -368.299500 | -2.448925 | -2.652367\n",
"112 | 0.672400 | 0.028686 | -0.013418 | 0.875000 | 0.042104 | -92.405304 | -134.066833 | -2.416434 | -2.578195\n",
"113 | 0.673100 | 0.017251 | -0.023517 | 1.000000 | 0.040768 | -104.922539 | -79.426987 | -2.740997 | -2.566977\n",
"114 | 0.618700 | 0.071029 | -0.087080 | 0.875000 | 0.158109 | -322.882080 | -338.982452 | -2.519094 | -2.479210\n"
],
"text/plain": [
"