Commit 17edb15
1 Parent(s): b444d92
Upload Phi-3.5-mini-instruct ONNX models
- LICENSE +22 -223
 - config.json +108 -5
 - configuration_phi3.py +227 -213
 - cuda/cuda-fp16/.azDownload-8f9058cd-a0ce-7c4d-6eea-9a48f67f2ddc-phi-3.5-mini-instruct-cuda-fp16.onnx.data +0 -0
 - cuda/cuda-fp16/phi-3.5-mini-instruct-cuda-fp16.onnx.data +3 -0
 - cuda/cuda-int4-awq-block-128/.azDownload-8f9058cd-a0ce-7c4d-6eea-9a48f67f2ddc-phi-3.5-mini-instruct-cuda-int4-awq-block-128.onnx.data +0 -0
 - cuda/cuda-int4-awq-block-128/phi-3.5-mini-instruct-cuda-int4-awq-block-128.onnx.data +3 -0
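
The cuda/ folders hold ONNX Runtime GenAI builds of Phi-3.5-mini-instruct (an FP16 variant and an INT4 AWQ variant with block size 128); the .azDownload-* entries are leftover azcopy transfer state rather than model data. As a rough, non-authoritative sketch of how such a folder is typically consumed, the snippet below uses the onnxruntime-genai Python package; the local path, the prompt template, and the exact method names (which vary between onnxruntime-genai releases) are assumptions, not part of this commit.

    # Minimal sketch: stream tokens from the uploaded CUDA INT4 variant with
    # onnxruntime-genai. Assumes the repo has been downloaded locally and the
    # folder contains the usual genai_config.json / tokenizer files.
    import onnxruntime_genai as og

    model_dir = "cuda/cuda-int4-awq-block-128"  # assumed local path
    model = og.Model(model_dir)
    tokenizer = og.Tokenizer(model)
    stream = tokenizer.create_stream()

    # Phi-3 style chat prompt template (single user turn).
    prompt = "<|user|>\nWhat is ONNX Runtime?<|end|>\n<|assistant|>\n"

    params = og.GeneratorParams(model)
    params.set_search_options(max_length=512)

    generator = og.Generator(model, params)
    generator.append_tokens(tokenizer.encode(prompt))
    while not generator.is_done():
        generator.generate_next_token()
        print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
    print()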
 
    	
LICENSE CHANGED

@@ -1,223 +1,22 @@
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-                                 Apache License
-                           Version 2.0, January 2004
-                        http://www.apache.org/licenses/
-
-   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
-   1. Definitions.
-
-      "License" shall mean the terms and conditions for use, reproduction,
-      and distribution as defined by Sections 1 through 9 of this document.
-
-      "Licensor" shall mean the copyright owner or entity authorized by
-      the copyright owner that is granting the License.
-
-      "Legal Entity" shall mean the union of the acting entity and all
-      other entities that control, are controlled by, or are under common
-      control with that entity. For the purposes of this definition,
-      "control" means (i) the power, direct or indirect, to cause the
-      direction or management of such entity, whether by contract or
-      otherwise, or (ii) ownership of fifty percent (50%) or more of the
-      outstanding shares, or (iii) beneficial ownership of such entity.
-
-      "You" (or "Your") shall mean an individual or Legal Entity
-      exercising permissions granted by this License.
-
-      "Source" form shall mean the preferred form for making modifications,
-      including but not limited to software source code, documentation
-      source, and configuration files.
-
-      "Object" form shall mean any form resulting from mechanical
-      transformation or translation of a Source form, including but
-      not limited to compiled object code, generated documentation,
-      and conversions to other media types.
-
-      "Work" shall mean the work of authorship, whether in Source or
-      Object form, made available under the License, as indicated by a
-      copyright notice that is included in or attached to the work
-      (an example is provided in the Appendix below).
-
-      "Derivative Works" shall mean any work, whether in Source or Object
-      form, that is based on (or derived from) the Work and for which the
-      editorial revisions, annotations, elaborations, or other modifications
-      represent, as a whole, an original work of authorship. For the purposes
-      of this License, Derivative Works shall not include works that remain
-      separable from, or merely link (or bind by name) to the interfaces of,
-      the Work and Derivative Works thereof.
-
-      "Contribution" shall mean any work of authorship, including
-      the original version of the Work and any modifications or additions
-      to that Work or Derivative Works thereof, that is intentionally
-      submitted to Licensor for inclusion in the Work by the copyright owner
-      or by an individual or Legal Entity authorized to submit on behalf of
-      the copyright owner. For the purposes of this definition, "submitted"
-      means any form of electronic, verbal, or written communication sent
-      to the Licensor or its representatives, including but not limited to
-      communication on electronic mailing lists, source code control systems,
-      and issue tracking systems that are managed by, or on behalf of, the
-      Licensor for the purpose of discussing and improving the Work, but
-      excluding communication that is conspicuously marked or otherwise
-      designated in writing by the copyright owner as "Not a Contribution."
-
-      "Contributor" shall mean Licensor and any individual or Legal Entity
-      on behalf of whom a Contribution has been received by Licensor and
-      subsequently incorporated within the Work.
-
-   2. Grant of Copyright License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      copyright license to reproduce, prepare Derivative Works of,
-      publicly display, publicly perform, sublicense, and distribute the
-      Work and such Derivative Works in Source or Object form.
-
-   3. Grant of Patent License. Subject to the terms and conditions of
-      this License, each Contributor hereby grants to You a perpetual,
-      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
-      (except as stated in this section) patent license to make, have made,
-      use, offer to sell, sell, import, and otherwise transfer the Work,
-      where such license applies only to those patent claims licensable
-      by such Contributor that are necessarily infringed by their
-      Contribution(s) alone or by combination of their Contribution(s)
-      with the Work to which such Contribution(s) was submitted. If You
-      institute patent litigation against any entity (including a
-      cross-claim or counterclaim in a lawsuit) alleging that the Work
-      or a Contribution incorporated within the Work constitutes direct
-      or contributory patent infringement, then any patent licenses
-      granted to You under this License for that Work shall terminate
-      as of the date such litigation is filed.
-
-   4. Redistribution. You may reproduce and distribute copies of the
-      Work or Derivative Works thereof in any medium, with or without
-      modifications, and in Source or Object form, provided that You
-      meet the following conditions:
-
-      (a) You must give any other recipients of the Work or
-          Derivative Works a copy of this License; and
-
-      (b) You must cause any modified files to carry prominent notices
-          stating that You changed the files; and
-
-      (c) You must retain, in the Source form of any Derivative Works
-          that You distribute, all copyright, patent, trademark, and
-          attribution notices from the Source form of the Work,
-          excluding those notices that do not pertain to any part of
-          the Derivative Works; and
-
-      (d) If the Work includes a "NOTICE" text file as part of its
-          distribution, then any Derivative Works that You distribute must
-          include a readable copy of the attribution notices contained
-          within such NOTICE file, excluding those notices that do not
-          pertain to any part of the Derivative Works, in at least one
-          of the following places: within a NOTICE text file distributed
-          as part of the Derivative Works; within the Source form or
-          documentation, if provided along with the Derivative Works; or,
-          within a display generated by the Derivative Works, if and
-          wherever such third-party notices normally appear. The contents
-          of the NOTICE file are for informational purposes only and
-          do not modify the License. You may add Your own attribution
-          notices within Derivative Works that You distribute, alongside
-          or as an addendum to the NOTICE text from the Work, provided
-          that such additional attribution notices cannot be construed
-          as modifying the License.
-
-      You may add Your own copyright statement to Your modifications and
-      may provide additional or different license terms and conditions
-      for use, reproduction, or distribution of Your modifications, or
-      for any such Derivative Works as a whole, provided Your use,
-      reproduction, and distribution of the Work otherwise complies with
-      the conditions stated in this License.
-
-   5. Submission of Contributions. Unless You explicitly state otherwise,
-      any Contribution intentionally submitted for inclusion in the Work
-      by You to the Licensor shall be under the terms and conditions of
-      this License, without any additional terms or conditions.
-      Notwithstanding the above, nothing herein shall supersede or modify
-      the terms of any separate license agreement you may have executed
-      with Licensor regarding such Contributions.
-
-   6. Trademarks. This License does not grant permission to use the trade
-      names, trademarks, service marks, or product names of the Licensor,
-      except as required for reasonable and customary use in describing the
-      origin of the Work and reproducing the content of the NOTICE file.
-
-   7. Disclaimer of Warranty. Unless required by applicable law or
-      agreed to in writing, Licensor provides the Work (and each
-      Contributor provides its Contributions) on an "AS IS" BASIS,
-      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
-      implied, including, without limitation, any warranties or conditions
-      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
-      PARTICULAR PURPOSE. You are solely responsible for determining the
-      appropriateness of using or redistributing the Work and assume any
-      risks associated with Your exercise of permissions under this License.
-
-   8. Limitation of Liability. In no event and under no legal theory,
-      whether in tort (including negligence), contract, or otherwise,
-      unless required by applicable law (such as deliberate and grossly
-      negligent acts) or agreed to in writing, shall any Contributor be
-      liable to You for damages, including any direct, indirect, special,
-      incidental, or consequential damages of any character arising as a
-      result of this License or out of the use or inability to use the
-      Work (including but not limited to damages for loss of goodwill,
-      work stoppage, computer failure or malfunction, or any and all
-      other commercial damages or losses), even if such Contributor
-      has been advised of the possibility of such damages.
-
-   9. Accepting Warranty or Additional Liability. While redistributing
-      the Work or Derivative Works thereof, You may choose to offer,
-      and charge a fee for, acceptance of support, warranty, indemnity,
-      or other liability obligations and/or rights consistent with this
-      License. However, in accepting such obligations, You may act only
-      on Your own behalf and on Your sole responsibility, not on behalf
-      of any other Contributor, and only if You agree to indemnify,
-      defend, and hold each Contributor harmless for any liability
-      incurred by, or claims asserted against, such Contributor by reason
-      of your accepting any such warranty or additional liability.
-
-   END OF TERMS AND CONDITIONS
-
-   ============================================================================
-
-   Copyright 2016-2019 Intel Corporation
-   Copyright 2018 YANDEX LLC
-
-   Licensed under the Apache License, Version 2.0 (the "License");
-   you may not use this file except in compliance with the License.
-   You may obtain a copy of the License at
-
-       http://www.apache.org/licenses/LICENSE-2.0
-
-   Unless required by applicable law or agreed to in writing, software
-   distributed under the License is distributed on an "AS IS" BASIS,
-   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-   See the License for the specific language governing permissions and
-   limitations under the License.
-
-   This distribution includes third party software ("third party programs").
-   This third party software, even if included with the distribution of
-   the Intel software, may be governed by separate license terms, including
-   without limitation, third party license terms, other Intel software license
-   terms, and open source software license terms. These separate license terms
-   govern your use of the third party programs as set forth in the
-   "THIRD-PARTY-PROGRAMS" file.
+Microsoft.
+Copyright (c) Microsoft Corporation.
+
+MIT License
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED *AS IS*, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
    	
config.json CHANGED

@@ -1,5 +1,5 @@
 {
-  "_name_or_path": "
+  "_name_or_path": "Phi-3.5-mini-instruct",
   "architectures": [
     "Phi3ForCausalLM"
   ],
@@ -15,7 +15,7 @@
   "hidden_size": 3072,
   "initializer_range": 0.02,
   "intermediate_size": 8192,
-  "max_position_embeddings":
+  "max_position_embeddings": 131072,
   "model_type": "phi3",
   "num_attention_heads": 32,
   "num_hidden_layers": 32,
@@ -24,12 +24,115 @@
   "pad_token_id": 32000,
   "resid_pdrop": 0.0,
   "rms_norm_eps": 1e-05,
-  "rope_scaling":
+  "rope_scaling": {
+    "long_factor": [
+      1.0800000429153442,
+      1.1100000143051147,
+      1.1399999856948853,
+      1.340000033378601,
+      1.5899999141693115,
+      1.600000023841858,
+      1.6200000047683716,
+      2.620000123977661,
+      3.2300000190734863,
+      3.2300000190734863,
+      4.789999961853027,
+      7.400000095367432,
+      7.700000286102295,
+      9.09000015258789,
+      12.199999809265137,
+      17.670000076293945,
+      24.46000099182129,
+      28.57000160217285,
+      30.420001983642578,
+      30.840002059936523,
+      32.590003967285156,
+      32.93000411987305,
+      42.320003509521484,
+      44.96000289916992,
+      50.340003967285156,
+      50.45000457763672,
+      57.55000305175781,
+      57.93000411987305,
+      58.21000289916992,
+      60.1400032043457,
+      62.61000442504883,
+      62.62000274658203,
+      62.71000289916992,
+      63.1400032043457,
+      63.1400032043457,
+      63.77000427246094,
+      63.93000411987305,
+      63.96000289916992,
+      63.970001220703125,
+      64.02999877929688,
+      64.06999969482422,
+      64.08000183105469,
+      64.12000274658203,
+      64.41000366210938,
+      64.4800033569336,
+      64.51000213623047,
+      64.52999877929688,
+      64.83999633789062
+    ],
+    "short_factor": [
+      1.0,
+      1.0199999809265137,
+      1.0299999713897705,
+      1.0299999713897705,
+      1.0499999523162842,
+      1.0499999523162842,
+      1.0499999523162842,
+      1.0499999523162842,
+      1.0499999523162842,
+      1.0699999332427979,
+      1.0999999046325684,
+      1.1099998950958252,
+      1.1599998474121094,
+      1.1599998474121094,
+      1.1699998378753662,
+      1.2899998426437378,
+      1.339999794960022,
+      1.679999828338623,
+      1.7899998426437378,
+      1.8199998140335083,
+      1.8499997854232788,
+      1.8799997568130493,
+      1.9099997282028198,
+      1.9399996995925903,
+      1.9899996519088745,
+      2.0199997425079346,
+      2.0199997425079346,
+      2.0199997425079346,
+      2.0199997425079346,
+      2.0199997425079346,
+      2.0199997425079346,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0299997329711914,
+      2.0799996852874756,
+      2.0899996757507324,
+      2.189999580383301,
+      2.2199995517730713,
+      2.5899994373321533,
+      2.729999542236328,
+      2.749999523162842,
+      2.8399994373321533
+    ],
+    "type": "longrope"
+  },
   "rope_theta": 10000.0,
-  "sliding_window":
+  "sliding_window": 262144,
   "tie_word_embeddings": false,
   "torch_dtype": "bfloat16",
-  "transformers_version": "4.
+  "transformers_version": "4.43.3",
   "use_cache": true,
+  "attention_bias": false,
   "vocab_size": 32064
 }
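
The updated config turns on "longrope" RoPE scaling: max_position_embeddings rises to 131072, and the long_factor / short_factor lists each carry 48 entries, i.e. hidden_size / num_attention_heads / 2 = 3072 / 32 / 2, one stretch factor per rotary frequency of the 96-dimensional heads (the length constraint spelled out in the configuration_phi3.py docstring below). Roughly, short_factor applies to prompts within the original training context and long_factor beyond it. The sketch below is an illustrative approximation of that scheme, not code from this repo; the original_max_position_embeddings value of 4096 is an assumption here, since it sits outside the visible hunks.

    # Rough sketch of "longrope"-style RoPE rescaling (illustrative only).
    import math
    import numpy as np

    hidden_size, num_heads = 3072, 32
    head_dim = hidden_size // num_heads            # 96 -> 48 rotary frequencies
    rope_theta = 10000.0
    max_pos, orig_max_pos = 131072, 4096           # orig_max_pos assumed (4k base context)

    def scaled_inv_freq(factors):
        # Each of the 48 factors stretches one rotary frequency.
        exponents = np.arange(0, head_dim, 2) / head_dim
        return 1.0 / (np.asarray(factors) * rope_theta ** exponents)

    # Cos/sin tables are additionally rescaled so attention magnitudes stay
    # stable when the usable context is extended from orig_max_pos to max_pos.
    scale = max_pos / orig_max_pos
    attn_scaling = math.sqrt(1 + math.log(scale) / math.log(orig_max_pos))

    def rope_tables(position_ids, factors):
        freqs = np.outer(position_ids, scaled_inv_freq(factors))
        emb = np.concatenate([freqs, freqs], axis=-1)
        return np.cos(emb) * attn_scaling, np.sin(emb) * attn_scaling

    # short_factor covers prompts up to orig_max_pos tokens; long_factor covers longer ones.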
    	
configuration_phi3.py CHANGED

@@ -1,213 +1,227 @@
-# coding=utf-8
-# Copyright 2024 Microsoft and the HuggingFace Inc. team. All rights reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-""" Phi-3 model configuration"""
-
-
-from transformers.configuration_utils import PretrainedConfig
-from transformers.utils import logging
-
-
-logger = logging.get_logger(__name__)
-
-PHI3_PRETRAINED_CONFIG_ARCHIVE_MAP = {
-    "microsoft/Phi-3-mini-4k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/config.json",
-    "microsoft/Phi-3-mini-128k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/config.json",
-}
-
-
-class Phi3Config(PretrainedConfig):
-    r"""
-    This is the configuration class to store the configuration of a [`Phi3Model`]. It is used to instantiate a Phi-3
-    model according to the specified arguments, defining the model architecture. Instantiating a configuration with the
-    defaults will yield a similar configuration to that of the
-    [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct).
-
-    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
-    documentation from [`PretrainedConfig`] for more information.
-
-    Args:
-        vocab_size (`int`, *optional*, defaults to 32064):
-            Vocabulary size of the Phi-3 model. Defines the number of different tokens that can be represented by the
-            `inputs_ids` passed when calling [`Phi3Model`].
-        hidden_size (`int`, *optional*, defaults to 3072):
-            Dimension of the hidden representations.
-        intermediate_size (`int`, *optional*, defaults to 8192):
-            Dimension of the MLP representations.
-        num_hidden_layers (`int`, *optional*, defaults to 32):
-            Number of hidden layers in the Transformer decoder.
-        num_attention_heads (`int`, *optional*, defaults to 32):
-            Number of attention heads for each attention layer in the Transformer decoder.
-        num_key_value_heads (`int`, *optional*):
-            This is the number of key_value heads that should be used to implement Grouped Query Attention. If
-            `num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA), if
-            `num_key_value_heads=1 the model will use Multi Query Attention (MQA) otherwise GQA is used. When
-            converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be constructed
-            by meanpooling all the original heads within that group. For more details checkout [this
-            paper](https://arxiv.org/pdf/2305.13245.pdf). If it is not specified, will default to
-            `num_attention_heads`.
-        resid_pdrop (`float`, *optional*, defaults to 0.0):
-            Dropout probability for mlp outputs.
-        embd_pdrop (`int`, *optional*, defaults to 0.0):
-            The dropout ratio for the embeddings.
-        attention_dropout (`float`, *optional*, defaults to 0.0):
-            The dropout ratio after computing the attention scores.
-        hidden_act (`str` or `function`, *optional*, defaults to `"silu"`):
-            The non-linear activation function (function or string) in the decoder.
-        max_position_embeddings (`int`, *optional*, defaults to 4096):
-            The maximum sequence length that this model might ever be used with.
-        original_max_position_embeddings (`int`, *optional*, defaults to 4096):
-            The maximum sequence length that this model was trained with. This is used to determine the size of the
-            original RoPE embeddings when using long scaling.
-        initializer_range (`float`, *optional*, defaults to 0.02):
-            The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
-        rms_norm_eps (`float`, *optional*, defaults to 1e-05):
-            The epsilon value used for the RMSNorm.
-        use_cache (`bool`, *optional*, defaults to `True`):
-            Whether or not the model should return the last key/values attentions (not used by all models). Only
-            relevant if `config.is_decoder=True`. Whether to tie weight embeddings or not.
-        tie_word_embeddings (`bool`, *optional*, defaults to `False`):
-            Whether to tie weight embeddings
-        rope_theta (`float`, *optional*, defaults to 10000.0):
-            The base period of the RoPE embeddings.
-        rope_scaling (`dict`, *optional*):
-            The scaling strategy for the RoPE embeddings. If `None`, no scaling is applied. If a dictionary, it must
-            contain the following keys: `type`, `short_factor` and `long_factor`. The `type` must be
-            the `short_factor` and `long_factor` must be lists of numbers with the same length as the hidden size
-            divided by the number of attention heads divided by 2.
-        bos_token_id (`int`, *optional*, defaults to 1):
-            The id of the "beginning-of-sequence" token.
-        eos_token_id (`int`, *optional*, defaults to 32000):
-            The id of the "end-of-sequence" token.
-        pad_token_id (`int`, *optional*, defaults to 32000):
-            The id of the padding token.
-        sliding_window (`int`, *optional*):
-            Sliding window attention window size. If `None`, no sliding window is applied.
-
-    Example:
-
-    ```python
-    >>> from transformers import Phi3Model, Phi3Config
-
-    >>> # Initializing a Phi-3 style configuration
-    >>> configuration = Phi3Config.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
-
-    >>> # Initializing a model from the configuration
-    >>> model = Phi3Model(configuration)
-
-    >>> # Accessing the model configuration
-    >>> configuration = model.config
-    ```"""
-
-    model_type = "phi3"
-    keys_to_ignore_at_inference = ["past_key_values"]
-
-    def __init__(
-        self,
-        vocab_size=32064,
-        hidden_size=3072,
-        intermediate_size=8192,
-        num_hidden_layers=32,
-        num_attention_heads=32,
-        num_key_value_heads=None,
         
     | 
| 125 | 
         
            -
                    resid_pdrop=0.0,
         
     | 
| 126 | 
         
            -
                    embd_pdrop=0.0,
         
     | 
| 127 | 
         
            -
                    attention_dropout=0.0,
         
     | 
| 128 | 
         
            -
                    hidden_act="silu",
         
     | 
| 129 | 
         
            -
                    max_position_embeddings=4096,
         
     | 
| 130 | 
         
            -
                    original_max_position_embeddings=4096,
         
     | 
| 131 | 
         
            -
                    initializer_range=0.02,
         
     | 
| 132 | 
         
            -
                    rms_norm_eps=1e-5,
         
     | 
| 133 | 
         
            -
                    use_cache=True,
         
     | 
| 134 | 
         
            -
                    tie_word_embeddings=False,
         
     | 
| 135 | 
         
            -
                    rope_theta=10000.0,
         
     | 
| 136 | 
         
            -
                    rope_scaling=None,
         
     | 
| 137 | 
         
            -
                    bos_token_id=1,
         
     | 
| 138 | 
         
            -
                    eos_token_id=32000,
         
     | 
| 139 | 
         
            -
                    pad_token_id=32000,
         
     | 
| 140 | 
         
            -
                    sliding_window=None,
         
     | 
| 141 | 
         
            -
                    **kwargs,
         
     | 
| 142 | 
         
            -
                ):
         
     | 
| 143 | 
         
            -
                    self.vocab_size = vocab_size
         
     | 
| 144 | 
         
            -
                    self.hidden_size = hidden_size
         
     | 
| 145 | 
         
            -
                    self.intermediate_size = intermediate_size
         
     | 
| 146 | 
         
            -
                    self.num_hidden_layers = num_hidden_layers
         
     | 
| 147 | 
         
            -
                    self.num_attention_heads = num_attention_heads
         
     | 
| 148 | 
         
            -
             
     | 
| 149 | 
         
            -
                    if num_key_value_heads is None:
         
     | 
| 150 | 
         
            -
                        num_key_value_heads = num_attention_heads
         
     | 
| 151 | 
         
            -
             
     | 
| 152 | 
         
            -
                    self.num_key_value_heads = num_key_value_heads
         
     | 
| 153 | 
         
            -
                    self.resid_pdrop = resid_pdrop
         
     | 
| 154 | 
         
            -
                    self.embd_pdrop = embd_pdrop
         
     | 
| 155 | 
         
            -
                    self.attention_dropout = attention_dropout
         
     | 
| 156 | 
         
            -
                    self.hidden_act = hidden_act
         
     | 
| 157 | 
         
            -
                    self.max_position_embeddings = max_position_embeddings
         
     | 
| 158 | 
         
            -
                    self.original_max_position_embeddings = original_max_position_embeddings
         
     | 
| 159 | 
         
            -
                    self.initializer_range = initializer_range
         
     | 
| 160 | 
         
            -
                    self.rms_norm_eps = rms_norm_eps
         
     | 
| 161 | 
         
            -
                    self.use_cache = use_cache
         
     | 
| 162 | 
         
            -
                    self.rope_theta = rope_theta
         
     | 
| 163 | 
         
            -
                    self.rope_scaling = rope_scaling
         
     | 
| 164 | 
         
            -
                    self. 
     | 
| 165 | 
         
            -
                    self. 
     | 
+# coding=utf-8
+# Copyright 2024 Microsoft and the HuggingFace Inc. team. All rights reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+""" Phi-3 model configuration"""
+
+
+from transformers.configuration_utils import PretrainedConfig
+from transformers.utils import logging
+
+
+logger = logging.get_logger(__name__)
+
+PHI3_PRETRAINED_CONFIG_ARCHIVE_MAP = {
+    "microsoft/Phi-3-mini-4k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/config.json",
+    "microsoft/Phi-3-mini-128k-instruct": "https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/config.json",
+}
+
+
+class Phi3Config(PretrainedConfig):
+    r"""
+    This is the configuration class to store the configuration of a [`Phi3Model`]. It is used to instantiate a Phi-3
+    model according to the specified arguments, defining the model architecture. Instantiating a configuration with the
+    defaults will yield a similar configuration to that of the
+    [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct).
+
+    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
+    documentation from [`PretrainedConfig`] for more information.
+
+    Args:
+        vocab_size (`int`, *optional*, defaults to 32064):
+            Vocabulary size of the Phi-3 model. Defines the number of different tokens that can be represented by the
+            `inputs_ids` passed when calling [`Phi3Model`].
+        hidden_size (`int`, *optional*, defaults to 3072):
+            Dimension of the hidden representations.
+        intermediate_size (`int`, *optional*, defaults to 8192):
+            Dimension of the MLP representations.
+        num_hidden_layers (`int`, *optional*, defaults to 32):
+            Number of hidden layers in the Transformer decoder.
+        num_attention_heads (`int`, *optional*, defaults to 32):
+            Number of attention heads for each attention layer in the Transformer decoder.
+        num_key_value_heads (`int`, *optional*):
+            This is the number of key_value heads that should be used to implement Grouped Query Attention. If
+            `num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA), if
+            `num_key_value_heads=1 the model will use Multi Query Attention (MQA) otherwise GQA is used. When
+            converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be constructed
+            by meanpooling all the original heads within that group. For more details checkout [this
+            paper](https://arxiv.org/pdf/2305.13245.pdf). If it is not specified, will default to
+            `num_attention_heads`.
+        resid_pdrop (`float`, *optional*, defaults to 0.0):
+            Dropout probability for mlp outputs.
+        embd_pdrop (`int`, *optional*, defaults to 0.0):
+            The dropout ratio for the embeddings.
+        attention_dropout (`float`, *optional*, defaults to 0.0):
+            The dropout ratio after computing the attention scores.
+        hidden_act (`str` or `function`, *optional*, defaults to `"silu"`):
+            The non-linear activation function (function or string) in the decoder.
+        max_position_embeddings (`int`, *optional*, defaults to 4096):
+            The maximum sequence length that this model might ever be used with.
+        original_max_position_embeddings (`int`, *optional*, defaults to 4096):
+            The maximum sequence length that this model was trained with. This is used to determine the size of the
+            original RoPE embeddings when using long scaling.
+        initializer_range (`float`, *optional*, defaults to 0.02):
+            The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
+        rms_norm_eps (`float`, *optional*, defaults to 1e-05):
+            The epsilon value used for the RMSNorm.
+        use_cache (`bool`, *optional*, defaults to `True`):
+            Whether or not the model should return the last key/values attentions (not used by all models). Only
+            relevant if `config.is_decoder=True`. Whether to tie weight embeddings or not.
+        tie_word_embeddings (`bool`, *optional*, defaults to `False`):
+            Whether to tie weight embeddings
+        rope_theta (`float`, *optional*, defaults to 10000.0):
+            The base period of the RoPE embeddings.
+        rope_scaling (`dict`, *optional*):
+            The scaling strategy for the RoPE embeddings. If `None`, no scaling is applied. If a dictionary, it must
+            contain the following keys: `type`, `short_factor` and `long_factor`. The `type` must be `longrope` and
+            the `short_factor` and `long_factor` must be lists of numbers with the same length as the hidden size
+            divided by the number of attention heads divided by 2.
+        bos_token_id (`int`, *optional*, defaults to 1):
+            The id of the "beginning-of-sequence" token.
+        eos_token_id (`int`, *optional*, defaults to 32000):
+            The id of the "end-of-sequence" token.
+        pad_token_id (`int`, *optional*, defaults to 32000):
+            The id of the padding token.
+        sliding_window (`int`, *optional*):
+            Sliding window attention window size. If `None`, no sliding window is applied.
+
+    Example:
+
+    ```python
+    >>> from transformers import Phi3Model, Phi3Config
+
+    >>> # Initializing a Phi-3 style configuration
+    >>> configuration = Phi3Config.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
+
+    >>> # Initializing a model from the configuration
+    >>> model = Phi3Model(configuration)
+
+    >>> # Accessing the model configuration
+    >>> configuration = model.config
+    ```"""
+
+    model_type = "phi3"
+    keys_to_ignore_at_inference = ["past_key_values"]
+
+    def __init__(
+        self,
+        vocab_size=32064,
+        hidden_size=3072,
+        intermediate_size=8192,
+        num_hidden_layers=32,
+        num_attention_heads=32,
+        num_key_value_heads=None,
+        resid_pdrop=0.0,
+        embd_pdrop=0.0,
+        attention_dropout=0.0,
+        hidden_act="silu",
+        max_position_embeddings=4096,
+        original_max_position_embeddings=4096,
+        initializer_range=0.02,
+        rms_norm_eps=1e-5,
+        use_cache=True,
+        tie_word_embeddings=False,
+        rope_theta=10000.0,
+        rope_scaling=None,
+        bos_token_id=1,
+        eos_token_id=32000,
+        pad_token_id=32000,
+        sliding_window=None,
+        **kwargs,
+    ):
+        self.vocab_size = vocab_size
+        self.hidden_size = hidden_size
+        self.intermediate_size = intermediate_size
+        self.num_hidden_layers = num_hidden_layers
+        self.num_attention_heads = num_attention_heads
+
+        if num_key_value_heads is None:
+            num_key_value_heads = num_attention_heads
+
+        self.num_key_value_heads = num_key_value_heads
+        self.resid_pdrop = resid_pdrop
+        self.embd_pdrop = embd_pdrop
+        self.attention_dropout = attention_dropout
+        self.hidden_act = hidden_act
+        self.max_position_embeddings = max_position_embeddings
+        self.original_max_position_embeddings = original_max_position_embeddings
+        self.initializer_range = initializer_range
+        self.rms_norm_eps = rms_norm_eps
+        self.use_cache = use_cache
+        self.rope_theta = rope_theta
+        self.rope_scaling = rope_scaling
+        self._rope_scaling_adjustment()
+        self._rope_scaling_validation()
+        self.sliding_window = sliding_window
+
+        super().__init__(
+            bos_token_id=bos_token_id,
+            eos_token_id=eos_token_id,
+            pad_token_id=pad_token_id,
+            tie_word_embeddings=tie_word_embeddings,
+            **kwargs,
+        )
+
+    def _rope_scaling_adjustment(self):
+        """
+        Adjust the `type` of the `rope_scaling` configuration for backward compatibility.
+        """
+        if self.rope_scaling is None:
+            return
+
+        rope_scaling_type = self.rope_scaling.get("type", None)
+
+        # For backward compatibility if previous version used "su" or "yarn"
+        if rope_scaling_type is not None and rope_scaling_type in ["su", "yarn"]:
+            self.rope_scaling["type"] = "longrope"
+
+    def _rope_scaling_validation(self):
+        """
+        Validate the `rope_scaling` configuration.
+        """
+        if self.rope_scaling is None:
+            return
+
+        if not isinstance(self.rope_scaling, dict) or len(self.rope_scaling) != 3:
+            raise ValueError(
+                "`rope_scaling` must be a dictionary with three fields, `type`, `short_factor` and `long_factor`, "
+                f"got {self.rope_scaling}"
+            )
+        rope_scaling_type = self.rope_scaling.get("type", None)
+        rope_scaling_short_factor = self.rope_scaling.get("short_factor", None)
+        rope_scaling_long_factor = self.rope_scaling.get("long_factor", None)
+        if rope_scaling_type is None or rope_scaling_type not in ["longrope"]:
+            raise ValueError(f"`rope_scaling`'s type field must be one of ['longrope'], got {rope_scaling_type}")
+        if not (
+            isinstance(rope_scaling_short_factor, list)
+            and all(isinstance(x, (int, float)) for x in rope_scaling_short_factor)
+        ):
+            raise ValueError(
+                f"`rope_scaling`'s short_factor field must be a list of numbers, got {rope_scaling_short_factor}"
+            )
+        if not len(rope_scaling_short_factor) == self.hidden_size // self.num_attention_heads // 2:
+            raise ValueError(
+                f"`rope_scaling`'s short_factor field must have length {self.hidden_size // self.num_attention_heads // 2}, got {len(rope_scaling_short_factor)}"
+            )
+        if not (
+            isinstance(rope_scaling_long_factor, list)
+            and all(isinstance(x, (int, float)) for x in rope_scaling_long_factor)
+        ):
+            raise ValueError(
+                f"`rope_scaling`'s long_factor field must be a list of numbers, got {rope_scaling_long_factor}"
+            )
+        if not len(rope_scaling_long_factor) == self.hidden_size // self.num_attention_heads // 2:
+            raise ValueError(
+                f"`rope_scaling`'s long_factor field must have length {self.hidden_size // self.num_attention_heads // 2}, got {len(rope_scaling_long_factor)}"
+            )
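The newly added `_rope_scaling_adjustment` and `_rope_scaling_validation` steps above define what a valid `rope_scaling` dictionary looks like: legacy `type` values "su" or "yarn" are remapped to "longrope", and both factor lists must have length hidden_size // num_attention_heads // 2, which is 48 for the default 3072 / 32 geometry. The snippet below is a minimal sketch and is not part of the commit; it assumes the configuration_phi3.py shown above is on the import path, and the factor values are placeholders rather than the factors shipped with Phi-3.5.

# Minimal sketch (not part of the commit): exercise the rope_scaling checks above.
# Assumes configuration_phi3.py is importable; factor values are placeholders.
from configuration_phi3 import Phi3Config

head_dim_half = 3072 // 32 // 2  # hidden_size // num_attention_heads // 2 == 48

config = Phi3Config(
    max_position_embeddings=131072,
    original_max_position_embeddings=4096,
    rope_scaling={
        "type": "su",                         # legacy name from earlier Phi-3 configs
        "short_factor": [1.0] * head_dim_half,
        "long_factor": [2.0] * head_dim_half,
    },
)
# _rope_scaling_adjustment() remapped the legacy type before validation ran.
assert config.rope_scaling["type"] == "longrope"

# A factor list of the wrong length is rejected by _rope_scaling_validation().
try:
    Phi3Config(rope_scaling={"type": "longrope",
                             "short_factor": [1.0] * 10,
                             "long_factor": [1.0] * head_dim_half})
except ValueError as err:
    print(err)  # ...short_factor field must have length 48, got 10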
    	
cuda/cuda-fp16/.azDownload-8f9058cd-a0ce-7c4d-6eea-9a48f67f2ddc-phi-3.5-mini-instruct-cuda-fp16.onnx.data
DELETED
File without changes
    	
cuda/cuda-fp16/phi-3.5-mini-instruct-cuda-fp16.onnx.data
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ec1a077e5ebf1072ef95af5712951af20ce33b003a53717a134b4522b6104067
+size 7642159104
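The three lines above are a Git LFS pointer, not the weights themselves: `oid` is the SHA-256 of the actual phi-3.5-mini-instruct-cuda-fp16.onnx.data payload and `size` is its byte length (about 7.6 GB). Below is a small sketch, not part of the commit, for checking a locally downloaded copy against the pointer; the local path is an assumption.

# Small sketch (not part of the commit): verify a downloaded .onnx.data file
# against the Git LFS pointer above. The local path is an assumption.
import hashlib
from pathlib import Path

path = Path("cuda/cuda-fp16/phi-3.5-mini-instruct-cuda-fp16.onnx.data")
expected_oid = "ec1a077e5ebf1072ef95af5712951af20ce33b003a53717a134b4522b6104067"
expected_size = 7642159104

digest = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        digest.update(chunk)

assert path.stat().st_size == expected_size, "size mismatch"
assert digest.hexdigest() == expected_oid, "sha256 mismatch"
print("LFS pointer matches the downloaded file")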
    	
cuda/cuda-int4-awq-block-128/.azDownload-8f9058cd-a0ce-7c4d-6eea-9a48f67f2ddc-phi-3.5-mini-instruct-cuda-int4-awq-block-128.onnx.data
DELETED
File without changes
    	
cuda/cuda-int4-awq-block-128/phi-3.5-mini-instruct-cuda-int4-awq-block-128.onnx.data
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dfba2f38f25040110fd7beb4475a342af7e4dae6f889a6cb7a52131387a42c95
+size 2277120000
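The int4 AWQ variant (block size 128) is the smaller of the two CUDA payloads, roughly 2.3 GB versus 7.6 GB for fp16, and these ONNX files are intended to be run through ONNX Runtime GenAI rather than loaded as plain inference sessions. The following is a rough sketch, not part of the commit, assuming the onnxruntime-genai-cuda Python package is installed and that the model folder also contains the genai_config.json and tokenizer files from this repository; exact method names vary between onnxruntime-genai releases.

# Rough sketch (not part of the commit): run the int4 AWQ model with ONNX Runtime GenAI.
# Assumes `pip install onnxruntime-genai-cuda` and a complete model folder
# (genai_config.json and tokenizer files alongside the .onnx and .onnx.data payloads).
import onnxruntime_genai as og

model = og.Model("cuda/cuda-int4-awq-block-128")
tokenizer = og.Tokenizer(model)

# Phi-3 chat template for a single user turn.
prompt = "<|user|>\nWhat is an ONNX external data file?<|end|>\n<|assistant|>\n"

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)
params.input_ids = tokenizer.encode(prompt)

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))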