change warning background color

app.py CHANGED

@@ -131,8 +131,8 @@ with demo:
             <br>\
             <p>Inspired from the <a href="https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard">🤗 Open LLM Leaderboard</a> and <a href="https://huggingface.co/spaces/optimum/llm-perf-leaderboard">🤗 Open LLM-Perf Leaderboard 🏋️</a>, we compare performance of base multilingual code generation models on <a href="https://huggingface.co/datasets/openai_humaneval">HumanEval</a> benchmark and <a href="https://huggingface.co/datasets/nuprl/MultiPL-E">MultiPL-E</a>. We also measure throughput and provide\
             information about the models. We only compare open pre-trained multilingual code models, that people can start from as base models for their trainings.</p>
-            <div style='background-color:
-            <p>Warning
+            <div style='background-color: #F5F1CB; text-align: center; padding: 10px;'>
+            <p><b>Warning</b>: This leaderboard was last updated as of the release of <a href="https://huggingface.co/deepseek-ai/deepseek-coder-33b-instruct">DeepSeek-Coder-33b-instruct</a> on November 2023. Stronger models might have been released since, check the <b>Submit Results</b> section for submitting new evaluation results for the leaderboard.
             You can also check other code leaderboards like <a href="https://huggingface.co/spaces/mike-ravkine/can-ai-code-results">Can-AI-Code</a> .</p>
             </div>""",
             elem_classes="markdown-text",
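The changed lines live inside an HTML string that `app.py` passes to a Gradio markdown/HTML component. A minimal sketch of how the updated banner string might be assembled, using only the colour and text visible in the diff (the `build_warning_banner` helper is hypothetical; in the actual app the string is written inline):

```python
# Sketch of the warning banner introduced by this commit. The background
# colour #F5F1CB and the div/paragraph layout come from the diff; the
# helper function itself is a hypothetical refactoring for illustration.
WARNING_BG = "#F5F1CB"  # new pale-yellow background set by this commit


def build_warning_banner(bg: str = WARNING_BG) -> str:
    """Return the centred warning <div> shown in the leaderboard header."""
    return (
        f"<div style='background-color: {bg}; text-align: center; padding: 10px;'>"
        "<p><b>Warning</b>: This leaderboard was last updated as of the release of "
        "<a href='https://huggingface.co/deepseek-ai/deepseek-coder-33b-instruct'>"
        "DeepSeek-Coder-33b-instruct</a> on November 2023.</p>"
        "</div>"
    )


banner = build_warning_banner()
```

Keeping the colour in one constant would make future background tweaks like this commit a one-line change instead of an edit inside a long inline HTML string.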