<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>llama-cpp-wasm</title>
<link rel="icon" type="image/png" href="favicon.png" />
<!-- picocss -->
<link
rel="stylesheet"
href="https://cdn.jsdelivr.net/npm/@picocss/pico@2/css/pico.min.css"
/>
</head>
<body>
<header class="container">
<hgroup>
<h1><a href="/">llama-cpp-wasm</a></h1>
<p> WebAssembly (Wasm) build and bindings for <a href="https://github.com/ggerganov/llama.cpp" target="_blank">llama.cpp</a>. </p>
<p> This demo lets you run LLMs directly in your browser using JavaScript, WebAssembly, and llama.cpp. </p>
<p> Repository: <a href="https://github.com/tangledgroup/llama-cpp-wasm"> https://github.com/tangledgroup/llama-cpp-wasm </a></p>
</hgroup>
</header>
<main class="container">
<section>
<img src="img/run-llama-cpp-in-browser-twitter-fs8.png" alt="llama-cpp-wasm" />
</section>
<hr />
<section>
<h2> In-Browser Demos </h2>
<ul>
<li><a href="/example-single-thread.html"> &#128034; &nbsp; <b> single-thread </b> wasm32 </a></li>
<li><a href="/example-multi-thread.html"> &#128007; &nbsp; <b> multi-thread </b> wasm32 </a></li>
</ul>
</section>
<hr />
<section>
<h2> Ecosystem </h2>
<article class="component">
<div class="grid">
<div>
<a href="https://github.com/WebAssembly">
<img
src="img/wasm.png"
alt="wasm" />
</a>
</div>
<div>
<a href="https://en.wikipedia.org/wiki/JavaScript">
<img
src="img/js.png"
alt="js javascript" />
</a>
</div>
<div>
<a href="https://github.com/ggerganov/llama.cpp">
<img
src="img/llamacpp.png"
alt="llama.cpp" />
</a>
</div>
<div>
<a href="https://tangledgroup.com">
<img
src="img/tangledgroup.png"
alt="tangledgroup tangledlabs tangledhub tangledcloud tangledlab" />
</a>
</div>
</div>
</article>
</section>
</main>
</body>
</html>