WebLLM Phi 3.5 Chat
This Space enables AI chat with Phi 3.5 models directly in your browser, powered by WebLLM. All inference runs locally on your device rather than on a server.
Step 1: Configure and Download the Model
Quantization: q4f16 or q4f32
Context Window: 1024 or 4096 tokens
Temperature: 1.00 (default)
Top-p: 0.00 (default)
Presence Penalty: 0.00 (default)
Frequency Penalty: 0.00 (default)
Choose your settings, then click Download to fetch and cache the selected model in the browser (see the initialization sketch below).
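For reference, the same setup can be done programmatically with the WebLLM JavaScript API. This is a minimal sketch under stated assumptions, not this Space's actual source: the model id "Phi-3.5-mini-instruct-q4f16_1-MLC" and the context_window_size option are assumptions based on WebLLM's prebuilt model list and chat options.

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Download (or load from cache) a Phi 3.5 build and initialize the engine.
// Model id and chat options below are assumptions for illustration.
const engine = await CreateMLCEngine(
  "Phi-3.5-mini-instruct-q4f16_1-MLC", // q4f16 quantization variant (assumed id)
  {
    // Report download/compile progress, similar to the Space's progress bar.
    initProgressCallback: (report) => console.log(report.text),
  },
  {
    context_window_size: 4096, // "Context Window" setting (assumed option name)
  }
);
```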
Step 2: Chat
Type a message and press Send to chat with the downloaded model.
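Under the hood, WebLLM exposes an OpenAI-style chat completions API, so sending a message roughly corresponds to a call like the sketch below. The sampling values mirror the defaults shown in Step 1, and the prompt text is only an example.

```typescript
// Ask the initialized engine for a reply; parameters mirror the Step 1 defaults.
const reply = await engine.chat.completions.create({
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Explain what WebLLM does in one sentence." },
  ],
  temperature: 1.0,
  top_p: 0.0, // UI default shown above; raise for more varied sampling
  presence_penalty: 0.0,
  frequency_penalty: 0.0,
});

console.log(reply.choices[0].message.content);
```

For token-by-token output like the chat window shows, the same request can be made with stream: true and consumed as an async iterable of chunks.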