Tags: Text Generation · Transformers · Safetensors · Korean · English · llama · conversational · text-generation-inference · Inference Endpoints
llama-3.2-3B-wildguard-ko-2410 / special_tokens_map.json

Commit History

Upload tokenizer (64d0d22, verified) — committed by heegyu