Original model: xiaol/RWKV-v5.2-7B-Role-play-16k

You can run this model with ai00_rwkv_server.

Although ai00_rwkv_server is mainly aimed at low-end PCs, you can also run it on servers that support Vulkan.
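A minimal launch sketch is shown below. The binary name is from the project, but the model path, flag names, and port here are assumptions; check the ai00_rwkv_server README for the exact options of your release.

# Hypothetical launch; exact flags may differ between releases.
./ai00_rwkv_server --model /path/to/RWKV-v5.2-7B-Role-play-16k.st --port 65530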

To try it in Colab, first install the libnvidia-gl-* package:

!apt -y install libnvidia-gl-535
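Once the server is running, you can query it over HTTP from the notebook. A minimal curl sketch, assuming an OpenAI-style chat completions endpoint on the default port (both the path and port 65530 are assumptions; see the server's README for the exact values):

# Path and port are assumptions; adjust to match your ai00_rwkv_server version.
!curl http://127.0.0.1:65530/api/oai/chat/completions -H "Content-Type: application/json" -d '{"messages":[{"role":"user","content":"Hello!"}],"max_tokens":128}'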

Original Model Card:

RWKV (Claude-style) role play is better with v5.2: more logical and reasonable, and able to follow instructions.
