dora-rs
torch==2.2.0
flash_attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
autoawq
autoawq-kernels
sounddevice
openai-whisper
pynput
opencv-python
Pillow
transformers
pyttsx3