readme
README.md CHANGED
@@ -1,81 +1,14 @@
-
-
-
-
-
-
-<!-- [![Paper](https://img.shields.io/badge/Paper-MotionCtrl-red)](https://wzhouxiff.github.io/projects/MotionCtrl/assets/paper/MotionCtrl.pdf)   [![arXiv](https://img.shields.io/badge/arXiv-2312.03641-red)](https://arxiv.org/pdf/2312.03641.pdf)   [![Project Page](https://img.shields.io/badge/Project%20Page-MotionCtrl-red)](https://wzhouxiff.github.io/projects/MotionCtrl/)   [![Demo](https://img.shields.io/badge/Demo-MotionCtrl-orange)]() -->
-
-[![Paper](https://img.shields.io/badge/Paper-gray)](https://wzhouxiff.github.io/projects/MotionCtrl/assets/paper/MotionCtrl.pdf)   [![arXiv](https://img.shields.io/badge/arXiv-red)](https://arxiv.org/pdf/2312.03641.pdf)   [![Project Page](https://img.shields.io/badge/Project%20Page-green)](https://wzhouxiff.github.io/projects/MotionCtrl/)   [![Demo](https://img.shields.io/badge/Gradio%20Demo-orange)]()
-
-[Zhouxia Wang](https://vvictoryuki.github.io/website/)<sup>1,2</sup>, [Ziyang Yuan](https://github.com/jiangyzy)<sup>1,4</sup>, [Xintao Wang](https://xinntao.github.io/)<sup>1,3</sup>, [Tianshui Chen](http://tianshuichen.com/)<sup>6</sup>, [Menghan Xia](https://menghanxia.github.io/)<sup>3</sup>, [Ping Luo](http://luoping.me/)<sup>2,5</sup>, [Ying Shan](https://scholar.google.com/citations?hl=zh-CN&user=4oXBp9UAAAAJ)<sup>1,3</sup>
-
-<sup>1</sup> ARC Lab, Tencent PCG, <sup>2</sup> The University of Hong Kong, <sup>3</sup> Tencent AI Lab, <sup>4</sup> Tsinghua University, <sup>5</sup> Shanghai AI Laboratory, <sup>6</sup> Guangdong University of Technology
-
-
-</div>
-
-<!-- ## Results of MotionCtrl -->
-
-Our proposed <b>MotionCtrl</b> independently controls the complex camera motion and object motion of generated videos with <b>a single unified</b> model.
-Here are some results obtained with <b>MotionCtrl</b>; more are showcased on our [Project Page](https://wzhouxiff.github.io/projects/MotionCtrl/).
-
32 |
-
<!-- </br>
|
33 |
-
<video poster="" id="steve" autoplay controls muted loop playsinline height="100%" width="100%">
|
34 |
-
<source src="https://wzhouxiff.github.io/projects/MotionCtrl/assets/videos/teasers/camera_d971457c81bca597.mp4" type="video/mp4">
|
35 |
-
</video>
|
36 |
-
<video poster="" id="steve" autoplay controls muted loop playsinline height="100%" width="100%">
|
37 |
-
<source src="https://wzhouxiff.github.io/projects/MotionCtrl/assets/videos/teasers/camera_Round-R_ZoomIn.mp4" type="video/mp4">
|
38 |
-
</video>
|
39 |
-
<video poster="" id="steve" autoplay controls muted loop playsinline height="100%" width="100%">
|
40 |
-
<source src="https://wzhouxiff.github.io/projects/MotionCtrl/assets/videos/teasers/shake_1.mp4" type="video/mp4">
|
41 |
-
</video>
|
42 |
-
<video poster="" id="steve" autoplay controls muted loop playsinline height="100%" width="100%">
|
43 |
-
<source src="https://wzhouxiff.github.io/projects/MotionCtrl/assets/videos/teasers/s_curve_3_v1.mp4" type="video/mp4">
|
44 |
-
</video> -->
|
45 |
-
|
-<div align="center">
-  <img src="assets/hpxvu-3d8ym.gif" width="600">
-  <img src="assets/w3nb7-9vz5t.gif" width="600">
-  <img src="assets/62n2a-wuvsw.gif" width="600">
-  <img src="assets/ilw96-ak827.gif" width="600">
-</div>
-
-
-## Updates
-- [x] Release MotionCtrl deployed on *LVDM/VideoCrafter*
-- [ ] Gradio demo available
-
-<!-- ## Training
-sh configs/training/train_cmcm.sh
-sh configs/training/train_omcm_dense.sh
-sh configs/training/train_omcm_sparse.sh -->
-
-## Inference
-
-1. Download the MotionCtrl weights from [weipan](https://drive.weixin.qq.com/s?k=AJEAIQdfAAogLtIAPh) into `./checkpoints`.
-2. Open `configs/inference/run.sh` and set `condtype` to 'camera_motion', 'object_motion', or 'both'.
-    - `condtype=camera_motion` controls only the **camera motion** in the generated video.
-    - `condtype=object_motion` controls only the **object motion** in the generated video.
-    - `condtype=both` controls the camera motion and the object motion in the generated video **simultaneously**.
-3. Run `sh configs/inference/run.sh`.
-
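The three inference steps above (from the removed README) can be sketched as a shell session. The download itself is omitted and the final command is only echoed rather than executed, since the checkpoint and run script exist only in the actual repository:

```shell
# Sketch of the inference workflow described above. The paths are the
# repo's own; the download and the run invocation are illustrative only.

# 1. Weights from weipan go into ./checkpoints.
mkdir -p checkpoints

# 2. Pick the conditioning type: camera_motion, object_motion, or both.
condtype=camera_motion

# 3. The entry point would then be invoked as:
echo "condtype=${condtype}: sh configs/inference/run.sh"
```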
-## Citation
-If you make use of our work, please cite our paper.
-```bibtex
-@inproceedings{wang2023motionctrl,
-  title={MotionCtrl: A Unified and Flexible Motion Controller for Video Generation},
-  author={Wang, Zhouxia and Yuan, Ziyang and Wang, Xintao and Chen, Tianshui and Xia, Menghan and Luo, Ping and Shan, Ying},
-  booktitle={arXiv preprint arXiv:2312.03641},
-  year={2023}
-}
-```
+---
+title: MotionCtrl
+emoji: π
+colorFrom: indigo
+colorTo: pink
+sdk: gradio
+sdk_version: 4.12.0
+app_file: app.py
+pinned: false
+license: apache-2.0
+---
+
+Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
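As a quick illustration of the front matter added above, a minimal stdlib-only sketch that reads the flat `key: value` pairs between the `---` markers. Real Hugging Face Spaces tooling uses a full YAML parser; splitting on `": "` is an assumption that happens to work for this simple config:

```python
# Parse the Space front matter shown in the diff above. Splitting each
# line on ": " is a simplification valid only for flat key/value pairs.
front_matter = """\
title: MotionCtrl
emoji: π
colorFrom: indigo
colorTo: pink
sdk: gradio
sdk_version: 4.12.0
app_file: app.py
pinned: false
license: apache-2.0
"""

config = {}
for line in front_matter.strip().splitlines():
    key, _, value = line.partition(": ")
    config[key] = value

# The sdk/sdk_version pair is what tells the Hub how to build the Space.
print(config["sdk"], config["sdk_version"])  # → gradio 4.12.0
```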