Instructions for using Wan-AI/Wan2.2-T2V-A14B with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Wan2.2
How to use Wan-AI/Wan2.2-T2V-A14B with Wan2.2:
# No code snippets available yet for this library.
# To use this model, check the repository files and the library's documentation.
# Want to help? PRs adding snippets are welcome at:
# https://github.com/huggingface/huggingface.js
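Since no official snippet exists yet, here is a minimal sketch of one way to run the model, assuming the diffusers library's `WanPipeline` and a diffusers-format checkpoint; the `MODEL_ID` below is an assumption, so adjust it to the converted repository you actually use.

```python
# Hedged sketch, not an official snippet: text-to-video with Wan2.2-T2V-A14B
# via the diffusers WanPipeline. MODEL_ID is an assumed diffusers-format
# checkpoint name, not verified against the Hub.
MODEL_ID = "Wan-AI/Wan2.2-T2V-A14B-Diffusers"  # assumption: adjust as needed

def generation_kwargs(prompt, height=480, width=832, num_frames=81, steps=40):
    """Build the keyword arguments for a text-to-video call."""
    return {
        "prompt": prompt,
        "height": height,
        "width": width,
        "num_frames": num_frames,
        "num_inference_steps": steps,
    }

def generate(prompt):
    # Heavy path: downloads tens of GB of weights and needs a large GPU,
    # so the imports live here rather than at module top level.
    import torch
    from diffusers import WanPipeline

    pipe = WanPipeline.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    pipe.to("cuda")
    # Returns the frames of the first (only) video in the batch.
    return pipe(**generation_kwargs(prompt)).frames[0]

if __name__ == "__main__":
    kwargs = generation_kwargs("A red panda napping on a mossy branch")
    print(kwargs["num_frames"])
```

The returned frames can then be written to disk, for example with `diffusers.utils.export_to_video`. Treat the resolution, frame count, and step defaults as illustrative placeholders rather than recommended settings.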
- Inference
- Notebooks
- Google Colab
- Kaggle
Community discussions:
Long-awaited Wan 2.2 training tutorial has arrived - runs on GPUs with as little as 6 GB VRAM - 1-click install, setup, and training on both Windows and cloud
#15 opened 5 months ago by MonsterMMORPG
wan bench 2.0
#14 opened 5 months ago by oucding
🚩 Report: Spam
#13 opened 6 months ago by ishwor2048
Where is the Wan2.2-T2V-A14B model?
1
#11 opened 9 months ago by bidedadi12345
.safetensors -> .onnx -> .engine (?)
#10 opened 9 months ago by manchuwook
testing
1
#9 opened 9 months ago by desaikushal
What is the lowest FlashAttn version necessary to run Wan2.2?
➕ 1
#8 opened 9 months ago by ajtakto
Vibe-coded a simple but effective prompt doc for Wan 2.2 with over 100 examples
#7 opened 9 months ago by jikkujose
Why do the A14B models use the Wan2.1 VAE?
#4 opened 10 months ago by CHNtentes
Awesome but...
1
#3 opened 10 months ago by zamasam
Local Installation Video and Testing - Step by Step
👍 1
#2 opened 10 months ago by fahdmirzac
License?
1
#1 opened 10 months ago by merve