Using a webcam to control a Live2D Cubism model in Linux
TL;DR: Code on GitHub and self-hosted; demo video below.
I was researching how to get FaceRig to work on Linux, and eventually concluded that it is pretty much impossible. So I thought, hey, why not just code it myself? It can’t be that difficult, right? I mean, it probably won’t be as feature-rich as the original, but all I need / want is the basic “avatar moving along with my face” functionality.
So I spent a couple of weekends digging through the (very poorly translated) Live2D docs (still, kudos to Live2D Inc for actually translating them), struggled with some questionable anti-patterns, and arrived at this:
Demo video (also available on YouTube)
Source code is available on GitHub and on my own server. Yes, the code is kind of messy and could do with some refactoring, but it works, so I couldn't be bothered. ¯\_(ツ)_/¯
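To give a flavour of what the core of the tracking actually boils down to: the webcam side produces a handful of measurements (head rotation angles, eye and mouth openness), and those just need to be clamped and remapped into the ranges that the standard Cubism parameters (ParamAngleX/Y/Z, ParamEyeLOpen, ParamMouthOpenY) expect. Below is a minimal, self-contained sketch of that mapping step. The struct names, field ranges, and eye/mouth thresholds are illustrative guesses, not the project's actual code, and the final hand-off to the SDK is only indicated in a comment.

```cpp
// Build (assumption): g++ -std=c++17 params_sketch.cpp -o params_sketch
#include <algorithm>
#include <cstdio>

// Raw measurements a facial-landmark tracker might produce.
// (Illustrative names, not the structs used in the actual project.)
struct FaceMeasurements {
    double yawDeg;       // head rotation left/right, degrees
    double pitchDeg;     // head rotation up/down, degrees
    double rollDeg;      // head tilt, degrees
    double leftEyeOpen;  // eye aspect ratio, roughly 0 (closed) .. 0.35 (open)
    double mouthOpen;    // mouth aspect ratio, roughly 0 .. 1
};

// Values in the ranges that the standard Cubism parameters expect
// (ParamAngleX/Y/Z are typically -30..30, ParamEyeLOpen and
// ParamMouthOpenY are 0..1).
struct CubismParams {
    double angleX, angleY, angleZ;
    double eyeLOpen;
    double mouthOpenY;
};

static double clampTo(double v, double lo, double hi) {
    return std::max(lo, std::min(hi, v));
}

// Linearly remap v from [inLo, inHi] to [outLo, outHi], clamped.
static double remap(double v, double inLo, double inHi,
                    double outLo, double outHi) {
    double t = (clampTo(v, inLo, inHi) - inLo) / (inHi - inLo);
    return outLo + t * (outHi - outLo);
}

// Map raw tracker output to Cubism parameter ranges.
CubismParams mapToCubism(const FaceMeasurements& m) {
    CubismParams p;
    p.angleX     = clampTo(m.yawDeg,   -30.0, 30.0);
    p.angleY     = clampTo(m.pitchDeg, -30.0, 30.0);
    p.angleZ     = clampTo(m.rollDeg,  -30.0, 30.0);
    p.eyeLOpen   = remap(m.leftEyeOpen, 0.15, 0.30, 0.0, 1.0); // thresholds are guesses
    p.mouthOpenY = remap(m.mouthOpen,   0.05, 0.60, 0.0, 1.0);
    return p;
}

int main() {
    FaceMeasurements m{12.0, -5.0, 2.0, 0.27, 0.4};
    CubismParams p = mapToCubism(m);
    std::printf("ParamAngleX=%.1f ParamAngleY=%.1f ParamAngleZ=%.1f "
                "ParamEyeLOpen=%.2f ParamMouthOpenY=%.2f\n",
                p.angleX, p.angleY, p.angleZ, p.eyeLOpen, p.mouthOpenY);
    // In a real renderer these values would be fed to the model every frame,
    // e.g. via CubismModel::SetParameterValue() in the Cubism native SDK.
    return 0;
}
```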
With some input and suggestions from another user, I have also created a spin-off project where, instead of facial tracking, the avatar is controlled by the mouse cursor position and is customizable via a UI (a rough sketch of that cursor-to-parameter mapping follows the demo list below).
Some more demos (not really that important, so I haven't bothered self-hosting them):
- Comparing facial tracking and mouse tracking
- CLI controls for mouse tracking
- GUI controls for mouse tracking
Source code for this spin-off project: GitHub, self-hosted.
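The mouse-tracking idea is even simpler than the facial one: read the cursor position, normalise it around the screen centre, and scale it into the head-angle range. Here is a minimal sketch of that loop on X11, assuming the standard -30..30 range for ParamAngleX/ParamAngleY; the actual spin-off project's polling, smoothing, and configuration are more involved than this.

```cpp
// Build (assumption): g++ -std=c++17 mouse_sketch.cpp -o mouse_sketch -lX11
#include <X11/Xlib.h>
#include <cstdio>
#include <unistd.h>

int main() {
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) {
        std::fprintf(stderr, "Cannot open X display\n");
        return 1;
    }
    Window root = DefaultRootWindow(dpy);
    const double screenW = DisplayWidth(dpy, DefaultScreen(dpy));
    const double screenH = DisplayHeight(dpy, DefaultScreen(dpy));

    for (int i = 0; i < 100; ++i) {  // poll a few times for the demo
        Window retRoot, retChild;
        int rootX, rootY, winX, winY;
        unsigned int mask;
        XQueryPointer(dpy, root, &retRoot, &retChild,
                      &rootX, &rootY, &winX, &winY, &mask);

        // Normalise the cursor position to [-1, 1] around the screen centre,
        // then scale to the -30..30 range of ParamAngleX / ParamAngleY.
        double angleX = (rootX / screenW * 2.0 - 1.0) * 30.0;
        double angleY = -(rootY / screenH * 2.0 - 1.0) * 30.0;  // up is positive

        std::printf("ParamAngleX=%.1f ParamAngleY=%.1f\n", angleX, angleY);
        usleep(33000);  // ~30 updates per second
    }

    XCloseDisplay(dpy);
    return 0;
}
```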
Please feel free to get in touch for any comments, suggestions, bug reports, questions, or any kind of feedback!