Hi!
Now I can animate a 2D character in Spine using my face for motion tracking
Just a face reference with tracking markers and a bit of Python script magic
So… I’m not a developer. I used AI to write the script.
But now I have a tracking script that reads the markers' coordinates from the video reference (a PNG sequence) and converts them into JSON bone animation for Spine.
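For anyone curious, here’s a rough idea of the tracking half. This is a minimal sketch, not my exact script: it assumes simple high-contrast dot markers, uses OpenCV’s blob detector, and the folder layout and sort-by-x trick are just for illustration.

```python
# Minimal sketch: read a PNG sequence and collect per-frame marker centers.
# Assumes dark, high-contrast dot markers that OpenCV's default blob detector can find.
import cv2
import glob

def track_markers(frame_dir):
    """Return a list of per-frame marker centers [(x, y), ...] from a PNG sequence."""
    detector = cv2.SimpleBlobDetector_create()
    tracks = []
    for path in sorted(glob.glob(f"{frame_dir}/*.png")):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        keypoints = detector.detect(gray)
        # Sort by x so each marker keeps the same index (and bone) from frame to frame
        points = sorted((kp.pt for kp in keypoints), key=lambda p: p[0])
        tracks.append([(round(x, 2), round(y, 2)) for x, y in points])
    return tracks
```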
I did it
PNG sequence → Marker coordinates → JSON → Bone animation in Spine.
No manual keyframing - just tracking → JSON → Spine animation
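And the JSON half, equally hedged: this sketch writes translate timelines in the shape of Spine’s JSON animation format. The bone names, frame rate, and the “first frame is the rest pose” idea are my assumptions for the example, not necessarily how the real script maps markers to bones.

```python
# Hedged sketch: turn tracked (x, y) points into Spine-style translate timelines.
import json

FPS = 30
BONE_NAMES = ["jaw", "brow_left", "brow_right"]  # hypothetical marker-to-bone mapping

def to_spine_animation(tracks, out_path="face_track.json"):
    rest = tracks[0]  # treat the first frame as the rest pose
    bones = {}
    for i, bone in enumerate(BONE_NAMES):
        keys = []
        for frame, points in enumerate(tracks):
            dx = points[i][0] - rest[i][0]
            dy = rest[i][1] - points[i][1]  # flip Y: image space is top-down, Spine is bottom-up
            keys.append({"time": round(frame / FPS, 4), "x": round(dx, 2), "y": round(dy, 2)})
        bones[bone] = {"translate": keys}
    data = {"animations": {"face_track": {"bones": bones}}}
    with open(out_path, "w") as f:
        json.dump(data, f, indent=2)
```

The resulting file can then be brought into the Spine project as imported animation data.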
More details on my LinkedIn
It’s not perfect yet. The tracking script still needs polishing.
But it already works and opens up exciting new possibilities for bringing face tracking into 2D animation pipelines in Spine software.
You can now animate a character’s face with your own
More to come. Stay tuned...
P.S. I used a face video reference from the Nagapimocap channel: https://lnkd.in/eiqGdmmY