Testing the position of the head, blinking, and whether the mouth opens.
The idea is to generate a digital rotoscope of the face for future animations. Rotoscoping is a very old technique in which the movements in a video are traced to generate an animation.
A classic example of this is the Fleischers' Gulliver's Travels; in fact they used that technique a lot: Betty Boop's dances were rotoscopes of very famous performers of the time.
Lately the idea is to capture a person's face to drive an animation. Formerly this looked a little rough and you could always tell it was a drawn or painted video, but the tools available today make it much more convincing.
In my current tests I am capturing, in a very simple way, the head position as numbers from -1 to 1 and the mouth opening as 0 or 1.
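As a sketch of what one frame of that output could look like (hypothetical: the real expresion.py computes these values from the camera, and its exact line format is an assumption here), each frame is a line with the horizontal and vertical head position in [-1, 1] and the mouth flag as 0 or 1:

```python
def format_frame(rh, rv, mouth):
    """Format one frame as 'RH RV MOUTH' (hypothetical output format)."""
    # Clamp the head positions to [-1, 1] and the mouth flag to {0, 1}.
    rh = max(-1.0, min(1.0, rh))
    rv = max(-1.0, min(1.0, rv))
    return f"{rh:.2f} {rv:.2f} {1 if mouth else 0}"

if __name__ == "__main__":
    # Three fake frames: head turning right while the mouth opens.
    for frame in [(-0.5, 0.1, 0), (0.0, 0.2, 1), (0.7, -0.3, 0)]:
        print(format_frame(*frame))
```

One line per frame keeps the stream trivial to pipe into a shell loop or any other program.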
The idea is to send this data to other programs such as Blender, Synfig Studio, or OpenToonz, or to generate an animation on the fly using ffmpeg, making a live puppet for streaming-type broadcasts.
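On the receiving side, any of those programs only needs to parse the stream. A minimal Python sketch of a consumer (the "RH RV MOUTH" line format and the 10-pixels-per-unit scale are assumptions, matching the xdotool example):

```python
import sys

SCALE = 10  # pixels per unit of head movement (arbitrary choice)

def to_offsets(line):
    """Parse one 'RH RV MOUTH' line into pixel offsets and a mouth flag."""
    rh, rv, mouth = line.split()
    # Invert the vertical axis: screen Y grows downward.
    return int(SCALE * float(rh)), int(-SCALE * float(rv)), int(mouth)

if __name__ == "__main__":
    # Feed this from the tracker, e.g.:
    #   python3 expresion.py | python3 consumer.py
    for line in sys.stdin:
        if line.strip():
            dx, dy, mouth = to_offsets(line)
            print(f"dx={dx} dy={dy} mouth={mouth}")
```

The same parsing could just as well drive bone positions in Blender or a frame generator fed to ffmpeg.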
Example: moving the mouse with the head:
python3 expresion.py | while read RH RV MOUTH; do  # assumes one "RH RV MOUTH" line per frame
    xdotool mousemove_relative -- "$(echo "scale=0; 10*$RH/1" | bc)" "$(echo "scale=0; -10*$RV/1" | bc)"
done