Originally, I intended to create a generative visualization that responds to facial movements in a fluid, organic way. I wanted the facial keypoints to emerge dynamically rather than appearing all at once, so that the face would feel as if it were being “sketched” in real time, with lines and points forming in an unpredictable yet structured way. I was also interested in incorporating motion trails and noise distortions to make the drawing feel alive and reactive rather than like a rigid face-tracking overlay.
https://editor.p5js.org/wallflower/full/kvE9FrGe4
• First, I set up ml5.js FaceMesh to track facial keypoints.
• I experimented with drawing basic keypoints to understand their behavior.
• To avoid a static appearance, I applied Perlin noise to each keypoint’s position, adding an organic, sketch-like effect.
• At first, I applied large noise values, but they distorted the face too much, so I reduced the range.
• Instead of drawing all keypoints at once, I introduced a counter, currentPoint, that gradually reveals the face a few keypoints per frame.
• This approach mimics the way an artist sketches—starting with a few lines and progressively adding details.
• I mapped the x/y positions of keypoints to color values to create dynamic color variations.
• I also connected keypoints with lines, making the sketch feel more cohesive.
• To enhance the sense of movement, I stored the previous positions of keypoints and created fading trails.
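The steps above can be sketched as a handful of small functions. The actual project runs in the browser on top of p5.js and ml5.js FaceMesh, so this is only an illustrative outline of the core logic: `noise2D` is a crude stand-in for p5's Perlin `noise()`, `mapRange` mirrors p5's `map()`, and names like `jitterPoint`, `revealCount`, and `pushTrail` are my own labels for the ideas described, not identifiers from the original sketch.

```javascript
// Linear re-map of a value from one range to another, like p5's map().
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Placeholder for Perlin noise: deterministic pseudo-noise in [0, 1).
// The real sketch would use p5's smoother noise() instead.
function noise2D(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s);
}

// Offset a keypoint by a small noise value so the face looks hand-drawn
// rather than static; `amount` is the (reduced) jitter range in pixels.
function jitterPoint(pt, t, amount = 3) {
  return {
    x: pt.x + mapRange(noise2D(pt.x * 0.01, t), 0, 1, -amount, amount),
    y: pt.y + mapRange(noise2D(pt.y * 0.01, t + 100), 0, 1, -amount, amount),
  };
}

// Progressive reveal: only the first `currentPoint` keypoints are drawn,
// and the counter advances by `step` each frame until all are visible.
function revealCount(currentPoint, total, step = 2) {
  return Math.min(currentPoint + step, total);
}

// Map a keypoint's x/y position onto RGB channels for dynamic color.
function pointColor(pt, width, height) {
  return {
    r: Math.round(mapRange(pt.x, 0, width, 0, 255)),
    g: Math.round(mapRange(pt.y, 0, height, 0, 255)),
    b: 200,
  };
}

// Fixed-length trail buffer: append the newest frame's keypoints and drop
// the oldest, so older frames can be drawn with decreasing opacity.
function pushTrail(trails, points, maxLen = 10) {
  const next = trails.concat([points]);
  return next.length > maxLen ? next.slice(next.length - maxLen) : next;
}
```

In the browser sketch, each frame would jitter and color the first `revealCount` keypoints from FaceMesh, draw the buffered trail frames at fading alpha, then push the current frame into the buffer.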