MobileGaze JS
Real-Time Gaze Estimation
This tool brings the yakhyo/gaze-estimation ONNX inference path
into the EZ-MMLA Toolkit as a browser-based webcam workflow.
It keeps the same core model behavior as the upstream repository: faces are
cropped, resized to 448×448, normalized with ImageNet
statistics, passed through the gaze estimation model, and decoded into yaw
and pitch angles before a gaze vector is drawn back on screen.
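The preprocessing and decoding steps above can be sketched as follows. This is an illustrative sketch, not the tool's actual source: the function names are hypothetical, and the yaw/pitch-to-screen projection uses one common convention that may differ from the upstream repository's.

```javascript
// ImageNet normalization statistics (RGB order).
const IMAGENET_MEAN = [0.485, 0.456, 0.406];
const IMAGENET_STD = [0.229, 0.224, 0.225];

// Convert raw RGBA pixel data (e.g. from canvas.getImageData on a
// 448x448 face crop) into a normalized CHW Float32Array for the model.
function preprocess(rgba, width = 448, height = 448) {
  const planeSize = width * height;
  const input = new Float32Array(3 * planeSize);
  for (let i = 0; i < planeSize; i++) {
    for (let c = 0; c < 3; c++) {
      const v = rgba[i * 4 + c] / 255; // scale to [0, 1]
      input[c * planeSize + i] = (v - IMAGENET_MEAN[c]) / IMAGENET_STD[c];
    }
  }
  return input;
}

// Project yaw/pitch (degrees) to a 2D screen offset for drawing the gaze
// vector on a canvas overlay; one common convention, y flipped for
// canvas coordinates.
function gazeToScreenVector(yawDeg, pitchDeg, length = 100) {
  const yaw = (yawDeg * Math.PI) / 180;
  const pitch = (pitchDeg * Math.PI) / 180;
  return {
    dx: -length * Math.cos(pitch) * Math.sin(yaw),
    dy: -length * Math.sin(pitch),
  };
}
```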
Within the Toolkit, it follows the same run workflow as the other JS tools: webcam capture, recording controls, downloadable CSV predictions, and an overview panel summarizing the session.
| Output | Description |
|---|---|
| yaw_degrees | Horizontal gaze angle predicted by the ONNX model. |
| pitch_degrees | Vertical gaze angle predicted by the ONNX model. |
| face_score | Confidence score from the face detector used to crop the face. |
| face_index | Per-frame face identifier for multi-person recordings. |
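For downstream analysis, the downloaded CSV can be read back into per-frame records. A minimal sketch, assuming the column names listed above appear in the CSV header; the actual export may include additional columns such as timestamps:

```javascript
// Parse a predictions CSV into an array of objects keyed by column name,
// converting numeric cells to numbers. Assumes no quoted fields.
function parseGazeCsv(text) {
  const [headerLine, ...rows] = text.trim().split("\n");
  const headers = headerLine.split(",");
  return rows.map((row) => {
    const record = {};
    row.split(",").forEach((cell, i) => {
      const n = Number(cell);
      record[headers[i]] = Number.isNaN(n) ? cell : n;
    });
    return record;
  });
}
```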