PalmTrack
On-Palm Finger Tracking Using a Wrist-Worn Camera
PalmTrack originates from my vision of expanding the limits of human–computer interaction by transforming the palm into an absolute-positioning digitizer.
Pointing is fundamental to human–computer interaction, yet conventional devices such as mice and touchpads depend on dedicated surfaces, limiting their suitability for ubiquitous computing. While alternatives like gaze and mid-air hand tracking exist, they suffer from issues such as fatigue and lack of tactile feedback. The human palm, with its flexibility and sensitivity, offers an ideal always-available input surface. We present PalmTrack, a wrist-worn system that transforms the palm into a touch surface using an infrared camera and computer vision algorithms. PalmTrack detects absolute finger positions and finger–palm contact in real time without calibration. A multi-task transformer network achieves a 5.9 mm mean positioning error and 98.7% touch detection accuracy (leave-one-out, N=17). Fitts’ law analysis shows input efficiency comparable to mobile touchscreens. PalmTrack further supports handwriting, eyes-free character entry, and robust outdoor use. These results demonstrate PalmTrack as a practical, accurate, and efficient input method for ubiquitous computing.
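For readers unfamiliar with the metric behind the efficiency claim, the sketch below computes standard Fitts' law throughput in its Shannon formulation; the distances and times are invented examples, not values from the paper.

    # Illustrative sketch: Fitts' law throughput (Shannon formulation), the
    # standard measure in pointing-device evaluations. The example numbers
    # are invented, not results from the paper.
    import math

    def index_of_difficulty(d_mm: float, w_mm: float) -> float:
        """Index of difficulty in bits: ID = log2(D / W + 1)."""
        return math.log2(d_mm / w_mm + 1)

    def throughput(d_mm: float, w_mm: float, mt_s: float) -> float:
        """Throughput in bits per second: TP = ID / MT."""
        return index_of_difficulty(d_mm, w_mm) / mt_s

    # Example: a 40 mm reach to a 10 mm target acquired in 0.6 s
    print(throughput(40, 10, 0.6))  # about 3.87 bits/s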
Publication:
PalmTrack: On-Palm Finger Tracking Using a Wrist-Worn Camera
Anonymous author(s)
*This paper has been submitted to the CHI Conference on Human Factors in Computing Systems (CHI ’26). It is currently under review; this page is temporarily made available for application purposes. Readers are kindly requested not to distribute it.
Project Credits:
Ivision Group, Department of Automation, Tsinghua University, directed by Jianjiang Feng.
This project is funded by the Tsinghua University Academic Advancement Program and the Disruptive Talent Cultivation Program.
We present PalmTrack, a wrist-worn infrared camera system that transforms the human palm into an always-available touch surface.
PalmTrack is a wristband camera-based pointing technique that tracks the dominant hand’s finger position on the non-dominant hand’s palm, enabling always-available and eyes-free operation. The four panels demonstrate: (1) a flip-up camera design, (2) finger-palm contact detection, (3) free positioning across the entire palm area including fingers, and (4) support for precision-demanding interactions like handwriting input.
Finger-palm interaction modes:
The study defines three interaction modes—Left Yaw, Vertical Touch, and Right Yaw—enabling touch interaction across the palm and fingers with different motion constraints. The pitch angle between the finger and palm surface can be freely adjusted across all three interaction modes.
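As a rough geometric illustration only (the palm coordinate frame and vector sources below are assumptions, not the paper's method), the pitch and yaw of a finger direction relative to the palm plane could be computed like this:

    # Hypothetical sketch: pitch and yaw of a finger direction vector
    # relative to the palm plane. The coordinate conventions are assumed.
    import numpy as np

    def finger_angles(finger_dir, palm_normal, palm_forward):
        """Return (pitch, yaw) in degrees for a finger direction vector."""
        f = finger_dir / np.linalg.norm(finger_dir)
        n = palm_normal / np.linalg.norm(palm_normal)
        fwd = palm_forward / np.linalg.norm(palm_forward)
        # Pitch: elevation of the finger above the palm plane.
        pitch = np.degrees(np.arcsin(np.clip(np.dot(f, n), -1.0, 1.0)))
        # Yaw: angle of the in-plane projection relative to the palm's
        # forward axis, signed by the side axis of a right-handed frame.
        side = np.cross(n, fwd)
        proj = f - np.dot(f, n) * n
        yaw = np.degrees(np.arctan2(np.dot(proj, side), np.dot(proj, fwd)))
        return pitch, yaw

    # Example with an assumed palm frame: normal = +z, forward = +y
    print(finger_angles(np.array([0.6, 0.5, 0.62]),
                        np.array([0.0, 0.0, 1.0]),
                        np.array([0.0, 1.0, 0.0])))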
Workflow & network architecture:
The user interacts by touching the non-dominant palm with a finger of the dominant hand. A deep learning model processes the camera stream in real time, predicting both touch events and precise touch positions. Upon detecting a valid touch, the system maps these absolute coordinates to real-world applications.
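A minimal sketch of that loop follows; the prediction and dispatch functions are hypothetical placeholders standing in for PalmTrack's actual model and event routing.

    # Hypothetical sketch of the real-time loop: per-frame touch detection
    # plus absolute position prediction, then event dispatch. The model and
    # dispatch functions are placeholders, not PalmTrack's actual API.
    import cv2

    TOUCH_THRESHOLD = 0.5  # assumed probability cutoff for a valid touch

    def predict(frame):
        """Placeholder for the multi-task network: returns a touch
        probability and an absolute (x, y) palm coordinate in mm."""
        return 0.0, (0.0, 0.0)

    def dispatch_touch(x_mm, y_mm):
        """Placeholder: forward the touch point to the active application."""
        print(f"touch at ({x_mm:.1f}, {y_mm:.1f}) mm")

    def run_loop(camera_index=0):
        cap = cv2.VideoCapture(camera_index)  # wrist-worn IR camera (index assumed)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                touch_prob, (x_mm, y_mm) = predict(frame)
                if touch_prob >= TOUCH_THRESHOLD:
                    dispatch_touch(x_mm, y_mm)
        finally:
            cap.release()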
A multi-task encoder-decoder model based on the Transformer architecture that simultaneously predicts finger–surface contact and the finger's absolute position coordinates.
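For illustration only, a simplified sketch of such a multi-task network is given below; it uses an encoder-only trunk rather than the paper's encoder-decoder, and the layer sizes and patch embedding are assumptions.

    # Hypothetical sketch: a shared Transformer trunk with two heads, one
    # for touch classification and one for absolute position regression.
    # Encoder-only for brevity; sizes and patch embedding are assumed.
    import torch
    import torch.nn as nn

    class MultiTaskPalmNet(nn.Module):
        def __init__(self, patches=196, dim=256, heads=8, layers=4):
            super().__init__()
            # Project flattened 16x16 grayscale patches to token embeddings.
            self.patch_embed = nn.Linear(16 * 16, dim)
            self.pos_embed = nn.Parameter(torch.zeros(1, patches, dim))
            layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=layers)
            self.touch_head = nn.Linear(dim, 1)  # touch / no-touch logit
            self.pos_head = nn.Linear(dim, 2)    # absolute (x, y) position

        def forward(self, patches):
            # patches: (batch, num_patches, 256) flattened image patches
            tokens = self.patch_embed(patches) + self.pos_embed
            feats = self.encoder(tokens).mean(dim=1)  # pool over tokens
            return self.touch_head(feats), self.pos_head(feats)

    # Joint training objective: classification plus regression loss.
    model = MultiTaskPalmNet()
    touch_logit, xy = model(torch.randn(2, 196, 256))
    loss = (nn.functional.binary_cross_entropy_with_logits(
                touch_logit, torch.ones(2, 1))
            + nn.functional.smooth_l1_loss(xy, torch.zeros(2, 2)))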
Hardware:
PalmTrack’s hardware prototype consists of a wristband carrying an infrared camera and IR LEDs positioned under the wrist.
PalmTrack Applications:
1. Typing with a special character mapping (sketched below)
3. AR/VR controller / cursor control
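To make the special-mapping idea in item 1 concrete, here is a hypothetical sketch that maps an absolute palm coordinate onto a 3x3 character grid; the layout, palm dimensions, and characters are invented for illustration.

    # Hypothetical sketch: map an absolute palm coordinate to a character
    # via a 3x3 grid. The grid layout and palm dimensions are invented.
    PALM_W_MM, PALM_H_MM = 80.0, 90.0  # assumed usable palm area

    GRID = [["a", "b", "c"],
            ["d", "e", "f"],
            ["g", "h", "i"]]

    def coord_to_char(x_mm: float, y_mm: float) -> str:
        col = min(int(x_mm / PALM_W_MM * 3), 2)
        row = min(int(y_mm / PALM_H_MM * 3), 2)
        return GRID[row][col]

    print(coord_to_char(10.0, 75.0))  # lower-left cell -> "g"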
Acknowledgments:
We sincerely thank the user study participants for their cooperation, and the reviewers of UIST 2025 and CHI 2026 for their valuable and supportive feedback.
Copyright © gaomingze
Last updated: September 2025