SurfaceSight: A New Spin on Touch, User, and Object Sensing for IoT Experiences

SurfaceSight is an approach that enriches IoT experiences with touch and object sensing, offering a complementary input channel and increased contextual awareness. For sensing, we incorporate LIDAR into the base of IoT devices, providing an expansive, ad hoc plane of sensing just above the surface on which a device rests. We can recognize and track a wide array of objects, as well as finger input and hand gestures. We can also track people and estimate which way they are facing. We evaluate the accuracy of these new capabilities and illustrate how they can power novel, contextually aware interactive experiences.
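To give a flavor of how a spinning LIDAR sweep just above a tabletop might be turned into discrete objects, the sketch below segments one 360° scan into blobs by looking for range discontinuities between neighboring beams, then reports a centroid per blob. This is only a minimal illustration under assumed parameters (max_range, gap_threshold) and is not the paper's actual pipeline, which additionally classifies and tracks the detected shapes.

```python
import numpy as np

def segment_scan(ranges, angles, max_range=1.0, gap_threshold=0.05):
    """Split one 360-degree LIDAR sweep into contiguous blobs.

    ranges: distances in meters, one per beam.
    angles: corresponding beam angles in radians.
    Beams beyond max_range (empty surface) are ignored; a jump larger
    than gap_threshold between neighboring beams starts a new blob.
    Returns one (x, y) centroid per blob, a plausible input to a
    downstream classifier or tracker.
    """
    blobs, current, prev_r = [], [], None
    for r, a in zip(ranges, angles):
        if np.isnan(r) or r > max_range:          # no object on the surface here
            if current:
                blobs.append(current)
                current = []
            prev_r = None
            continue
        if prev_r is not None and abs(r - prev_r) > gap_threshold:
            blobs.append(current)                 # range discontinuity -> new blob
            current = []
        current.append((r * np.cos(a), r * np.sin(a)))  # polar -> Cartesian
        prev_r = r
    if current:
        blobs.append(current)
    return [tuple(np.mean(np.array(b), axis=0)) for b in blobs]

# Synthetic sweep: two returns on an otherwise empty surface.
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
ranges = np.full(360, 5.0)       # 5 m reads as "nothing on the table"
ranges[40:50] = 0.20             # e.g., a mug ~20 cm from the device
ranges[200:215] = 0.45           # e.g., a hand ~45 cm away
print(segment_scan(ranges, angles))  # -> two centroids, one per blob
```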

Published at ACM CHI 2019.

Laput, G. and Harrison, C. 2019. SurfaceSight: A New Spin on Touch, User, and Object Sensing for IoT Experiences. In Proceedings of the 37th Annual SIGCHI Conference on Human Factors in Computing Systems (Glasgow, UK, May 4 – 9, 2019). CHI ’19. ACM, New York, NY. Paper 329, 12 pages.
