One of the challenges of instrumenting everyday objects with sensors is the added cost, size, weight, and need for batteries (and, more importantly, battery replacement) typically associated with wireless sensor nodes. With the advent of low-cost embedded UHF RFID readers that report low-level channel parameters such as RSSI and RF phase, it is now possible to turn ordinary UHF RFID tags into wireless, battery-free sensors. The following projects explore this paradigm to create methods for human-object interaction detection, rapid prototyping of functional user interfaces, and natural human-robot interaction.
In order to enable unobtrusive human-object interaction detection, we propose a minimalistic approach to instrumenting everyday objects with passive (i.e., battery-free) UHF RFID tags. By measuring changes in the physical layer of the communication channel between the RFID tag and reader (such as RSSI, RF phase, and read rate), we are able to classify, in real time, tag/object motion events along with two types of touch events. Through a user study, we demonstrate that our real-time classification engine is able to simultaneously track 20 objects and identify four movement classes with 93% accuracy. To demonstrate the robustness of this general-purpose interaction mechanism, we investigate three usage scenarios: 1) interactive storytelling with toys, 2) inference of daily activities in the home, and 3) identification of customer browsing habits in a retail setting.
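The intuition behind this kind of channel-based sensing can be illustrated with a toy classifier: a hand covering a tag detunes its antenna and suppresses reads, while moving a tag perturbs RSSI. The sketch below is purely illustrative (the class name, window size, and thresholds are assumptions, not the trained model from the study):

```python
from collections import deque
from statistics import stdev

class TagEventClassifier:
    """Toy sketch of channel-parameter-based event detection.

    Computes windowed features (RSSI variance, read rate) per tag and
    applies illustrative thresholds; the actual system uses a trained
    real-time classification engine over RSSI, RF phase, and read rate.
    """

    def __init__(self, window=10):
        self.rssi = deque(maxlen=window)  # sliding window of RSSI samples

    def update(self, rssi_dbm, reads_per_sec):
        self.rssi.append(rssi_dbm)
        if len(self.rssi) < self.rssi.maxlen:
            return "no_event"              # not enough samples yet
        if reads_per_sec < 2:
            return "touch"                 # hand over tag suppresses reads
        if stdev(self.rssi) > 3.0:
            return "motion"                # large RSSI variance => movement
        return "still"
```

A reader loop would feed each inventory round's RSSI and read rate into `update` and react to the returned label.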
We describe techniques that allow inexpensive, ultra-thin, battery-free Radio Frequency Identification (RFID) tags to be turned into simple paper input devices. We use sensing and signal processing techniques that determine how a tag is being manipulated by the user via an RFID reader and show how tags may be enhanced with a simple set of conductive traces that can be printed on paper, stencil-traced, or even hand-drawn. These traces modify the behavior of contiguous tags to serve as input devices. Our techniques provide the capability to use off-the-shelf RFID tags to sense touch, cover, overlap of tags by conductive or dielectric (insulating) materials, and tag movement trajectories. Paper prototypes can be made functional in seconds. Due to the rapid deployability and low cost of the tags used, we can create a new class of interactive paper devices that are drawn on demand for simple tasks. These capabilities allow new interactive possibilities for pop-up books and other papercraft objects.
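One example of a movement-trajectory input is a swipe of the hand across a row of tags: as the hand covers each tag in turn, its reads drop out in sequence. The sketch below infers swipe direction from those dropout times; the function name and the least-squares slope test are illustrative assumptions, not the paper's algorithm:

```python
def swipe_direction(dropout_time):
    """Toy sketch: infer swipe direction over a left-to-right row of tags.

    dropout_time maps each tag's position index to the timestamp at which
    its reads were first suppressed (hand covering it). A positive slope
    of time vs. position implies a left-to-right swipe.
    """
    positions = sorted(dropout_time)
    times = [dropout_time[p] for p in positions]
    n = len(times)
    mean_p = sum(positions) / n
    mean_t = sum(times) / n
    # Sign of the covariance between position and dropout time.
    slope = sum((p - mean_p) * (t - mean_t)
                for p, t in zip(positions, times))
    return "left_to_right" if slope > 0 else "right_to_left"
```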
RFID tags can be used to add inexpensive, wireless, batteryless sensing to objects. However, quickly and accurately estimating the state of an RFID tag is difficult. In this work, we show how to achieve low-latency manipulation and movement sensing with off-the-shelf RFID tags and readers. Our approach couples a probabilistic filtering layer with a Monte-Carlo-sampling-based interaction layer, preserving uncertainty in tag reads until it can be resolved in the context of interactions. This allows designers' code to reason about inputs at a high level. We demonstrate the effectiveness of our approach with a number of interactive objects, along with a library of components that can be combined to make new designs.
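The core idea of preserving uncertainty can be sketched as a one-state Bayes filter plus Monte Carlo sampling. The function names, likelihoods, and structure below are assumptions for illustration, not RapID's implementation: a covered (touched) tag is read rarely, a free tag often, and each read cycle updates the belief rather than committing to a hard decision:

```python
import random

def update_touch_belief(belief, observed_read,
                        p_read_touched=0.1, p_read_free=0.9):
    """Toy Bayes-filter step: update P(tag is touched) from one read cycle.

    observed_read is True if the tag responded this cycle. Likelihoods
    are illustrative: touched tags respond ~10% of the time, free ~90%.
    """
    lik_touched = p_read_touched if observed_read else 1 - p_read_touched
    lik_free = p_read_free if observed_read else 1 - p_read_free
    num = lik_touched * belief
    return num / (num + lik_free * (1 - belief))

def sample_touched(belief, n=1000):
    """Monte Carlo layer: draw hypothetical tag states from the belief so
    higher-level interaction code can reason under uncertainty."""
    return sum(random.random() < belief for _ in range(n)) / n
```

After a few consecutive missed reads, the belief converges toward "touched" without any single read having to be trusted outright.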
Technologies that allow autonomous robots and computer systems to quickly recognize and interact with individuals in a group setting have the potential to enable a wide range of personalized experiences. However, existing solutions fail to both identify and locate individuals with enough speed to enable seamless interactions in very dynamic environments, which require fast, implicit, non-intrusive, and ubiquitous recognition of users. In this work, we present a hybrid computer vision and RFID system that uses a novel reverse synthetic aperture technique to recover the relative motion paths of RFID tags worn by people and correlate them with the physical motion paths of individuals as measured by a 3D depth camera. Results show that our real-time system is capable of simultaneously recognizing and correctly assigning IDs to individuals within 4 seconds with 96.6% accuracy, and to groups of five people within 7 seconds with 95% accuracy. To test the effectiveness of this approach in realistic scenarios, groups of five participants played an interactive quiz game with an autonomous robot, resulting in an ID assignment accuracy of 93.3%.
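The matching step can be illustrated with a small sketch: unwrapped RF phase gives each tag a relative radial-motion trace, the depth camera gives each tracked person a radial-motion trace, and each tag is assigned to the person whose trace it best correlates with. The function names and the plain Pearson-correlation assignment below are assumptions for illustration, not the paper's pipeline:

```python
def match_tags_to_people(tag_vel, person_vel):
    """Toy sketch: assign each RFID tag to the tracked person whose
    camera-measured radial-velocity trace best correlates with the
    tag's phase-derived radial-velocity trace.

    tag_vel / person_vel: dict id -> list of velocity samples (aligned).
    """
    def corr(a, b):
        # Pearson correlation of two equal-length sample sequences.
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    return {t: max(person_vel, key=lambda p: corr(v, person_vel[p]))
            for t, v in tag_vel.items()}
```

A real system must also handle unequal trace lengths, time alignment, and one-to-one assignment (e.g. via the Hungarian algorithm), which this sketch omits.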
Best Paper Award CHI 2016-"RapID: A Framework for Fabricating Low-Latency Interactive Objects with RFID Tags"; Andrew Spielberg, Alanson Sample, Scott E. Hudson, Jennifer Mankoff, and Jim McCann; ACM Conference on Human Factors in Computing Systems (CHI), May 7–12, 2016
RFID Journal: "Disney Research Explores Ways to Add RFID Intelligence to Robots, Toys" - May 2016
Seattle Times: "Piece of paper that connects to Internet? UW, Disney make it a reality" - May 2016
TechCrunch: "Disney Research uses RFID tags to create powerless, low-cost interactive controllers" - May 2016
RFID Journal: "RFID for Reading People's Reactions" - Aug 2015
Technonize: "Human Object Interaction Detection System by Disney" - April 2015
Research efforts on interactive RFID started at Disney Research with the ID-Sense project. Since then, there have been several follow-up research projects with collaborators at the University of Washington and Carnegie Mellon University.