Posted on 2017-10-22, 00:00. Authored by Francesco Cafaro.
In 1986, Buxton imagined a future archeologist digging up a contemporary personal computer and, because of its design, concluding that its user had one dominant hand, one eye, limited hearing, and no legs. As England also observed, the wide availability of body-sensing technologies may drastically change this scenario. More recently, off-the-shelf sensing devices such as the Nintendo Wii and Microsoft Kinect have gained the potential—through in-home applications and interactive installations in shared, public spaces—to bring full-body interaction to the masses.
The design of controlling actions, however, remains problematic. Traditionally, gestures have been defined by designers, but it is hard to ensure that those gestures are intuitive to actual users. Guessability (or elicitation) studies are an alternative approach: potential users are exposed to an "effect" (something the system can do) and then asked to recommend a control gesture. Participants are free, however, to suggest an unlimited number of (often unrelated) candidate gestures.
My research addresses these problems with a novel methodology, which I call “Framed Guessability,” that allows for the in-lab design of complementary suites of gestures and body movements that can be easily discovered in situ by the actual users of a full-body system. “Framed Guessability” incorporates Lakoff and Johnson's theory of embodied schemata into the elicitation process. By activating a specific frame (i.e., an interconnected network of brain circuits), people recommend gestures and body movements that are related to each other. This is how “Framed Guessability” improves upon the results of traditional guessability studies.
History
Advisor
Lyons, Leilah
Department
Computer Science
Degree Grantor
University of Illinois at Chicago
Degree Level
Doctoral
Committee Member
Moher, Tom
Johnson, Andrew
Antle, Alissa
Lindgren, Robb