FLUIDITY
Fluidity was an exploration in the design of gesture-based mixed-reality menus. Working out how these menus could support seamless selection of objects and snappy navigation between many levels of data was a unique challenge for this domain. Emphasis was placed on defining a set of gestures and hand motions distinct enough not to occur accidentally during normal use, yet easy enough to learn and to activate reliably along all three axes: x, y, and z. These interactions also needed to accommodate both top-level content and the deeper levels beneath it. Subtle animations and visually dynamic feedback added to the menu's usability, actively informing users as they navigate rather than burdening them with remembering how each level functions. These subtle visual cues also mitigated the effects of hand momentum common in gesture-based experiences, giving users insight into what to expect next as they translate a hand across elements and helping to stave off the rapid onset of "gorilla arm."
Role
Lead UX/UI Designer & Researcher (2016)
Vision & strategy
Domain review
State diagramming
UX concepting & mood boarding
Wireframing
Visual design (PPT)
3D environment (Unity)
Overview.
With the emergence of natural user interfaces (NUIs), many paradigms developed for traditional menu design no longer apply. Menu design in particular is essential to creating an accessible and usable interface in any application, and recent efforts aimed at the design of menus within NUI applications inspired this menu system. The design was influenced by research that built on significant experience from the development of LUMENHAUS at Virginia Tech, and the application needed to support architecture students in evaluating and continuing to design interior spaces. Composed of three uniquely featured radial menus and their respective submenus, this menu paradigm introduces multi-layered menu navigation driven by upright hand gestures: users can transform objects, re-color objects, and alter the pattern and material of floor and wall spaces. Using the Leap Motion controller and dynamic visual feedback for selection and translation, users can select menu items and navigate between the radial menus with ease and grace. This paradigm may be used for a myriad of applications requiring a hierarchically organized menu structure.
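To make the navigation mechanics concrete, here is a minimal sketch of how angular hit-testing for a radial menu might look in Unity C#. Everything here is illustrative: the class, field, and method names (RadialMenu, deadZoneRadius, HitTest) are assumptions for this sketch, not the project's actual code.

using UnityEngine;

// Hypothetical sketch: maps a tracked hand position to one of N wedges
// of a radial menu by angle. Names and thresholds are illustrative only.
public class RadialMenu : MonoBehaviour
{
    public int itemCount = 6;            // wedges in this ring
    public float deadZoneRadius = 0.05f; // ignore positions near the center

    // Returns the wedge index under the hand, or -1 inside the dead zone.
    public int HitTest(Vector3 handWorldPos)
    {
        // Project the hand position into the menu's local XY plane.
        Vector3 local = transform.InverseTransformPoint(handWorldPos);
        Vector2 planar = new Vector2(local.x, local.y);
        if (planar.magnitude < deadZoneRadius)
            return -1;

        // Angle from the top of the menu, clockwise, in [0, 360).
        float angle = Mathf.Atan2(planar.x, planar.y) * Mathf.Rad2Deg;
        if (angle < 0f) angle += 360f;

        // Offset by half a wedge so item 0 is centered at the top.
        float wedge = 360f / itemCount;
        return Mathf.FloorToInt(((angle + wedge * 0.5f) % 360f) / wedge);
    }
}

Under this framing, moving between the three radial menus amounts to swapping which menu instance receives the tracked hand position.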
I developed gesture events and designed the radial menus using Microsoft PowerPoint. Dr. Denis Gracanin, associate professor in the Department of Computer Science at Virginia Tech, programmed the application in Unity C#.
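As a rough illustration of what a gesture event such as an upright hand might check for, the sketch below classifies a palm pose from its normal and velocity. With the Leap Motion Unity assets these values would come from the tracked hand; here they are plain parameters, and the thresholds are assumptions rather than Fluidity's tuned values.

using UnityEngine;

// Hypothetical sketch: recognizing an "upright hand" pose. The pose must
// be deliberate (palm facing the display, hand nearly still) so ordinary
// reaching motions do not fire menu gestures by accident.
public static class UprightHandGesture
{
    public static bool IsUpright(Vector3 palmNormal, Vector3 palmVelocity,
                                 float maxTiltDegrees = 25f,
                                 float maxSpeed = 0.15f)
    {
        // A small angle between the palm normal and the forward axis means
        // the hand is held flat and upright toward the display.
        bool oriented = Vector3.Angle(palmNormal, Vector3.forward) < maxTiltDegrees;

        // Require near-stillness to filter incidental motion along
        // the x, y, and z axes.
        bool still = palmVelocity.magnitude < maxSpeed;

        return oriented && still;
    }
}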
The challenge in developing a NUI interface that demands a high level of accuracy is that it is often difficult to anticipate what a user means by a natural motion. What often seems natural is simply what we have practiced to the point of continual success. Providing dynamic feedback about the actions the interface is tracking, making clear which elements are being activated and which actions the user can take, is essential to NUI menu design.
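One common way to provide that feedback, in the spirit of the dynamic visual feedback described above, is dwell-based confirmation: the hovered item visibly fills while the hand rests on it and only activates after a threshold. The sketch below is a minimal, hypothetical version; the class name and dwellTime value are assumptions, not details from the project.

using UnityEngine;

// Hypothetical sketch: dwell-based activation with continuous progress
// feedback. Brief passes over an item never fire it, and the progress
// value can drive a radial fill so users see what is about to activate.
public class DwellSelector
{
    public float dwellTime = 0.8f; // seconds the hand must rest on an item
    private int hoveredItem = -1;
    private float hoverElapsed;

    // Call once per frame with the item index under the hand (-1 = none).
    // Returns the activated item index, or -1 if nothing fired this frame.
    public int Tick(int currentItem, float deltaTime, out float progress)
    {
        if (currentItem != hoveredItem)
        {
            // The hand moved to a different item: restart the dwell timer.
            hoveredItem = currentItem;
            hoverElapsed = 0f;
        }
        else if (hoveredItem >= 0)
        {
            hoverElapsed += deltaTime;
        }

        // Progress in [0, 1] drives the highlight/fill animation.
        progress = hoveredItem >= 0 ? Mathf.Clamp01(hoverElapsed / dwellTime) : 0f;

        if (progress >= 1f)
        {
            hoverElapsed = 0f; // require a fresh dwell before re-firing
            return hoveredItem;
        }
        return -1;
    }
}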
With the unveiling and adoption of mixed-reality devices such as Microsoft's HoloLens, walk-up free-hand displays and kiosks, and virtual-reality headsets like the Oculus Rift, accessible and usable interface design is critical to creating applications that people can enjoy. As this technology is adopted more widely and enters time-sensitive, sanitary, and safety-critical domains such as the military, hospitals, and construction sites, developing interfaces that support more complex actions over larger bodies of data becomes increasingly important.