Augmented reality can extend our senses and transform how we see and interact with the world around us. Designing hardware, interaction techniques, and user interfaces for this domain is equal parts exploration, anticipation, and research-driven validation, responding to shifts in how people work and the technologies their organizations employ. I am extremely thankful and privileged to work in this space and help shape its future.
UX Designer & HF Engineer (2020-2021)
Information architecture, tertiary menu navigation, two-step selection methodologies, cursor, gaze-based keyboards, firmware
UX Designer & HF Engineer (2018-2021)
Anthropometric review, ergonomic evaluation, competitive assessment, iterative prototype testing, firmware, button/menu interaction behaviors
During my time at Lenovo, I provided broad coverage of our ThinkReality product solutions. Given our delivery schedules and team bandwidth, my work was comprehensive in scope, touching nearly every aspect of our solutions: conceptual hardware and software wireframes and mockups, information architecture planning, interaction design direction, production redlining, anthropometric measurement, research and testing, firmware specification, print materials, carrying case design, accessories, packaging direction, and competitive assessment, among other things. I also led the design and development of novel interaction guidelines for several augmented reality experiences within Lenovo’s ThinkReality platform, including two-step selection methodologies, dynamic tertiary menu systems, gaze-based keyboard interactions, and cursor behaviors.
Very shortly after Lenovo opted to pursue head-worn augmented reality in the commercial sector, I was tasked with providing hardware UX/HF support for the design of what would become the ThinkReality A6, Lenovo’s first enterprise-centric headset. I engaged with industrial design and development teams abroad in the formative conceptualization of the headset, its accessories, and the extended compute device. With an acute focus on our target wearer demographics, workplace environments, and ancillary helmets, clothing, and tools, I provided design solutions and recommendations to accommodate the sheer variability present across our commercial use cases. I studied and provided anthropometric measures and ranges pertaining to the human head, body girth, height, and hands, among other metrics.
I drafted protocols for several rounds of research to examine the usability of our hardware. To vet early industrial design mockups, I crafted several testing environments and activities inspired by the work behaviors of our target audiences. One such activity was an interactive PC repair task, in which I staged an artificial environment for participants to replace the battery of a PC. Since we did not yet have a functional prototype to display an image, I mocked up a touch-based instructional workflow for participants to complete as if they were interacting with a world-locked AR card set (Image A). From this, we took learnings on comfort and fit accommodation, among other topics, to influence future hardware decision-making. Moreover, I devised activities that required participants to engage in head and body extension, flexion, and rotation, as well as lateral bending and body momentum, to uncover how more nuanced motions affected user performance and comfort.
I also conceptualized elements that made our product versatile and accommodating for a large range of anthropometric variation and industrial-worn accessories. More specifically, I introduced a sliding mechanism near the temples of the device to allow for concentrated fit adjustment and accelerated donning and doffing, allowing the headset to be placed and tightened without additional touchpoints. This design also affords the user the ability to wear the headset overhead when idle – similar to a pair of sunglasses or a lifted baseball cap. I introduced a textured rear band to accommodate a variety of hairstyles and skin conditions such as the emergence of sweat, dirt, and debris. Its rear opening was not only implemented to accommodate these features, but was also uniquely tailored to adapt to existing hardhats – providing ample space for existing tightening dials to remain reachable and operable. Accessories and the extended compute pack were designed in tandem with the headset, resulting in quite a frenetic work pace. Components as subjective and nuanced as button travel, cable material, and battery bay door removal affordances were carefully observed and refined throughout the development process. I also focused my attention on firmware, where I provided specifications for thermal performance, button behaviors, and LED states, among other things. Being an integral part of the hardware design was incredible, and I took away many learnings. As the hardware matured and entered development gates, I began supporting the design of our software solutions.
Image A: PC Repair Activity
Image B: Warehouse Picking, Box Handling, and Inspection Usability Tasks
Software design, and more specifically interaction design, excites me. As you might imagine, as much as I loved co-designing the ThinkReality A6 headset, I was even more ecstatic to get into the software. As I mentioned earlier, it can often be incredibly difficult to anticipate the wide range of capabilities the wearer base will have when they don the headset for the first time. I constantly think about how we might design certain foundational elements, at both a UI and interaction level, to ensure we accommodate novice and veteran users alike. I focused primarily on our selection behaviors and how we mitigate accidental selections across a myriad of input methods, most notably gaze and dwell – a timed selection input method dependent on your head-gaze position. To ensure a true hands-free selection environment without the capabilities of eye tracking, I explored two-step methodologies that could be naturally designed into existing UI architectures. In the form of drop-down drawers and extensible trays, selections were no longer uncomfortably immediate, and elements of the UI could be removed from the highest level, reducing complexity and visual clutter in the designs. I also led the creative visioning of our home menu’s information architecture, compartmentalizing content and features of the design into more easily understood zones within the UI. More exciting still is the team’s continued design exploration of augmented keyboards and editing tools to better accommodate gaze-based selection.
Early AR Home Prototype.
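For readers curious how gaze-and-dwell selection works mechanically, here is a minimal sketch of the core timer logic: a dwell target only "fires" after the head gaze rests on it for a threshold duration, and any gaze shift restarts the clock. All names and timings here (`DwellSelector`, `dwell_time_s`) are illustrative assumptions, not Lenovo's actual implementation.

```python
class DwellSelector:
    """Accumulates gaze time over a target; fires after a dwell threshold.

    Hypothetical sketch of a gaze-and-dwell input loop; names and the
    one-second default are illustrative, not a shipped implementation.
    """

    def __init__(self, dwell_time_s=1.0):
        self.dwell_time_s = dwell_time_s  # how long gaze must rest to select
        self.gaze_target = None           # target currently under head gaze
        self.elapsed = 0.0                # time accumulated on that target

    def update(self, target, dt):
        """Call once per frame with the gazed target (or None) and frame time.

        Returns the target when the dwell threshold is reached, else None.
        """
        if target != self.gaze_target:
            # Gaze moved to a new target (or off-target): restart the timer.
            self.gaze_target = target
            self.elapsed = 0.0
            return None
        if target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_time_s:
            # Fire, then require a fresh dwell before the same target refires.
            self.elapsed = 0.0
            return target
        return None
```

In a frame loop, `update` would be fed the raycast hit under the head-gaze cursor each frame; tuning `dwell_time_s` trades selection speed against accidental activations, the exact tension the two-step drawers above are meant to relieve.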
Below is a snapshot of some of the interactive prototyping work I engaged in for gaze-based interaction input. I explored the use of contextual drop-down drawers to mitigate on-screen clutter and encourage open space at the center of each screen, reducing selection anxiety. I introduced a dynamic translation selection method for snappier navigation between top-level menu screens. I also explored side-loading confirmation selection behaviors and tertiary menus across system and quick settings to reduce cognitive workload. I defined motion behavior and animation, and designed all task flows and visual assets in Adobe XD. Dan Pollack, the UX Unity prototyper on the team, built this and many other prototypes for review and testing. This footage was captured in a head-worn ThinkReality A6 HMD and accurately depicts the field of view. To view more, click the green link below.
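The side-loading confirmation pattern above can be sketched as a tiny state machine: a first dwell on an item only opens its confirmation drawer, and the action commits only on a second dwell over the drawer's confirm control. This is a hypothetical illustration of the pattern; the class and method names (`TwoStepDrawer`, `select`, `confirm`, `dismiss`) are my own, not the prototype's API.

```python
class TwoStepDrawer:
    """Two-step selection: a first dwell opens a confirmation drawer,
    and only a dwell on the drawer's confirm control commits the action.

    Illustrative sketch only; names are assumptions, not the actual code.
    """

    def __init__(self):
        self.pending = None  # item whose confirmation drawer is open

    def select(self, item):
        """First-step dwell on a top-level item: open (or switch) its drawer."""
        self.pending = item
        return None  # nothing is committed yet

    def confirm(self):
        """Second-step dwell on the drawer's confirm control: commit."""
        committed, self.pending = self.pending, None
        return committed

    def dismiss(self):
        """Gaze left the drawer: close it without committing anything."""
        self.pending = None
```

Because no single dwell can trigger an action directly, a stray glance at a button costs the user nothing, which is the point of removing immediate selections from the highest level of the UI.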