Experienced product designer making an impact in AgeTech.

With over 20 years of experience at Google, Microsoft, and Motorola, I specialize in mobile UX and accessible design, applying my skills to solve meaningful challenges for older adults and people with disabilities.

Featured Projects

Google Pixel

Pixel Magnifier

I led the redesign of the Magnifier feature, transforming it from a basic utility into a highly rated, accessible tool for users with low vision, earning praise from leading accessibility organizations.

Pixel Magnifier UI Screenshot
Simple View UI Screenshot
Google Pixel

Simple View

To help older adults in Japan adopt smartphones, I designed a feature that simplifies the core phone experience, achieving an 80% conversion rate at point-of-sale for our carrier partners.

Google Pixel (Private Beta)

Guided Step

I led the design for 'Guided Step,' uncovering foundational insights into how blind users navigate crowded spaces, which directly informed the company's future accessibility strategy.

Guided Step UI Screenshot

More Projects

Nest Home

Nest Home Lights System

Skill: Systems Thinking & Simplicity

Challenge: Create an easy-to-remember light system to communicate speaker status and controls.

Impact: Established the iconic 4-dot LED system on over 25 million speakers, winning multiple Red Dot awards and creating an industry standard.

Google Assistant

Calling on Assistant

Skill: Voice UX & Product Strategy

Challenge: The initial product direction for calling didn't match users' mental models for making calls.

Impact: Led a strategic pivot to person-to-person calling, resulting in 450,000+ MAUs and over 1.5 million calls per month.

About Me

I'm a staff UX designer with over 20 years of experience shaping products at the world's top tech companies. I specialize in mobile UX and accessible design. My current focus is applying these skills to solve meaningful challenges and create best-in-class experiences for older adults.

Google Pixel

Pixel Magnifier

Animated GIF of the Magnifier live search feature in action.

Role

Lead UX Designer

Goal

Create a best-in-class magnifier app to improve accessibility on Pixel devices.

Team

PM, Engineering, UXR, Visual Design

The Challenge

People with low vision rely heavily on their phone's camera to magnify everyday things like menus and bills. While functional, the stock camera is a makeshift solution. Apple's iPhone already offered a dedicated Magnifier app, and Pixel needed an equally powerful, accessible, and intuitive tool for its users, one that could go beyond the competition.

Discovery & Research

Our research involved competitive analysis, institutional partnerships with organizations like the Royal Institute for the Blind, and extensive user interviews. The most surprising insight came from observing users: due to their low vision, they hold the phone extremely close to their faces, often with screen magnification at maximum. This meant that moving from one edge of the screen to another could require multiple swipes, making feature discovery a huge challenge.

Screenshot showing extreme screen magnification during user testing.
This screenshot from a research session shows the "close-up perspective" of a low-vision user with screen magnification at maximum. This insight was critical in our decision to cluster all UI controls.

Design & Iteration

This "close-up perspective" insight completely changed our design approach. We initially explored placing controls across the screen, similar to the main Camera app. However, testing showed this was unusable for our target audience. We pivoted to a design that clustered all core functionality in a small, predictable area at the bottom of the screen. While it might look "unbalanced" to a normally sighted person, it proved to be far more usable and accessible for low-vision users.

Concept A: Rejected

Wireframe of a spread-out camera-like UI.

Although this concept placed controls along the bottom, like the Camera app, testing revealed users still struggled with the menu and the small icon sizes.

Concept B: Rejected

Wireframe with controls in a different layout.

This version improved the icon sizes, but still required too much horizontal screen travel for users with extreme magnification.

Final Design

Wireframe showing the final clustered UI.

Clustering all controls at the bottom proved most accessible and became the chosen direction.

This insight solved the UI layout problem, but it didn't solve the core challenge of scanning dense real-world text. We couldn't reformat a restaurant menu or reorganize a grocery aisle for the user. Our solution was to create a powerful 'live search' feature. Now, instead of manually scanning from edge-to-edge, users could simply type a word—like a dish on a menu or a specific spice—and have Magnifier instantly highlight it for them in the viewfinder.

Animated GIF showing the Live Search feature highlighting text on a menu.
The final "Live Search" feature in action, allowing users to find specific text without needing to manually scan.

Design Considerations

A key challenge was convincing leadership that this unconventional, bottom-heavy layout was the right decision. We also had to standardize the "search" interaction, auditing various Google apps like YouTube to create a pattern that was both familiar and optimized for repeated use within Magnifier.

Solution & Impact

The final app provides powerful magnification, filters, and a unique live search feature, all accessible from a simple, consolidated control panel. The launch was a huge success, solidifying Pixel's commitment to accessibility.

4.8 Star Rating

The highest-rated Google Play Store app to date.

"This is (ahem) magnificent... I have it set up so two taps on the back of the phone open the magnifier, and I use it nearly every day."

- Ron Lusk, 5-star Play Store Review

"Found it right away. And how easy could that be? That, I mean, that is awesome. That is convenient."

- Internal User Research Participant

Google Pixel

Simple View

Screenshot of the Simple View feature on a Pixel phone.

Role

Lead UX Designer

Goal

Increase Pixel adoption among older adults in the Japanese market.

Team

PM, Engineering, UXR

The Challenge

While Pixel sales were strong in Japan, we were missing a large segment of the market: older adults. Research showed this demographic found parts of the standard smartphone experience complex and visually overwhelming. Our goal was to create an inviting and accessible "front door" to the Pixel to drive adoption.

Discovery & Research

We conducted extensive research in Japan, interviewing older adults, consulting with academics on aging, and even visiting a company dedicated to helping seniors with technology. A key insight was that older adults needed increased legibility but didn't think of themselves as "disabled," so they were reluctant to dig into Accessibility settings. They needed a simple, mainstream option.

Photo of a user research session with older adults in Japan.
Conducting a focus group with older adults in Japan. These direct conversations were essential for understanding their unique challenges and perspectives on smartphone technology.

Design & Iteration

Our initial design placed "Simple View" within the Accessibility menu. Testing quickly revealed this was undiscoverable. This feedback led to a critical pivot: we moved the feature directly into the main phone setup wizard (SUW). This not only boosted visibility but also allowed us to pre-select a helpful set of home screen icons, further reducing the initial cognitive load for new users.

The user flow for enabling Simple View was moved from deep within Settings to a prominent option in the initial phone Setup Wizard (SUW), dramatically increasing discoverability.

Step 1: Welcome

Simple View setup screen 1

Step 2: Option

Simple View setup screen 2

Step 3: Confirmation

Simple View setup screen 3

Step 4: Final Home

Simple View final home screen

Solution & Impact

Simple View provides a clean home screen with large icons and text, 3-button navigation, and an enhanced keyboard. By making it a primary choice during setup, we made accessibility feel like a standard feature, not a hidden fix. The results with our Japanese carrier partners were phenomenal.

Up to 80% Conversion Rate

Achieved in selected Japanese carrier points of sale.

"It made the phone look less cluttered and simpler to use."

- User Feedback

Google Pixel (Private Beta)

Guided Step

Please Note: This case study details a private research beta, not a publicly available product. The findings have been approved for public sharing by Google and were also submitted as a formal paper to the CHI 2026 conference.

Conceptual image for the Guided Step feature.

Role

Lead UX Designer

Goal

Explore using the Pixel camera as a guide cane supplement for blind users in crowded spaces.

Team

UXR, PM, Sound Design, Engineering

The Challenge

For individuals who are blind or have low vision (BLV), independent navigation presents significant challenges and risks. While guide canes are essential, they don't detect obstacles at trunk or head level. We hypothesized that a smartphone app using AI to detect people and obstacles in real-time could serve as a valuable supplement, increasing safety and confidence.

The Process: A Global Co-Design

We conducted three rounds of iterative co-design and testing with 15 BLV participants around the world, including in Tokyo, Paris, and the US. We tested our prototypes in extreme environments like Shibuya station to gather feedback through direct observation, interviews, and user journals.

Photo of a research participant testing the Guided Step prototype in a busy urban environment.
Another photo of a research participant using Guided Step.
Research participants testing the Guided Step prototype in various environments. These real-world sessions were crucial for gathering feedback on the app's effectiveness.

Key Learnings: The Three Pillars

The extensive co-design process revealed three foundational pillars for designing this kind of assistive technology:

Guided Setup

Onboarding screen for Guided Step

For a tool that could be harmful if misused, building trust and proficiency from the first second was non-negotiable.

Minimalist UI

Minimalist UI for Guided Step

To optimize for screen readers, we eliminated submenus and focused on a simple, scannable interface.

Full Customization

Settings screen for Guided Step showing customization options

Each user preferred a different intensity and type of audio feedback, making a highly customizable experience essential.

The Impact: Validated Learnings

While Guided Step did not move into production, its impact was validated in an experimental study with 20 BLV participants. The study, submitted for presentation at the CHI 2026 conference, demonstrated that the app significantly improved navigation safety and confidence.

44%

Fewer Unintentional Obstacle Hits

36%

Reduction in Total Obstacles Hit

34%

Decrease in "Near-Miss" Events

"I am more confident that I will not crash or fall... It can detect things that's further out than the cane. It's good to know what's around you and have awareness of people and cars."

- Study Participant

Ultimately, the "three pillars" we uncovered provided a strategic framework for future accessibility projects at Google, demonstrating the critical need for deep user partnership when designing for specialized communities.