Smart glasses that make conversations more inclusive.

Captify, a startup building assistive wearables, launched its new product, Captify Pro: AI-powered glasses for the deaf and hard-of-hearing community. I led the MVP design of both the mobile app and glasses interface to make the experience more intuitive, trustworthy, and practical for daily use.

Company

Captify

Platform

Smart Glasses, Mobile

Year

2025

Duration

6 months

Why we built Captify

Captify was built to help people who are deaf or hard of hearing navigate real-world conversations using AI-powered AR glasses. By combining real-time captions, environmental sound alerts, and contextual awareness on a wearable display, Captify reduces communication barriers in noisy, everyday environments where traditional hearing aids fall short.

Users & Challenges

Key Segment: Seniors (60+)
Secondary Segment: Adults with partial hearing loss

Business Goal

Launch Captify Pro with a reliable, easy-to-use experience that drives daily adoption of AI-powered AR glasses for accessibility.

My Role

Founding product designer for Captify, leading 0→1 end-to-end design across the app and glasses UI, shaping product direction with the CEO and engineers, and grounding decisions in insights from 16 user interviews.

Impact

Compared to the previous MyVu product, Captify Pro Phase 1 MVP launched with a redesigned mobile app and glasses OS, introducing real-time captions and environmental sound detection as core experiences.

10K+

Active users

+35%

Daily active usage

2.5x

Improvement compared to Captify MyVu

40%

Returning users migrated from Captify MyVu

Main Experiences

A lifestyle product for everyday communication

The redesign transformed Captify Pro from a technical prototype into a credible lifestyle product — one that blends accessibility, confidence, and ease of use. This work not only improved user trust and satisfaction but also gave Captify a stronger narrative for product launch and investor presentations.

User Insights & Design Solutions

Problem Statement

Assistive hearing tech hasn't evolved. Seniors are left navigating a world built without them.

Solution I

Smart Mode

Auto-switches listening modes; no manual setup needed.

Solution II

Speaker Identification

Real-time voice labeling; see who's speaking instantly.

Solution III

Stay Aware

Gentle, context-based alerts; stay confident anywhere.

User Interview

What do users say about the current version of Captify?

We talked to deaf and hard-of-hearing users to find out why Captify wasn’t clicking. The barriers were more than just clunky setups or technical glitches. There was a real gap in trust and comfort. We used those insights to drive every design choice we made.

Main Experience I

Smart Mode

Automatically adapts listening modes to the environment, so users can start captioning right away without manual setup.

Pain Point I

8 of 16

Users had trouble adjusting settings for different situations.

Solution I

Smart Mode

Auto-switches listening modes; no manual setup needed.
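As a rough illustration of what "auto-switching" could mean, the sketch below picks a listening mode from simple ambient-audio signals so the user never touches settings. The mode names, thresholds, and inputs are assumptions for illustration only, not Captify's actual implementation.

```python
# Hypothetical Smart Mode selector: choose a captioning mode from
# ambient noise level and the number of detected speakers, so no
# manual setup is needed. All values here are illustrative.

def select_mode(noise_db: float, speaker_count: int) -> str:
    """Choose a listening mode from coarse ambient-audio features."""
    if speaker_count >= 3:
        return "group"        # multi-person conversation: label speakers
    if noise_db > 70:
        return "noisy"        # loud environment: stronger noise handling
    if speaker_count == 1:
        return "one_on_one"   # single speaker: focused captions
    return "ambient"          # no speech: surface sound alerts only
```

In a real pipeline these inputs would come from on-device audio analysis; the point of the sketch is only that the decision is automatic, replacing the manual setup users struggled with.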

Main Experience II

Speaker Labeling & Customization

Customize speaker tags to navigate multi-person conversations.

Pain Point II

14 of 16

Had trouble identifying speakers in group conversations.

Solution II

Speaker Identification

Real-time voice labeling; see who's speaking instantly.
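The labeling flow can be sketched as a small mapping layer: a diarizer emits an anonymous speaker id per caption segment, and the app maps ids to user-chosen tags so captions show who is talking. The class and names below are hypothetical, for illustration only.

```python
# Hypothetical speaker-labeling layer: map anonymous diarizer ids to
# user-assigned tags, falling back to auto-numbered labels until the
# user customizes one. Illustrative sketch, not Captify's pipeline.

class SpeakerLabels:
    def __init__(self) -> None:
        self._names: dict[str, str] = {}  # diarizer id -> display tag
        self._next = 1                    # counter for fallback labels

    def label(self, speaker_id: str, name: str) -> None:
        """User assigns a custom tag to a recognized voice."""
        self._names[speaker_id] = name

    def caption(self, speaker_id: str, text: str) -> str:
        """Prefix a caption segment with the speaker's tag."""
        if speaker_id not in self._names:
            self._names[speaker_id] = f"Speaker {self._next}"
            self._next += 1
        return f"{self._names[speaker_id]}: {text}"
```

Once a user tags a frequent speaker, every later segment from that voice carries the custom name, which is the behavior the "Next steps" section proposes to extend with automatic recognition.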

Main Experience III

Stay Aware

Keeps users aware of important sounds around them with simple, real-time alerts—helping them feel safe and informed in any environment.

Pain Point III

10 of 16

Wanted more awareness of environmental sounds.

Solution III

Stay Aware

Gentle, context-based alerts; stay confident anywhere.
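"Gentle, context-based" can be read as a filtering rule: only surface the sounds that matter in the current context, so users stay aware without being interrupted constantly. The sound classes and context rules below are assumptions sketched for illustration.

```python
# Hypothetical Stay Aware filter: decide whether a detected sound
# deserves an on-glasses alert given the user's current context.
# Sound categories and rules are illustrative assumptions.

URGENT = {"fire_alarm", "car_horn", "siren"}     # safety-critical
SOCIAL = {"doorbell", "knock", "name_called"}    # socially relevant

def should_alert(sound: str, context: str) -> bool:
    """Gate environmental-sound alerts by context."""
    if sound in URGENT:
        return True                   # safety sounds always surface
    if context == "conversation":
        return sound == "name_called" # minimize mid-conversation noise
    return sound in SOCIAL            # otherwise, social sounds only
```

The design choice mirrored in this sketch is that suppression, not detection, is what makes alerts feel gentle: everything is heard, but only context-relevant sounds reach the display.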

Design Challenges

CHALLENGE: homepage experience

Challenge - Select Presets

How might we redesign the homepage to help users start captioning in one tap?

Current solution |

Transcribe/Translate

On the Captify MyVu model, users must choose Translate or Transcribe before they can do anything; this extra step gets in the way of the most basic task: turning on captions.

User flow |

Transcribe as default

I made Transcribe the default and moved Translate to a secondary option. With Smart Mode, users can start live captioning in one tap — no more choosing upfront.

1st Version |

HISTORY AS HOMEPAGE

I put transcript history on the homepage so users could check, share, and delete past conversations.

🫠

Why it didn't work

  1. Transcript history wasn't being used

  2. Translate and Transcribe tabs created unnecessary friction

  3. Glass connection status not visible to users

2nd Version |

SHOWING DEVICE STATUS

I brought glasses status and Smart Mode to the front. Users could now see if their glasses were connected right away.

😍

What works well

Clear structure; obvious flow

🫠

What needs improvement?

1. Navigation tabs
2. Switching between Transcribe and Translate needs more work

3rd Version |

ALL-IN-ONE HOMEPAGE

I added the full Smart Mode grid, sound alerts, and quick tools. The page started to come together but the information hierarchy still needed work.

😍

What works well

  1. Separate tab to view history

  2. Transcribe as default, translate as an option

  3. Glass connection status displayed at the top of the homepage

🫠

What needs improvement?

Information hierarchy

Final Version |

ALL-IN-ONE HOMEPAGE

The final homepage surfaces glasses connection status, Smart Mode, sound alerts, and quick tools with a clear information hierarchy, while transcript history lives in its own tab where users can check, share, and delete past conversations.

Final Design

Designed for faster, more reliable everyday use

By prioritizing real user needs, the experience now feels effortless and adaptive. Users can manage hearing and transcription seamlessly with minimal manual input.

What I learned
Even the most empathetic designer learns the most from real conversations

Talking directly with deaf and hard-of-hearing users reinforced that no assumption replaces lived experience. Their daily challenges shaped clearer, more empathetic design decisions.

Designing for Ease, Not Effort

In accessibility-focused products, simplicity is a strength. Too many options created hesitation, especially for older users. Designing for quick actions, adaptive captions, and smart defaults helped keep the focus on conversation, not controls.

What I’d Do Differently

I would push further on clarity: higher contrast, larger touch targets, clearer labels, and a simplified mode to reduce visual clutter and cognitive load.

Next steps

AI Speaker Identification

Allow users to label frequent speakers so the system can recognize and tag voices automatically, improving context and accuracy over time.

Manage Notifications & Transcripts

Rank alerts and transcripts by importance, let users delete outdated content, and improve long-session performance for real-world use.