Multimodal Personal AI for AR glasses
Reimagined Noa, a personal AI for lightweight AR glasses, and its mobile app—designing proactive prompts and contextual features to support everyday life.
Goals
Expand the user base and drive daily adoption for the new product launch, while building a core design that can scale and evolve quickly.
My Role
Led product strategy and design: defined the product vision, AI UX system, and visual design system, and delivered app prototypes and final designs.
Team
CEO, Marketing, Engineers
Problem Statement
How might we help users make daily life more efficient and meaningful with the new AR glasses?
Challenges
Our business goal was to boost user engagement and expand our reach from developers to everyday users, which meant understanding exactly what these users expect from AI in lightweight AR glasses. However, we had no clear product vision for the experience we wanted to build, and no existing usage data to start from.
My approach & Solution
Move fast. Learn by building. Collaborate closely.
I designed & prototyped 7 different product ideas across 2 sprints, partnering with CEO and engineers, helping teams to discover new opportunities and refine our vision iteratively.
Our phase 1 solution was to create dynamic prompts & summaries tailored to users’ daily patterns and enhance the core chat interactions for those who prefer typing over speaking in public spaces.
Designing Noa was about creating a personal AI that understands when to step in, how to surface insights meaningfully, and how to make AI feel intuitive, not intrusive.
Understanding the core user needs was the first step.
To identify new target users and their needs, I started by uncovering insights about current users through manual community research and team interviews.
Our current users — mainly hackers and developers — actively shared their projects and pain points through our customer support channel in the Discord community.
With the user insights and patterns I analyzed, I organized a workshop with the team to refine user archetypes and map out their journeys, needs, and feature priorities.
Moving fast, building to learn, collaborating closely
I took a hands-on, iterative approach—moving fast, learning through building, and shaping the MVP scope through close collaboration.
Across 7 concept explorations, I helped define the project scope and surfaced two key opportunities for MVP1, refining them through rapid iteration and open communication over 2 sprints.
Imagine your AR glasses don’t just answer questions—they remember.
Noa reminds you to grab a jacket because it knows you were cold yesterday. At the end of the day, it doesn’t just transcribe events—it summarizes and gently brings back the moments that mattered, helping you reflect, process, and stay connected to what’s important.
For the MVP, I redesigned the mobile app (IA, UI, interaction, and visual system) and created an AI UX system for LLM prompts and daily summaries.
Designed a dynamic LLM prompt system based on target users’ needs
Adaptive daily summary that learns and grows with the user
Dynamic main chat interaction
Introduced a flexible and responsive user interface that adjusts to the context of interactions, making technology feel more like a natural extension of the user.
Hand-drawn illustrations and colors add a more ‘imperfect’, natural tone
Be transparent with data collection
Created content modals that communicate data collection and privacy policies transparently.
Surfacing these throughout the flow brings more transparency.
Key AI design principle
I focused on making Noa AI feel more personal and human, but also easy to use. My goal was for every interaction to feel helpful and natural — like talking to a trusted companion, not like being surveilled.