September 2024 - Ongoing

Role
UX Researcher & Physical Prototyper
Timeline
January 2025 - May 2025
Tools
Figma, Python, SQL, Arduino, AutoDesk Fusion, Prusa, Blender, Videographer
Teammates
Jiawen Chen - Architect ①
Omar Mohammad - Designer ①
Nile Tan - Designer/Filmmaker ①
Client
The Asian Art Museum
Museum curation is often divorced from community input, resulting in a museum experience that primarily reflects the intentions of the museum authorities. This undercuts the benefits community members and museum visitors could gain from more community-informed modes of curation.
We designed and prototyped a museum navigation algorithm that is built on voice-based community-art interactions at the Asian Art Museum in San Francisco. The system includes a screenless device to foster inter-visitor engagement and deepen art understanding. It also opens up responsive and active models of museum-visitor interaction.
Role
UI/UX Product Designer (Sole Designer)
Timeline
January 2026 - Ongoing
Tools
Figma, Google NotebookLM, Python, V0 by Vercel, ZeroWidth LLM, Postman, BeautifulSoup, Selenium
Teammates
Product Managers ②
Front-end Engineer ①
Environmental Product Declaration (EPD) data is essential for compliance and sustainability reporting, yet for this enterprise client, validation is often manual, fragmented, and hard to scale. Internal teams rely on disconnected tools and informal processes, making accuracy, consistency, and audit readiness difficult to maintain as operations grow.
Replaced fragmented environmental data reviews with structured, system-driven validation workflows, introducing clear stages, ownership, and feedback loops across facilities and materials. I centralized issues, flags, and approvals to improve EPD readiness, reduced ambiguity around review states, and established scalable design and interaction patterns intended to serve as the foundation for future enterprise sustainability and reporting products.
I worked with incomplete user access, high regulatory risk, and the need to design a scalable system that could support multiple roles and future products, all while delivering a confident, tested system within a very short timeframe.
I grounded my design decisions in direct user research (interviews and focus groups), desk research, competitive analysis, and established patterns from enterprise and project management software. Where I felt uncertain, I treated my decisions as assumptions, designing a system that could flex as those assumptions are tested and refined.





Role
Lead Product Designer
Timeline
March 2021 - May 2021
Tools
Nomad Sculpt, Fusion 360, Rhino 3D, Adobe Illustrator, KeyShot
Teammates
Mechanical Engineers ②
Business Strategists ③
Product Manager ①
Architect ①
Abu Dhabi’s Corniche hosts thousands of daily visitors like walkers, joggers, cyclists, and families. Yet, the absence of safe, inclusive drinking fountains forces reliance on bottled water, contributing to waste and excluding wheelchair users. Traditional fountains often go unused due to hygiene concerns and outdated design.
We designed a UV-sanitized, foot-pedal fountain for Abu Dhabi’s Corniche and delivered it to the UAE Ministry of Urban Planning. Fully accessible and solar-powered, the design inspired plans to install public fountains along the Corniche, offering a scalable solution to eliminate 180,000 plastic bottles annually.















The final design is a fully self-sanitizing, hands-free public drinking fountain created specifically for Abu Dhabi’s Corniche: accessible, sustainable, and grounded in local visual language. With its dual-height sinks, foot-pedal activation, and solar-powered UV sanitation, it offers a safer, more inclusive alternative to bottled water in one of the city’s most visited public spaces.
The design was presented to the UAE Ministry of Urban Planning, where it contributed to early-stage efforts to reintroduce drinking fountains in Abu Dhabi. Our proposal helped shape conversations around sustainable urban hydration and influenced plans to bring more accessible water infrastructure to the Corniche.
I'm incredibly proud of what we achieved, not just as a functional product, but as a rethinking of how public design can serve health, sustainability, and equity all at once.
Massive thanks to Eunseo Bong, Samantha Lau, and Zak Saeed for being thoughtful, talented, and tireless collaborators throughout the process.

Role
Lead UI/UX Product Designer
Timeline
Ongoing (Prototype completed)
Tools
Figma, Python, BeautifulSoup, Selenium, OpenAI API, ZeroWidth LLM, Lucidchart, Postman
Teammates
Software Engineers ②
AI/ML Engineers ②
Business Strategists ③
Product Manager ①
Professionals often collect contacts at events, through LinkedIn, or referrals, but struggle to keep them organized, remember context, and follow up in a timely way. Existing tools store information but don’t support strategic, actionable networking.
I designed and helped build an AI-powered tool that helps users collect, organize, and follow up with contacts more efficiently, turning scattered information into clear next steps and creating a 70% reduction in manual effort and a 3x increase in follow-up effectiveness.
Through in-person interviews, questionnaires with young and middle-aged professionals (our target user group), and usability testing of current tools on the market, we found that professionals struggle with:
• Disorganization
• Lost context
• Missed opportunities
They need a tool that automates organization and provides strategic insights, not just a static contact list.




I analyzed LinkedIn, native phone contact apps, Instagram, HubSpot, Airtable, and TikTok; each offered parts of the solution but lacked context-aware follow-ups and AI-driven organization. This revealed a gap for a tool like Contacty, built specifically for personal, intelligent networking.
The user journey starts when a contact is captured through a scan, tap, or manual entry and Contacty immediately enriches the profile with relevant context. From there, the user can easily search, tag, and organize connections without any manual sorting. As career goals evolve, Contacty suggests who to follow up with and how, helping users move from collecting contacts to building meaningful, strategic relationships over time.

The system begins when two users initiate a connection via any of the three modes available on the application. That request is sent to AWS, which relays the task to a central database server. The database interfaces with external APIs like LinkedIn, Twitter, and others to fetch relevant contact and contextual information. This enriched data is compiled and returned through AWS in a simplified, digestible format, which is then displayed on the user’s device for seamless, informed networking.
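The enrichment flow above can be sketched in a few lines. This is a minimal illustration, not the production code: all function and field names here are hypothetical, and the real system routes the request through AWS to a central database server before calling external APIs such as LinkedIn and Twitter.

```python
# Minimal sketch of the contact-enrichment flow. Function and field names
# are hypothetical stand-ins; the deployed system performs the external
# lookups server-side behind AWS rather than on the device.

def fetch_external_context(handle: str) -> dict:
    # Stand-in for the LinkedIn/Twitter API lookups done by the backend.
    return {"title": "Product Designer", "last_post": "AI in design systems"}

def enrich_contact(raw_contact: dict) -> dict:
    """Compile captured data plus external context into a simplified profile."""
    context = fetch_external_context(raw_contact["handle"])
    return {
        "name": raw_contact["name"],
        "handle": raw_contact["handle"],
        "context": context,  # digestible summary shown on the user's device
        "source": raw_contact.get("source", "manual"),  # scan, tap, or manual entry
    }

profile = enrich_contact({"name": "Ada", "handle": "@ada", "source": "scan"})
print(profile["context"]["title"])
```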

While developing the system, I focused on integrating AI with a custom social media scraping algorithm I developed to surface relevant connections. Using ZeroWidth, I tested how large language models could interpret user goals through prompt engineering, relevance scoring, and AI-generated messaging, creating a smarter, more personalized networking experience.
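To give a sense of the relevance-scoring step, here is a toy sketch in which simple keyword overlap stands in for the LLM-based scoring done via ZeroWidth; the names and scoring heuristic are illustrative assumptions, not the production algorithm.

```python
# Illustrative relevance scoring: keyword overlap stands in for the
# LLM relevance scoring the real system performs via ZeroWidth.

def relevance_score(user_goal: str, contact_bio: str) -> float:
    """Score how relevant a contact is to the user's stated networking goal."""
    goal_terms = set(user_goal.lower().split())
    bio_terms = set(contact_bio.lower().split())
    if not goal_terms:
        return 0.0
    return len(goal_terms & bio_terms) / len(goal_terms)

def rank_contacts(goal: str, contacts: list) -> list:
    """Order contacts by descending relevance to the goal."""
    return sorted(contacts, key=lambda c: relevance_score(goal, c["bio"]), reverse=True)

contacts = [
    {"name": "A", "bio": "ml engineer building ai products"},
    {"name": "B", "bio": "pastry chef"},
]
print(rank_contacts("break into ai products", contacts)[0]["name"])  # → A
```

In the actual system, the score and the follow-up message are both generated by the model from the user's goal and the enriched profile, so recommendations adapt as goals change.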
I started with low-fidelity wireframes to lay out the core flows: capturing a contact, viewing enriched profiles, and receiving AI-driven follow-up suggestions. I focused on minimizing user effort, ensuring that adding a contact took no more than three taps and that smart recommendations felt accessible, not intrusive. These early wireframes helped test navigation logic, screen hierarchy, and how users might search or filter contacts before moving into more detailed visual and interaction design.

My interaction and design iterations were guided primarily by user testing, where I observed users complete tasks. By noting moments of hesitation, confusion, and delay, I mapped where users instinctively look for certain features and which actions they performed most often in the app. With this information, I re-oriented the hierarchy to favor organization and strategization tasks over instant contact collection.

Easily navigate between the application's core functionalities through vibrant interactive elements.
With NFC, contact-form, and ID OCR capture, Contacty provides a one-stop shop for however you want to network.
Conveniently store your customized ID in your Apple Wallet for easy information sharing, supplementing Apple's existing sharing mechanisms with more customizable options.
Powered by a custom scraping algorithm and ZeroWidth LLM, receive actionable recommendations for achieving your networking goals most efficiently.
Using our "infinite tag" feature, find any contact you have made in the past through notes, emojis, tags, pictures, date, profession or event.
Customize both the information shared through your virtual ID as well as what your ID looks like to give you a dynamic experience.

We ran usability tests using high-fidelity Figma prototypes with 186 users from our target audience. Each participant was asked to complete tasks such as capturing a new contact, searching for a past connection, and acting on an AI-generated follow-up suggestion. Key insights:
Building Contacty has been one of the most rewarding challenges I’ve taken on. Designing a tool that turns something as messy and human as networking into a clear, strategic process forced me to balance technical feasibility with real-world needs. Working at the intersection of product design, AI, and systems thinking pushed me to grow quickly, and made every iteration feel meaningful.
*Contacty was formerly known as "Linky".





I’m incredibly grateful to have worked alongside an ambitious and thoughtful team. Thank you to Jasmine Meziou, Javier Araiza, Koka Gugunava, Carmen Rodríguez, Facundo Kim, Patrick Jun, and Daniela Guerra for your insight, late nights, and commitment to making Contacty real. Presenting our work at the UC Berkeley Haas School of Business as part of a startup incubator was a full-circle moment; it helped validate what we were building and reminded us of the impact this could have beyond the classroom.
As we continue developing Contacty, we’re focused on:
1. Expanding AI recommendations with broader datasets
2. Integrating with calendar tools
3. Launching a public beta to gather deeper user feedback
Grateful for how far we’ve come, and excited about what’s ahead!
Role
Product Designer, System Architect, Field Research Lead
Timeline
August 2022 - December 2024
Tools
KoboToolbox, Garmin eTrex GPS, OpenStreetMap (OSM), QGIS, PostGIS, Twilio SMS API, RapidPro
Teammates
Community volunteers ③
Refugee informants ②
Humanitarian workers ②
Telecom SMS engineers ①
Burundian refugees moving between Tanzania and Burundi often travel without maps, internet, or verified information, relying on word-of-mouth while navigating dangerous, unmarked routes. This reflects a broader global migration challenge: the absence of reliable, low-tech systems to communicate real-time safety along human migration paths.
We built a geofencing-based SMS alert system for Burundian refugees moving between Tanzania and Burundi, achieving 92% delivery success on feature phones. It serves as a potential replicable model for low-tech, migration-focused communication systems globally.
We conducted 48 interviews with Burundian refugees, 4 focus groups in Nyarugusu camp, and 12 additional interviews with humanitarian workers and volunteers, which helped us uncover communication barriers, route knowledge gaps, and the heavy reliance on word-of-mouth among those migrating across the Tanzania–Burundi corridor.

Due to the widespread ownership of mobile phones, and their existing use during migration for communication and exchanging vital route information, it became clear that phones offered several relevant affordances. The following quotes from refugee interviewees further affirm this insight:

Location zones were created collaboratively using Garmin eTrex devices and verbal mapping from refugee and aid workers. Points were logged, validated, and geofenced using QGIS and PostGIS. To address privacy and security risks:

Refugees and volunteers contributed to mapping safe and danger zones by sharing local route knowledge and GPS waypoints, which I helped translate into structured geofence zones using QGIS.
Through interviews, paper maps, and in-field validation, we collaboratively identified high-risk and aid-rich locations. These became the foundation of our alert system and ensured our mapped zones reflected lived reality on the ground.
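The geofence check behind the SMS alerts can be sketched as a point-in-polygon test. This is a simplified illustration: zone polygons here are (lon, lat) tuples standing in for the validated GPS waypoints, and the deployed system ran the containment query server-side in PostGIS rather than in application code.

```python
# Sketch of the geofence check that triggers SMS alerts. Polygon data and
# the alert message are illustrative; the real system stored zones in
# PostGIS and dispatched messages through the Twilio SMS API / RapidPro.

def point_in_zone(lon: float, lat: float, polygon: list) -> bool:
    """Ray-casting point-in-polygon test for a single geofence zone."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from (lon, lat) cross this polygon edge?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def alerts_for_position(lon: float, lat: float, zones: list) -> list:
    """Return alert messages for every zone containing the reported position."""
    return [z["message"] for z in zones if point_in_zone(lon, lat, z["polygon"])]

danger_zone = {
    "message": "Caution: unmarked crossing ahead",
    "polygon": [(30.0, -3.0), (30.2, -3.0), (30.2, -3.2), (30.0, -3.2)],
}
print(alerts_for_position(30.1, -3.1, [danger_zone]))
```

Running the check on every reported position, and sending only the matching zone messages, is what keeps the system usable over SMS on feature phones.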






Role
AI Education Facilitator – User Research & Engagement Lead
Timeline
March 2021 — April 2021
Tools
Figma, Chimera Painter, Photoshop, Google Slides, Miro
Teammates
ML Engineer (Google AI)
Festival Participants (300+)
Creative Technologists







Role
UX & Interaction Designer
• Designed scan-to-add feature, store mapping, and gesture recognition
• Drove user interviews, in-store testing, and iteration
• Partnered on backend product database integration and gesture workflows
Timeline
August 2025 - January 2026
Tools
Figma, Snap Lens Studio, OpenAI APIs, Gemini Live (vision), MongoDB, Notion
Teammates
• Isabella Wang
• Alistar Xiao
• Cody Qiushi Chen
• Edna Ho
• Aarya Harkare
• Katherin Velazquez
(Software Engineers, UX Researchers, Product Managers)
Clients
Snap Inc.
Sephora
Retail shoppers struggle to get quick, trustworthy product context while browsing shelves, especially for items with subtle differences (e.g., fragrances, cosmetics). Phone-based tools interrupt the physical experience and require cognitive switching, and existing in-store signage is static and generic. Ultimately, purchase confidence is low and product returns at Sephora stores are high.
We delivered a wearable AR shopping assistant built on Snapchat Spectacles for Berkeley, CA Sephora stores. Backed by over $30k from Snap Inc., the system brings product information, reviews, and comparisons directly to the shelf using vision, voice, and gestures, helping shoppers make decisions with confidence, without pulling out their phones.
The project began by visiting Sephora retail stores in Berkeley, CA, observing shopper behavior, and speaking directly with people while they browsed. I focused on moments of hesitation, comparison, and uncertainty.
Key insights:
1. Shoppers often want confirmation rather than deep research.
2. Phones interrupt the browsing rhythm.
3. Wearable interactions must feel lightweight and optional.
These insights pushed the design away from dense overlays and toward glanceable, on‑demand interactions.
After identifying Sephora shopper needs and pain points, I worked with fellow experience designers to pinpoint where wearable AR could add value in the in-store journey. We evaluated the affordances of Snapchat Spectacles alongside lessons from AR retail and wayfinding systems such as Amazon Go and Standard Cognition. This phase also included returning to stores to introduce shoppers to the Spectacles and gather feedback on comfort, social perception, and when AR support felt helpful versus intrusive.
I identified that the Spectacles could be a useful intervention if the interaction principles were grounded in:
1. Keeping interactions hands‑free and glanceable.
2. Giving users control over when information appears.
3. Designing for imperfect conditions such as noise, lighting, and shelf clutter.
With research insights in place, we moved into alignment and negotiation with Snapchat and Sephora to define what could realistically be built, tested, and deployed within platform and retail constraints. These conversations helped narrow the scope to interactions that were technically feasible, socially acceptable, and valuable to both the platform and the retail context. With the two clients, we developed the user journey seen below.
Our initial interface used bright, high-contrast visuals to draw attention and stand out against Sephora’s shelves. User testing quickly showed the opposite effect: the colors distracted from the products, clashed with the store environment, and made the AR feel disconnected from the in-store experience shoppers valued. Based on this feedback, I pivoted to a more minimal, transparent interface that supported presence rather than competing for attention.
In the second iteration, the interface was redesigned to visually recede into the environment. Instead of asking users to focus on the AR layer, the overlay was treated as a subtle augmentation of the shelf itself. Text was minimized, color was used sparingly, and transparency allowed the physical product to remain the primary visual anchor. This shift aligned more closely with why users came into the store in the first place: to see, feel, and experience products in person, with AR acting as quiet support rather than the center of attention.
Lens Studio was still a new platform with limited documentation, so much of the work involved defining patterns as we built. I led the design and development of item recognition and checkout, grounding the flow in familiar scan-to-pay interactions. Through iteration and in-store testing, barcode scanning proved to be the most reliable AR checkout approach. In parallel, I designed a navigation system that surfaced only relevant labels based on eye line and head movement, using a cascading, dismissible layout that kept focus on the shelves rather than the interface.
I designed gesture recognition based on how shoppers naturally move in Sephora, such as hovering while deciding, sliding products to compare, or stepping back to scan a shelf. These behaviors informed simple, low-effort gestures for revealing details, comparing items, and confirming actions. The gesture set was grounded in in-store observation and established research on embodied interaction and public gesture usability, and was iteratively tested in-store to ensure it felt intuitive and socially comfortable in crowded aisles.
Testing was conducted through repeated in-store demos and short guided trials with shoppers using live Spectacles prototypes. We observed how quickly users understood each interaction, where they hesitated, and when they disengaged or reverted to their phones. Feedback from these sessions directly informed iteration cycles, including simplifying gesture sets, reducing on-screen text, adjusting overlay timing, and refining scan-to-add reliability under varied lighting and shelf conditions. Each iteration was validated back in the store, ensuring changes improved confidence and flow in real shopping environments rather than controlled lab settings.
This project highlighted how much care is required to introduce new technology into everyday, shared spaces. The strongest outcomes came from designing with restraint, allowing AR to support existing shopping behaviors rather than compete with them.
I’m deeply grateful to my teammates on this project for their collaboration across research, design, and engineering, and to our partners at Snapchat and Sephora for their trust, support, and openness to experimentation. We are continuing to iterate on the experience through ongoing user testing, with a focus on improving spatial reliability, refining interaction comfort, and validating longer-term impact on shopping confidence and behavior.