Six years shipping first-of-kind camera and photo features on Galaxy — from a blank canvas to two billion interactions.
I think about how things look and how they feel to use — and I don't think those are separate questions.
I'm a Lead Product Designer at Samsung R&D Institute India, where I've spent six years making AI feel natural on a phone. My background is in Fine Arts (BFA, Delhi University) and Interaction Design (M.Des, IIITDM Jabalpur) — which gives me an unusual lens: I care as much about the aesthetic quality of a moment as the usability of it.
The work I'm most proud of sits at a hard intersection — where cutting-edge AI capability meets the everyday person who just wants a beautiful photo. I've shipped Photo Assist (the first generative photo editing on any smartphone), Instruction-Based Editing, AI Portrait styling, and a draw-to-generate camera experience. I was promoted to Lead in 2024 and was one of a select few Bengaluru designers chosen for an on-site residency at Samsung UX HQ in Seoul.
Before Samsung, I designed B2B enterprise products at Infosys for BHP and Deutsche Bank — where I learned how design thinking can unlock commercial value, not just better UX. And before that, internships at Cafe Coffee Day, Kamalan and Faagio taught me the humbling basics of designing for real users with real constraints.
The first commercial generative photo-editing feature on any smartphone, launching on Galaxy Z Flip6 and Fold7. Designed the full experience — discovery, application, preview and saving of AI-generated edits.
Generative AI photo editing didn't exist on any smartphone. The challenge wasn't just building a feature — it was defining what this category should feel like on a phone. How do you make a complex AI capability feel effortless in a tool people use every day?
The stakes were high: this would be Samsung's flagship AI differentiator for the Galaxy Z Flip6 and Fold7 launch — and the first of its kind on any phone, anywhere.
Lead Product Designer — owned the end-to-end UX for Photo Assist, from early concept through to production. Worked directly with the Seoul-based tech team to understand model capabilities and constraints, then translated those into a user experience that felt intuitive rather than technical.
The design process centred on a core tension: the AI could do a lot, but users shouldn't have to know that. I designed the full interaction arc — how users discover the feature within the Gallery, how they apply an edit, how they preview the AI-generated result alongside the original, and how they save or iterate.
Close collaboration with the Seoul tech team was critical. The model had constraints — certain types of edits worked reliably, others didn't. Rather than exposing every capability, I worked with engineering to curate the experience, surfacing what the AI did well and gracefully handling edge cases.
The hardest design decision was what to leave out. A generative model can do a hundred things — the job was to pick the ten that would feel magical and hide the ninety that would feel broken.
Discovery within the Gallery — Photo Assist lives where photos already live. Rather than asking users to open a separate editing app, the feature is surfaced contextually when browsing photos, reducing friction to near zero.
Preview-first interaction — Users see the AI-generated edit alongside the original before committing. This builds trust in the AI and gives users a sense of control over an otherwise opaque process.
Constraint-led curation — Instead of exposing the full range of the generative model, the UX presents a curated set of edits matched to the specific photo. This kept the experience reliable and prevented the "uncanny valley" moments that erode user trust in AI.
First of its kind — launched as the first commercial generative photo-editing experience on any smartphone, debuting on the Galaxy Z Flip6 and Fold7.
Employee of the Year — awarded Samsung SRIB's highest individual recognition for the impact and quality of the work.
Established the playbook — the interaction patterns and design principles from Photo Assist became the foundation for subsequent AI editing features, including Instruction-Based Editing on the Galaxy S26.
Co-led UX for a natural-language photo editing system on Galaxy S26 — letting users edit images by describing what they want. Part of a unified visual experience strategy with features extending to Fold8.
Photo Assist proved users wanted AI editing. The next question: what if users could just describe what they want? The challenge was designing a natural-language interface for photo editing that felt intuitive to casual users while remaining powerful enough for those who wanted precision.
Co-lead Product Designer — shaped the interaction model for instruction-based editing and AI-suggested edits. Extended the design foundation laid by Photo Assist into a more expressive, language-driven paradigm. Also contributed to the broader "Capture → View → Edit → Manage → Share" experience strategy.
Shipped on Galaxy S26, with remaining features slated for Fold8 — a multi-cycle product shaped from the ground up.
Two firsts for Galaxy. Portrait AI — first mobile feature to retain facial identity during style transfer. Sketch-to-Image — Galaxy's first draw-to-generate experience. Selected for a residency at Samsung UX HQ in Seoul.
Two distinct challenges, one shared principle: make AI generation feel personal, not generic. For Portrait AI, the core problem was style transfer that doesn't destroy what makes a face recognisable. For Sketch-to-Image, turning rough drawings into polished visuals without making the user feel like the AI did all the work.
Lead Product Designer — designed the prompt systems for Portrait AI, scaling across 12+ visual styles while preserving facial identity. For Sketch-to-Image, designed the drawing-to-generation flow.
Selected as one of a small number of Bengaluru designers for an on-site residency at Samsung UX HQ in Seoul to co-develop both features directly with the research team.
Also developed a cross-cultural design framework for AI face features — addressing global facial markers and cultural sensitivity — adopted by the Seoul research team as standard protocol.
A partnership feature embedded into Galaxy A-series cameras. Designed the integration UX and lens qualification guidelines. The feature crossed 2 billion interactions.
Embedding a third-party experience (Snapchat Lenses) inside Samsung's native camera without it feeling foreign. The A-series audience skews younger — the experience had to feel native to Samsung while leveraging Snapchat's creative engine.
Designed the integration UX and established safe-zone guidelines for lens qualification. Proposed two features — Favoriting Lenses and Personalised Lenses — both adopted into the next release cycle.
Surpassed 2 billion cumulative interactions. Both proposed features shipped in the subsequent release — evidence the design thinking extended beyond the initial launch.
Five generations of Samsung Camera UI across One UI — owning visual specs, GUI/UI guides, and production handoffs with offshore engineering in China. Spans 6 form factors across the full Galaxy lineup.
Samsung Camera ships across S-series, Fold, Flip, Triple Fold, and Tablets — each with different screens, aspect ratios, and hardware. Maintaining a coherent, high-quality camera UI across all of them, over five OS generations, is an exercise in systems thinking.
Owned the full production pipeline: visual specs, GUI/UI documentation, design system guides for A & M-series, testing, and development handoffs with the offshore engineering team in China. In parallel, redesigned the Magnifier accessibility app across all six form factors and defined the Expert Raw Sky Guide design system — a visual language for Samsung's pro photography app covering planets, stars and constellations.
Three AI-native concepts explored over two years. None shipped in their original form. But they shaped what came after.
Three separate innovation explorations: an AI video editing suite aimed at making pro-level video edits accessible, an AI photo editing pipeline that predated Photo Assist, and a Creator's Camera mode designed for content creators who needed quick, polished output.
None shipped as originally conceived — but they weren't failures. The AI photo editing pipeline directly informed the design principles behind Photo Assist. The Creator's Camera exploration shaped Samsung's understanding of their pro-sumer audience. And the video editing work built a foundation of AI interaction patterns the team continues to draw on.
Showing unshipped work is a deliberate choice. These projects represent the kind of thinking that only comes from exploring without a guaranteed outcome — and that thinking is present in everything that did ship.