Turning AI into a Conversation: My Design Journey in Shaping Sagen AI’s Visual Personality

PROCESS HIGHLIGHTS

Tackling the Challenges and Shaping the Experience

Overview

Sagen AI is a customer service platform powered by conversational AI. What makes it different? It introduces interactive, customizable characters that can talk and respond in real time, so chatting with one feels like talking to a digital human, not a bot.

Timeline

2023 - 2024

Tools

Live2D
Figma
Adobe Photoshop
Slack
GitHub

Responsibilities

  • Character animation (viseme-based lip sync, expressions, and movement)

  • Designed customization features and the character library

  • Assisted with UI/UX layout and interaction design

  • Collaborated with art director & engineers

BACKGROUND

From Cold to Conversational

Customer service often feels cold and robotic, especially in chatbot interactions. Sagen AI aimed to change that by merging generative AI with personality. The goal was to create a product where customers wouldn’t just get answers; they’d feel like they were having a conversation. The team had observed how users gravitate toward character-rich apps, emotional design, and personalization, and that inspired the character-driven approach.

CHALLENGES

The Bumps in the Road

Making the Characters Feel Alive

Creating real-time viseme animation (lip sync with AI-generated speech), facial expressions, and gestures was both a technical and a design challenge. We needed the characters to feel alive without becoming uncanny or distracting.

My approach: I studied speech patterns and facial motion timing, then mapped phonemes to mouth shapes frame by frame, adjusting for latency. On top of that, I had to learn Live2D, a tool that was new to me, to bring the characters’ facial expressions and gestures to life, making sure the animations felt natural and engaging.
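
To make that mapping concrete, here is a minimal TypeScript sketch of the idea: group phonemes into a reduced set of mouth shapes, then shift each keyframe slightly earlier to absorb latency. The viseme set, the phoneme table, and the 80 ms offset are illustrative assumptions; the real mapping was authored by hand in Live2D.

```ts
// Illustrative phoneme-to-viseme mapping for lip sync.
// Names and values are hypothetical, not the production data.

type Viseme = "A" | "I" | "U" | "E" | "O" | "Closed";

// Coarse grouping of phonemes into mouth shapes (assumption:
// a reduced viseme set, common in real-time lip sync).
const PHONEME_TO_VISEME: Record<string, Viseme> = {
  AA: "A", AE: "A", AH: "A",
  IY: "I", IH: "I",
  UW: "U", UH: "U",
  EH: "E", EY: "E",
  OW: "O", AO: "O",
  M: "Closed", B: "Closed", P: "Closed",
};

interface PhonemeEvent {
  phoneme: string;
  startMs: number; // offset from the start of TTS playback
}

// Shift each mouth-shape keyframe earlier to compensate for
// audio/render latency, so the lips lead the sound slightly.
function buildLipSyncTrack(
  events: PhonemeEvent[],
  latencyMs = 80, // assumed end-to-end latency budget
): { viseme: Viseme; atMs: number }[] {
  return events.map(({ phoneme, startMs }) => ({
    viseme: PHONEME_TO_VISEME[phoneme] ?? "Closed",
    atMs: Math.max(0, startMs - latencyMs),
  }));
}
```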

Why 2D, not 3D?

While our characters had 3D-inspired details, we intentionally chose a 2D style to optimize performance. 3D models would have increased load times and required heavier rendering, which could frustrate users, especially on mobile. Using 2D let us strike a balance between visual appeal and fast, responsive interaction.

Customization That’s Fun, Not Overwhelming

We wanted to offer a wide range of customization (like The Sims), where users can build their own characters, but not everyone is a designer. How do we balance depth and simplicity?

My solution: I helped modularize the design into categories (hair, outfit, emotion, voice), designed icon-based navigation, and added quick preview interactions.
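
As a rough illustration of that modular structure, here is a hypothetical TypeScript data model for the four categories. The type names, option IDs, and preset are invented for this sketch; they are not the production schema.

```ts
// One entry in a category grid, shown as an icon with a label.
interface CustomizationOption {
  id: string;
  label: string;
  thumbnail: string; // icon shown in the category grid
}

type Category = "hair" | "outfit" | "emotion" | "voice";

// One slot per category keeps choices shallow: each tab is a
// single pick with a live preview, not a wall of sliders.
type CharacterConfig = Record<Category, CustomizationOption["id"]>;

// A preset bundles a full config so non-designers get a
// finished character in one tap, then tweak from there.
const PRESET_FRIENDLY: CharacterConfig = {
  hair: "hair_short_01",
  outfit: "outfit_casual_02",
  emotion: "emotion_warm",
  voice: "voice_female_soft",
};
```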

Responsive UI for Both Web & Mobile

Making sure the experience stayed seamless across platforms required UI layout adjustments and interaction tweaks.

My support: I helped adapt navigation hierarchies and component responsiveness in Figma, based on device testing.

Problem Discovery

While working on the project, we encountered problems that affected internal workflows. Our testing process was slow and dependent on multiple teams: designers had to wait for frontend engineers to push builds to the testing site before we could evaluate animations or interactions, causing delays.

  • Introduced personality presets for quicker emotional connection.

  • Added micro-interactions and idle animations to boost expressiveness.

  • Worked with engineers to develop an internal Sagen Editor so designers could test animations and behaviors directly, without waiting on frontend pushes (sketched below). This significantly cut revision time and improved iteration speed.
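
To give a feel for what the editor changed, here is a toy preview harness in the same spirit. Every name in it (loadCharacter, playMotion, the model path) is hypothetical and stands in for the actual Sagen Editor, which rendered real Live2D models instead of logging.

```ts
// Sketch of a designer-side preview loop: load a character,
// set an expression, and loop an idle motion, with no build step.

interface EditorCharacter {
  setExpression(name: string): void;
  playMotion(name: string, durationMs: number): Promise<void>;
}

// Stub for the sketch; the real editor rendered a Live2D model here.
function loadCharacter(modelPath: string): EditorCharacter {
  return {
    setExpression: (name) =>
      console.log(`[${modelPath}] expression: ${name}`),
    playMotion: (name, durationMs) =>
      new Promise<void>((resolve) => {
        console.log(`[${modelPath}] motion: ${name}`);
        setTimeout(resolve, durationMs);
      }),
  };
}

async function previewIdle() {
  const character = loadCharacter("models/sagen_host.model3.json");
  character.setExpression("smile_soft");
  // Loop an idle motion a few times to judge timing and feel.
  for (let i = 0; i < 3; i++) {
    await character.playMotion("idle_breathe", 2000);
  }
}

previewIdle();
```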

REFLECTION

Designing, Then Discovering

This project taught me how technical animation, emotional design, and user interaction can blend into one cohesive experience. I learned to translate abstract problems like “How can a chatbot feel human?” into practical, creative solutions.

It also sparked a deeper interest in the world of digital design, especially in how its many parts can work together to create engaging and human-centered experiences, and it made me want to take on more tech-driven design challenges. Looking ahead, I’d love to explore more adaptive character behaviors, like visual cues that react to user tone or real-time sentiment. Small emotional feedback loops like these could make future interactions feel more natural, expressive, and human.

Credits
  1. Viva Chu - Team Leader

  2. Dicky Abdurachman - Project Manager

  3. Kuswanto - Art Director

  4. Erri Kuswadi - Unity Developer

  5. Erwin PS - Unity Developer

  6. Aldi Daswanto - Frontend Developer

Reach out, hop on,
and let’s see where this ride goes.
Say hi!
yolanda.emmatyaa@gmail.com

Copyright © Yolanda Emmatya

Based in

Jakarta, Indonesia

Available for work

Freelance, Full-time