CONVERSATIONAL UX DESIGN
Building Empathy into AI:
How We Designed a Voiceflow Chatbot for Autism Support
🧭 The Problem:
Parents feel overwhelmed and alone while searching for answers
When a child is first diagnosed with autism, parents often find themselves suddenly thrust into a world of jargon, fragmented resources, and impossible decisions. Between late-night Googling, closed Facebook groups, and trying to make sense of the special education system, most parents end up asking the same questions.

Background
Femmecubator is a civic-tech, data-driven non-profit based in NYC. I was onboarded as a Sr. UX Designer to support their AI chatbot initiative, a collaboration with UC San Diego's Center for Autism. The project aims to serve 240k+ children in Southern California and expand to 2M+ affected families across the United States.
Role and Responsibilities
Led UX research (interviews, empathy mapping, persona synthesis)
Created landing page IA and wireframes
Mapped the chatbot architecture and persona, and built the chatbot prototype in Voiceflow, integrating NLU intents and flows
We set out to design something simple:
A chatbot you could just text your scenario to, and it would help you understand your options.
Without judgment, without overwhelm.
Parents don't need another 50-page guide to read through; they need a calm, trustworthy voice that listens and gives them answers tailored to their situation. The bot replies with information relevant to their state, their child's age, and their emotional state.
Over six weeks, I led the UX research and design of this AI-driven chatbot and its companion landing page. Every decision, from tone to layout, was rooted in real stories from parents. We used research insights to guide the content structure, the flow logic, and even how the bot responds when someone's feeling overwhelmed.
Key Design Decisions
1. NLU for responses, LLMs for sentiment
We built the chatbot's core responses with structured Natural Language Understanding (NLU) and used Large Language Models (LLMs) to generate introductory text and match voice and tone to user queries.
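To make that split concrete, here is a minimal sketch of the pattern in TypeScript (outside of Voiceflow): a structured intent matcher decides what the bot says, and an LLM call only labels how the user seems to be feeling. The intent names, keywords, model name, and the classifySentiment helper are illustrative assumptions, not our production setup.

```typescript
// Illustrative sketch of the NLU/LLM split, not the actual Voiceflow internals.
// Structured intents own the factual answer; the LLM only labels sentiment so
// the bot can pick an appropriate tone for the same underlying content.

import OpenAI from "openai"; // assumes the official openai npm package (v4+)

type Intent = "find_services" | "understand_diagnosis" | "school_support" | "fallback";
type Sentiment = "calm" | "stressed" | "urgent";

// Toy keyword matcher standing in for the NLU layer.
function matchIntent(utterance: string): Intent {
  const text = utterance.toLowerCase();
  if (/(therapy|services|provider)/.test(text)) return "find_services";
  if (/(diagnosis|diagnosed)/.test(text)) return "understand_diagnosis";
  if (/(school|iep|teacher)/.test(text)) return "school_support";
  return "fallback";
}

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// The LLM's only job here: label the parent's emotional state.
async function classifySentiment(utterance: string): Promise<Sentiment> {
  const res = await openai.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model name
    messages: [
      { role: "system", content: "Label the message as exactly one word: calm, stressed, or urgent." },
      { role: "user", content: utterance },
    ],
  });
  const label = (res.choices[0].message.content ?? "").trim().toLowerCase();
  return label === "stressed" || label === "urgent" ? label : "calm";
}
```

The point of the split is that the factual answer never depends on LLM generation; if the sentiment label is off, the worst case is a slightly mismatched tone, not wrong information.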
2. Emotionally Adaptive
The chatbot was designed to recognize emotional cues such as stress, confusion, or urgency and adjust its tone accordingly, thanks to LLMs!
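As a simplified illustration of what "emotionally adaptive" means in practice, a small tone wrapper could reuse the sentiment label from the sketch above and change only the framing around the same factual answer. The copy below is placeholder text, not the bot's actual responses.

```typescript
// Hypothetical tone wrapper: the verified answer stays the same,
// only the framing changes based on the detected sentiment.

type Sentiment = "calm" | "stressed" | "urgent";

function applyTone(answer: string, sentiment: Sentiment): string {
  switch (sentiment) {
    case "urgent":
      // Skip pleasantries and get straight to the information.
      return `Let's get you an answer right away.\n\n${answer}`;
    case "stressed":
      // Acknowledge the feeling first, then offer to slow things down.
      return `That sounds like a lot to carry, and you're not alone in this.\n\n${answer}\n\nWant me to break that into smaller steps?`;
    default:
      return answer;
  }
}
```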
3. Building an extensive Knowledge Base
Instead of offering generic advice, the chatbot delivered precise, relevant information drawn from sources verified by our team at UC San Diego, helping parents find answers that truly matched their child's needs and circumstances.
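A toy version of that lookup might tag every vetted entry with the state and age range it applies to, so the bot can filter down to the family's exact situation. The field names and shape below are assumptions made for illustration; the actual entries came from sources verified by the UC San Diego team.

```typescript
// Simplified knowledge-base lookup: entries are vetted ahead of time and tagged
// by state and child age range, so the bot only returns what matches the family.

interface KBEntry {
  topic: string;      // e.g. "early_intervention"
  state: string;      // e.g. "CA"
  minAge: number;     // child age range the guidance applies to, in years
  maxAge: number;
  content: string;    // the verified guidance itself
  sourceUrl: string;  // link back to the reviewed source
}

const knowledgeBase: KBEntry[] = [
  // ...entries reviewed by the team would live here...
];

function findAnswers(topic: string, state: string, childAge: number): KBEntry[] {
  return knowledgeBase.filter(
    (entry) =>
      entry.topic === topic &&
      entry.state === state &&
      childAge >= entry.minAge &&
      childAge <= entry.maxAge
  );
}
```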
The result:
A working prototype built in Voiceflow that feels more like texting a helpful friend. Focused on empathy, built with NLU, and designed to meet parents where they are.
Next Steps:
Test the prototype with more parent users
Launch the website and roll out the beta version of the chatbot there
Integrate Twilio APIs for SMS, WhatsApp, and Messenger (sketched below)
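Since the Twilio work is still a next step, here is only a rough sketch of what sending the bot's reply over SMS could look like with Twilio's Node.js SDK; WhatsApp uses the same messages API with whatsapp:-prefixed numbers. The environment variable names are placeholders.

```typescript
// Planned (not yet shipped) Twilio hookup for delivering replies over SMS.

import twilio from "twilio";

const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

async function sendBotReply(to: string, body: string): Promise<void> {
  await client.messages.create({
    from: process.env.TWILIO_PHONE_NUMBER!, // the bot's Twilio number
    to,                                     // the parent's phone number
    body,                                   // the tone-adjusted answer from the chatbot
  });
}
```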
It isn't easy to build AI Chatbots that deal with complex emotions
About 60% of the project went into defining content blocks, since we constantly missed utterances or failed to map the right emotions and user sentiments.
I learned how crucial tone, transparency, and giving users control are in helping people navigate stress and uncertainty.
Reach out to your peers!
This project pushed me to connect with Conversation Designers across my network to troubleshoot issues and to learn best practices and processes.
I now have a large community to lean on and build with, and I'm very grateful for it!









