KAI & CURAI
Our challenge was to examine the poetics of interaction in the context of AI counseling, exploring it from the perspective of both clients and therapists.
The use of AI continues to grow as it offers efficient solutions to problems facing people and businesses, and quick ways of accessing relevant information. However, its use is now also expanding beyond task-oriented interactions into those that are imprecise, abstract, and emotional.
“For users, conversational AI offers for the first time a means to interact with technology using their own words. For technology to understand them, not the other way around.” — Andy Peart, Chief Marketing & Strategy Officer at Artificial Solutions.
We began the research process by exploring the possibilities and current limitations of artificial intelligence in counseling, focusing on interviews with counselors, research papers, and market studies of existing products. Our exploratory questions included:
How can we provide a solution to the social, financial, and temporal constraints of therapy?
A key finding was the lack of professional care received by individuals suffering from mental health issues, driven by the social stigma attached to mental health and therapy, the high cost of treatment, and a shortage of therapists. Approximately 56% of Americans suffering from mental illness do not receive treatment.
What does care look like in the context of AI and grief?
While psychological help may be required for a range of mental health problems, we chose to focus on the context of grieving individuals and the burden of processing the loss of a loved one. Research suggested that while grieving individuals look for support and comfort, they may find it hard to share these feelings with even their closest friends and family. This provided an additional opportunity to explore the role of "AI as an outsider".
How can we leverage AI as a low threshold entry point for counseling?
Our team believes strongly that AI cannot replace human connection. However, its role as an outsider is worth noting. Studies conducted at CMU suggest that individuals display lower levels of self-restraint and find it easier to initiate conversations with an AI than with human therapists. In addition, AI can provide a cost-effective answer to some of therapy's financial and temporal constraints.
How might we coalesce verbal and non-verbal interactions to improve conversational experiences in the context of therapy?
Our research on current AI therapists and voice assistants revealed several actionable opportunities. Most of the popular AI therapists on the market, like Woebot and Replika, currently rely on chat-based interfaces that do not fully leverage the potential of voice or visual interaction. On the other hand, voice assistants like Siri and Google Home, which make use of verbal and visual feedback, are constrained by a strongly utilitarian focus on performing tasks, and their visual language is limited by that functionality.
Universal Metaphors for an Abstract Visual Form
Our research found visualizations of AI counselors ranging across a wide spectrum, from android-like robotic forms and abstract spinning dots to graphical renditions of human figures. We believed it was imperative to steer clear of the uncanny valley of AI representations, so we chose an abstract visual representation that would feel approachable and relatable.
Artificial intelligence counselors — range of formal abstraction:
The metaphors of water and the sphere provided universally recognized cross-cultural references. Water, understood across a range of cultures, symbolizes calmness, fluidity, and clarity, while the sphere acts as a symbol of continuity, healing, and balance. Water as a material and the sphere as a concrete entity determined the visual metaphors of Kai's form.
Colors of Calm and Care
Color theory can be subjective, so it was important to user test our color explorations to inform our process. Purple, recognized across cultures as a symbol of grief and care, unanimously emerged as the primary color. Pairing it with a warm yellow brought in a quality of calmness that fit well with our previously defined design objectives.
A Human Voice Designed for Empathy
For voice designers working on a conversational AI, tone of voice is a critical touchpoint. In the past few years there has been an upsurge in technology that brings a humanistic quality to AI voices, which we sought to leverage. A humanistic voice rendered with comfort and care offered a warm counterbalance to our abstract visual form, creating a holistic experience built to soothe and comfort.
Motions for Non-Verbal Human Interactions
A key component of our motion design was building a vocabulary of non-verbal human interactions. This required us to push beyond the traditional listening and speaking states of conversational AIs to accommodate the pauses and prompts that are part of natural conversation. We studied the affordances of water as a material, experimenting with variables of frequency, amplitude, motion, and transparency. Through user testing, we learned the importance of symmetrical motion in keeping the form approachable.
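As a rough illustration of the variables we tuned, a symmetric surface displacement can be parameterized by frequency and amplitude. The function name and formula below are hypothetical, not the production animation code:

```python
import math

def surface_offset(x, t, frequency=1.0, amplitude=0.2):
    """Displacement of the sphere's surface at position x (-1..1) and time t.

    The envelope damps motion toward the edges, and every term is even in x,
    so the waveform stays symmetrical about the center, as our user testing
    suggested it should.
    """
    envelope = 1 - x * x  # zero at the edges, maximal at the center
    return amplitude * envelope * math.sin(2 * math.pi * frequency * t) * math.cos(math.pi * x)
```

Because the envelope and the cosine term are both even functions of x, mirrored points always move together, which is what keeps the motion readable as calm rather than erratic.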
DAP – Data, Assessment, Plan
The DAP framework is used by many therapists to track a patient's progress at each appointment. Sessions are recorded in notes that usually follow the format of Data (what the patient says), Assessment (the therapist's assessment of the patient's behavior), and Plan (short- and long-term plans set during the session). The DAP framework formed the basis for wireframing the counselor's dashboard.
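The DAP note maps naturally onto a simple record type, which is how we reasoned about the dashboard's data. The type, field names, and sample values below are an illustrative sketch, not Kai's actual schema:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DAPNote:
    """One session's note in the DAP format (hypothetical structure)."""
    session_id: str
    data: List[str]    # Data: verbatim patient quotes from the session
    assessment: str    # Assessment: therapist's read of the patient's behavior
    plans: List[str]   # Plan: short- and long-term goals set during the session

note = DAPNote(
    session_id="2024-03-01-kai",
    data=['"I haven\'t slept well since the funeral."'],
    assessment="Client reports disrupted sleep linked to grief.",
    plans=["Practice a wind-down routine before bed"],
)
```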
Each individual's life is built on a myriad of relationships, and these relationships are sometimes key to understanding the psychological factors behind a patient's current state of mind. We discovered an interesting graphical representation that therapists use to map and analyze detailed data about an individual's relationships: genograms. Genograms became a great starting point for exploring a type of data visualization already familiar to therapists.
When asked how AI might help in therapy, most counselors spoke strongly in favor of transcribing session conversations. This insight informed a key feature of the dashboard: automatic transcription, which would significantly improve the counselor's experience during the session and allow them to focus on assessing the patient.
We designed the main page around the components counselors need to access right before or after a session. Building on the DAP framework, the page presents data from the last session in the form of quotes, with assessments and plans in the form of notes. Additionally, we designed a central timeline that uses color to distinguish Kai sessions (purple) from clinic sessions (mint), allowing therapists to easily scrub through any past session and review the client's progress.
Building on genograms, we introduced thematics as an overview of a patient's relationships. Using lexical and concordance formats to structure this data, we envisioned a map of the most common subjects across the conversation history and their correlations, distinguished as important people and prevalent themes in the patient's life. This would allow therapists to see patterns in a patient's relationships and how those topics interconnect. We designed an interactive visualization that depicts frequency through circle size, positive (green) or negative (red) relationships through stroke color, and the intensity of those relationships through saturation.
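The visual encoding just described (circle size for frequency, stroke color for valence, saturation for intensity) can be sketched as a small mapping function. The scales and thresholds here are illustrative assumptions, not the values used in the final design:

```python
import math

def encode_theme(frequency, sentiment, intensity):
    """Map a theme's statistics to visual attributes.

    frequency: mention count across the conversation history
    sentiment: -1..1, negative to positive relationship
    intensity: 0..1, strength of the relationship
    """
    radius = 10 + 4 * math.sqrt(frequency)        # size encodes frequency
    stroke = "green" if sentiment >= 0 else "red"  # stroke color encodes valence
    alpha = 0.3 + 0.7 * intensity                  # saturation encodes intensity
    return {"radius": radius, "stroke": stroke, "alpha": alpha}

# e.g. a frequently mentioned, strained relationship
print(encode_theme(frequency=9, sentiment=-0.6, intensity=0.5))
```

The square root on frequency is one common way to keep very frequent themes from visually dominating the map; a linear or log scale would be an equally reasonable choice.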
Access to Session Transcripts
Quotes are categorized under keywords, and at each point the therapist has access both to the related quotes as said and to the full transcripts of the corresponding sessions. Accounting for possible AI error, this gives therapists the agency to check and interpret the conversation for themselves.
Using our insight about transcribing therapy sessions, we leveraged the iPad's rotation from horizontal to vertical as a way to switch into note-taking mode during a session. Accessed through a simple swipe, the note-taking section is divided into separate pages for notes and plans. Recording a specific quote is mapped to the therapist writing the (") symbol. These quotes, along with other assessment notes and plans, then automatically update on the dashboard at the end of the session.
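Once the handwritten notes are transcribed to text, the quote-capture rule can be as simple as scanning for spans wrapped in (") marks. This is a minimal sketch of that rule; the handwriting-recognition pipeline itself is assumed and not shown:

```python
import re

def extract_quotes(note_text):
    """Return every span the therapist wrapped in double-quote marks."""
    return re.findall(r'"([^"]+)"', note_text)

notes = 'Client tearful today. "I keep replaying our last call." Plan: journaling.'
print(extract_quotes(notes))  # ['I keep replaying our last call.']
```

Each extracted quote would then be filed under the Data section of the session's DAP note and surfaced on the dashboard.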