Sonar captures human behavior and opinions expressed through observed interactions, quotes, sentiment, and emotion. Supported by multiple AI categorization models, Sonar slices customer recordings into labeled, information-rich Experience Nuggets.
What is Experience Data?
Each nugget in your experience data combines a customer quote or observation with keyword tags, a sentiment or emotional reaction, and a video snippet as evidence. Sonar enables you to search and filter across your Experience Data repository to identify patterns, assemble playlists, and create and share insights.
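One way to picture such a nugget is as a simple record combining those parts. This is a minimal sketch for illustration only; the field names are our assumptions, not Sonar's actual data schema.

```python
from dataclasses import dataclass

# Illustrative sketch of an Experience Nugget as a record.
# Field names are assumptions, not Sonar's actual schema.
@dataclass
class ExperienceNugget:
    quote: str               # customer quote or observation
    keyword_tags: list[str]  # keywords that add context and searchability
    sentiment: str           # "negative", "neutral", or "positive"
    video_snippet_url: str   # video evidence backing the quote

nugget = ExperienceNugget(
    quote="I couldn't find the checkout button.",
    keyword_tags=["checkout", "navigation"],
    sentiment="negative",
    video_snippet_url="https://example.com/snippet.mp4",
)
```

Because every field is structured data, a repository of such records can be searched and filtered along any of these dimensions.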
Insights based on Evidence
An Insight in Sonar is a collection of analyzed Experience Data with detailed descriptions and automated data visualizations. By analyzing, processing, and aggregating experience data, Insights enable you to make better decisions by providing evidence of human behavior and experiences. In Sonar you can act on Insights by sharing them with any member of your team account.
Three AI-Powered Categorization Models
Whether you upload data yourself or leverage our end-to-end platform, all customer recordings are transcribed by our fully automated, multilingual speech-to-text model. To further categorize what your customers say and do, we have built three AI-powered categorization models: Keyword Tags, Sentiment, and Emotional Reaction.
Keyword Tags
A Keyword Tag is simply a keyword within an Experience Nugget. Keywords add context to a quote or observation and, even more importantly, make it searchable across customers, studies, and markets.
Sentiment
Our AI model's automated sentiment analysis scans the transcript to detect and categorize quotes as negative, neutral, or positive. The corresponding Experience Nugget is then labeled with that information, enabling you to analyze sentiment across all your CX data.
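As a rough sketch of the idea only, a sentiment step takes each quote and attaches one of the three labels. The toy word-list scoring below is purely illustrative; a real system like Sonar's would use a trained language model, not keyword matching.

```python
# Toy illustration of sentiment labeling, not Sonar's actual model:
# a real pipeline would use a trained classifier, not word lists.
NEGATIVE_WORDS = {"confusing", "broken", "frustrating", "slow"}
POSITIVE_WORDS = {"love", "easy", "great", "intuitive"}

def label_sentiment(quote: str) -> str:
    """Return 'negative', 'neutral', or 'positive' for a transcript quote."""
    words = set(quote.lower().split())
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(label_sentiment("the new flow is great and easy"))  # positive
print(label_sentiment("this form is confusing"))          # negative
```

Whatever the underlying model, the output is the same kind of three-way label that gets attached to the Experience Nugget.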
Emotional Reaction
Our Emotional Reaction AI model uses the camera feed of customer recordings to capture customers' emotional reactions as they watch your marketing material or interact with your product or prototype. The model identifies each emotional reaction from facial expression and adds the reaction as a label, plus an emoji 😊, to the corresponding observation.
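To make that labeling step concrete, here is a minimal sketch of how a detected expression could be turned into a label and emoji on an observation. The emotion categories and the fallback behavior are illustrative assumptions, not Sonar's actual taxonomy.

```python
# Illustrative mapping from a detected emotion category to an emoji label.
# The category names are assumptions, not Sonar's actual taxonomy.
EMOTION_EMOJI = {
    "happy": "😊",
    "surprised": "😮",
    "confused": "😕",
    "frustrated": "😠",
}

def label_observation(observation: str, detected_emotion: str) -> str:
    """Append the detected emotion and its emoji to an observation."""
    emoji = EMOTION_EMOJI.get(detected_emotion, "😐")  # neutral fallback
    return f"{observation} [{detected_emotion} {emoji}]"

print(label_observation("Customer smiles at the onboarding screen", "happy"))
```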
Next steps if you want to learn more