Utilize AI facial recognition alongside brain data to capture emotional and cognitive state, providing a scientific basis for your product's Aha Moment.

Every great SaaS product, design, or marketing leader knows that to captivate new customers, their product needs to be sticky, providing continuous Aha Moments throughout the journey. Unfortunately, traditional software analytics cannot determine where these occur, because an Aha Moment is a state of the human, not of the software. This is where Pericror comes in, solving the analytics on the human side.

We use two streams of analysis to identify Aha Moments, beginning with AI-based image recognition and facial analysis, which can reveal heart rate and emotional state.

We layer onto this data additional insights from an Emotiv headset, which provides brain data that we baseline against happy and accomplished states.

Our Aha Moment detection service

Our Aha Moment detection service is an end-to-end solution providing actionable insights to product teams. We specialize in SaaS solutions, helping identify the best and worst moments of human-centered software.


Recruitment:

We begin by recruiting individuals for the study. These individuals are not specialists and represent a 'non-expert' view of your software.


Setup & Use:

We handle all the physical infrastructure of the lab environment for collecting brain and image data. We train baselines for each respondent, giving our results high confidence.


Insights:

We provide you with insights on the specific points in your users' journey, and the specific screens and actions they completed that resulted in their optimal Aha Moment. As an extension of this service, we can also identify stress and dissatisfaction points within your product that should be refined for a better user response.

HOW WE CALCULATE AHA MOMENTS

An Aha Moment is an objective state in which the user is happiest and feels accomplishment and satisfaction in using your product. Since these are states internal to an individual, traditional software measurements of clickpaths and usage cannot capture them. AI processing of video of the user's expressions, alongside brain data, provides the insights needed to detect these moments objectively.
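As an illustration of how two such streams could be fused, the sketch below z-scores each signal against its own session baseline and flags frames where both rise well above it. The signal names, threshold, and fusion rule are illustrative assumptions, not Pericror's production algorithm.

```python
import numpy as np

def detect_aha_moments(emotion_valence, brain_engagement, threshold=1.5):
    """Flag frames where both signals rise well above their own baselines.

    Each stream is z-scored against its full-session mean and standard
    deviation; an 'aha' candidate is any frame where both streams exceed
    `threshold` standard deviations at the same time.
    """
    def zscore(x):
        x = np.asarray(x, dtype=float)
        return (x - x.mean()) / x.std()

    e, b = zscore(emotion_valence), zscore(brain_engagement)
    return np.flatnonzero((e > threshold) & (b > threshold))

# Toy session: both signals spike together at frame 4.
valence    = [0, 0, 0, 0, 5, 0, 0, 0]
engagement = [0, 0, 0, 0, 5, 0, 0, 0]
print(detect_aha_moments(valence, engagement))  # → [4]
```

Requiring agreement between the two streams is one simple way to reduce false positives from either sensor alone.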


Emotional Processing

Our AI conducts an emotional analysis based on the micro-expressions detectable on the face. We can also identify flushing of the skin to infer heart rate, giving us a physiological data point on the participant throughout product usage.
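The flushing-based heart-rate idea can be sketched in a few lines: the cardiac pulse slightly modulates skin color frame to frame, so the dominant frequency of that modulation within the physiological band approximates the pulse rate. This is a minimal remote-photoplethysmography sketch assuming a per-frame mean green-channel trace is already extracted from the face region; production pipelines use far more robust signal processing.

```python
import numpy as np

def estimate_heart_rate(green_means, fps=30.0):
    """Estimate heart rate (BPM) from per-frame mean green-channel values.

    Takes the FFT of the detrended trace and returns the strongest
    frequency in the plausible pulse band (0.7-4 Hz, i.e. 42-240 BPM).
    """
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()              # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)       # plausible pulse band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                      # Hz -> beats per minute

# Synthetic check: a 1.2 Hz pulse (72 BPM) buried in sensor noise.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
trace = (0.5 * np.sin(2 * np.pi * 1.2 * t)
         + np.random.default_rng(0).normal(0, 0.1, t.size))
print(round(estimate_heart_rate(trace, fps)))  # → 72
```

Ten seconds of video at 30 fps gives 0.1 Hz frequency resolution, i.e. roughly 6 BPM granularity; longer windows sharpen the estimate.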


Brain Data Analysis

In addition to visual data, we process brain data using the Emotiv SDK. We train and baseline on happy, satisfied, irritated, and frustrated states so that we can objectively detect recurrence of these patterns during product usage.
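Baselining of this kind can be sketched as a nearest-centroid classifier: average the feature vectors from labeled calibration windows per state, then label new windows by their closest centroid. The feature vectors and state labels below are hypothetical placeholders, not the Emotiv SDK's actual data format or API.

```python
import numpy as np

def train_baselines(windows, labels):
    """Return one mean feature vector (centroid) per labeled state."""
    windows, labels = np.asarray(windows, dtype=float), np.asarray(labels)
    return {s: windows[labels == s].mean(axis=0) for s in np.unique(labels)}

def classify(window, baselines):
    """Label a new window by its nearest state centroid (Euclidean)."""
    return min(baselines, key=lambda s: np.linalg.norm(window - baselines[s]))

# Hypothetical 2-D feature windows from a calibration session.
calib = [[0.9, 0.1], [0.8, 0.2],    # recorded while prompted "happy"
         [0.1, 0.9], [0.2, 0.8]]    # recorded while prompted "frustrated"
states = ["happy", "happy", "frustrated", "frustrated"]

baselines = train_baselines(calib, states)
print(classify(np.array([0.85, 0.15]), baselines))  # → happy
```

Because the centroids come from each participant's own calibration session, the classifier adapts to individual differences rather than relying on population-level thresholds.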

We look forward to driving success into your business.
