Web API Documentation

Welcome to the Eyeris EmoVu Web API documentation. Here you can find information about consuming the Image, Image Frame, and Video endpoints. All requests use the HTTP POST method.

Image Endpoint

Use the image endpoint to process static images. All common image formats, including .jpg, .bmp, .png, .tif, and .pgm, are supported. Below are the request headers and form encoded parameters for this endpoint.

Request Headers

Name

Condition

Type

Range

Description

LicenseKey

Required

String

Contact us

A license key string provided by us.

Form Encoded Parameters

Name

Condition

Type

Description

computeAgeGroup

Optional

Boolean

True or False

A flag indicating whether to compute the age group metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEmotions

Optional

Boolean

True or False

A flag indicating whether to compute the emotions metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEyeOpenness

Optional

Boolean

True or False

A flag indicating whether to compute the eye openness measure. Setting this value to False may speed up the algorithm. Default is True.

computeGender

Optional

Boolean

True or False

A flag indicating whether to compute the gender metrics. Setting this value to False may speed up the algorithm. Default is True.

computeIdentity

Optional

Boolean

True or False

A flag indicating whether to compute the face recognition metrics. Setting this value to False may speed up the algorithm. Default is False.

computeMetrics

Optional

Boolean

True or False

A flag indicating whether to compute the engagement and mood metrics. Setting this value to False may speed up the algorithm. Default is True.

faceAnalysisType

Optional

String

AnalyzeLargestDetectedFace or AnalyzeAllDetectedFaces

Determines whether the algorithm will compute metrics for all detected faces or only for the largest face. Setting this value to AnalyzeLargestDetectedFace may speed up the algorithm. Default is AnalyzeLargestDetectedFace.

imageFile

Required

Binary

The input image. Can be either a color or grayscale image.

minimumFaceDetectionConfidence

Optional

Integer

Minimum of 1

The minimum confidence to accept a candidate face detection window. Takes an integer value with a minimum of 1. Default is 3.

minimumFaceWidth

Optional

Integer

1 to image width

Sets the minimum width of a face to be analyzed in pixels. Default is 20.

poseThreshold

Optional

Double

0 to PI/2

The maximum head pose angle to be processed, in radians. This is computed as the angle between the optical axis of the acquisition sensor and the off-axis head pose vector. Default: PI/2 radians.
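The parameters above can be assembled into a request with Python's `requests` library. This is a minimal sketch: the base URL and license key are placeholder assumptions, and the request is only prepared here, not sent.

```python
import requests

# Sketch of a POST to the image endpoint. The URL below is a placeholder
# assumption; substitute the actual EmoVu Web API host.
req = requests.Request(
    "POST",
    "https://api.example.com/emovu/image",
    headers={"LicenseKey": "YOUR-LICENSE-KEY"},   # required header
    data={                                        # optional form parameters
        "computeEmotions": "True",
        "computeIdentity": "False",
        "faceAnalysisType": "AnalyzeLargestDetectedFace",
        "minimumFaceWidth": "20",
    },
    files={"imageFile": ("photo.jpg", b"<image bytes>")},  # required parameter
)
prepared = req.prepare()  # builds the multipart/form-data body without sending
# A real client would now send it, e.g. requests.Session().send(prepared)
```

Note that `imageFile` travels as a multipart file part while the flags travel as ordinary form fields; `requests` produces that encoding automatically when both `data` and `files` are given.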

Below is a sample response and a description of each of the output variables.

A Sample Image Response (JSON)
{
  "FaceAnalysisResults": [
    {
      "AgeGroupResult": {
        "AgeGroup": "Adult",
        "Computed": true,
        "Confidence": 1.0
      },
      "EmotionResult": {
        "Anger": 0.003,
        "Computed": true,
        "Disgust": 0.033,
        "Fear": 0.001,
        "Joy": 0.954,
        "Neutral": 0.007,
        "Sadness": 0.0,
        "Surprise": 0.0
      },
      "FaceTrackerResult": {
        "FaceRectangle": {
          "Height": 137,
          "Left": 106,
          "Top": 93,
          "Width": 148
        },
        "HeadPose": {
          "Pitch": -4.968,
          "Roll": 7.977,
          "Yaw": -5.463
        },
        "TrackingId": 29359
      },
      "GenderResult": {
        "Computed": true,
        "Confidence": 0.994,
        "Gender": "Female"
      },
      "IdentityResult": {
        "Identity": -1,
        "Computed": false,
        "Confidence": 0.0
      },
      "MetricResult": {
        "Attention": 0.877,
        "Computed": true,
        "Expressiveness": 0.954,
        "NegativeMood": 0.033,
        "PositiveMood": 0.954,
        "Valence": 0.494
      }
    }
  ],
  "ProcessingTime": 227,
  "Tracked": true
}
Response Variables

Name

Type

Range

Description

FaceAnalysisResults

An array containing all of the facial analysis results for each detected face.

FaceAnalysisResult.AgeGroupResult

Encompasses the age recognition results.

FaceAnalysisResult.AgeGroupResult.AgeGroup

String

Child, Young Adult, Adult, and Senior

The predicted age group label.

FaceAnalysisResult.AgeGroupResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The age group prediction confidence.

FaceAnalysisResult.EmotionResult

Encompasses the emotion recognition results. The emotion recognition module produces an intensity measure for seven universal expressions. All expressions are normalized and sum to 1.

FaceAnalysisResult.EmotionResult.Anger

Double

0 (absent) and 1 (maximum intensity)

The anger intensity measurement.

FaceAnalysisResult.EmotionResult.Disgust

Double

0 (absent) and 1 (maximum intensity)

The disgust intensity measurement.

FaceAnalysisResult.EmotionResult.Fear

Double

0 (absent) and 1 (maximum intensity)

The fear intensity measurement.

FaceAnalysisResult.EmotionResult.Joy

Double

0 (absent) and 1 (maximum intensity)

The joy intensity measurement.

FaceAnalysisResult.EmotionResult.Neutral

Double

0 (absent) and 1 (maximum intensity)

The neutral intensity measurement.

FaceAnalysisResult.EmotionResult.Sadness

Double

0 (absent) and 1 (maximum intensity)

The sadness intensity measurement.

FaceAnalysisResult.EmotionResult.Surprise

Double

0 (absent) and 1 (maximum intensity)

The surprise intensity measurement.

FaceAnalysisResult.FaceTrackerResult

Contains the face tracking results including the head pose information.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle

The rectangle enclosing the largest face detected in the image.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Height

Integer

1 to the image height.

The height of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Left

Integer

0 to the image width-1.

The left-most x coordinate of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Top

Integer

0 to the image height-1.

The top-most y coordinate of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Width

Integer

1 to the image width.

The width of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.HeadPose

Contains the head pose measurement with respect to the acquisition sensor in Euler angles.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Pitch

Double

-90 to +90

The pitch angle in degrees.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Roll

Double

-180 to +180

The roll angle in degrees.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Yaw

Double

-90 to +90

The yaw angle in degrees.

FaceAnalysisResult.FaceTrackerResult.TrackingId

Integer

A unique tracking id for each face that is persisted across frames. A new id is assigned to a face that enters the field of view.

FaceAnalysisResult.GenderResult

Encompasses the gender recognition results.

FaceAnalysisResult.GenderResult.Gender

String

Female and Male

The predicted gender label.

FaceAnalysisResult.GenderResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The gender prediction confidence.

FaceAnalysisResult.IdentityResult

Contains the face recognition results for the given face.

FaceAnalysisResult.IdentityResult.Identity

Integer

The identity id of the recognized face based on the provided gallery. Returns -1 if the face was not identified.

FaceAnalysisResult.IdentityResult.Computed

Boolean

True or False

A flag indicating whether the face recognition metrics were computed.

FaceAnalysisResult.IdentityResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The identity prediction confidence.

FaceAnalysisResult.MetricResult

Contains engagement and emotion metrics that are derived from the emotion recognition result and head pose information.

FaceAnalysisResult.MetricResult.Attention

Double

0 (no attention; the head pose vector is orthogonal to the sensor's optical axis) and 1 (full attention; the head pose vector is parallel to the sensor's optical axis)

The attention measurement. This value indicates how close the subject's head is facing towards the acquisition sensor.

FaceAnalysisResult.MetricResult.Computed

Boolean

True or False

A flag indicating whether the metric result was computed.

FaceAnalysisResult.MetricResult.Expressiveness

Double

0 to 1

Also referred to as the "interaction" metric. Equals the intensity of the dominant emotion for the frame, or of the second most dominant emotion when Neutral is dominant.

FaceAnalysisResult.MetricResult.NegativeMood

Double

0 to 1

Calculated as the maximum negative emotion intensity. The negative emotions are Anger, Sadness, Fear, and Disgust.

FaceAnalysisResult.MetricResult.PositiveMood

Double

0 to 1

Calculated as the maximum positive emotion intensity. The positive emotions are Joy and Surprise.

FaceAnalysisResult.MetricResult.Valence

Double

0 to 0.5

Calculated as the average of the maximum negative emotion intensity (includes Anger, Sadness, Fear, and Disgust) and the maximum positive emotion intensity (includes Joy and Surprise).

ProcessingTime

Integer

The processing time in milliseconds.

Tracked

Boolean

True or False

A flag indicating whether a face was tracked in the image.
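The derived mood metrics defined above can be reproduced directly from the seven emotion intensities. The sketch below re-derives them from the sample image response and matches the reported values (Valence agrees to rounding):

```python
import json

# The sample image response from above, abbreviated to the fields used here.
sample = json.loads("""
{
  "FaceAnalysisResults": [{
    "EmotionResult": {
      "Anger": 0.003, "Computed": true, "Disgust": 0.033, "Fear": 0.001,
      "Joy": 0.954, "Neutral": 0.007, "Sadness": 0.0, "Surprise": 0.0
    },
    "MetricResult": {
      "Attention": 0.877, "Computed": true, "Expressiveness": 0.954,
      "NegativeMood": 0.033, "PositiveMood": 0.954, "Valence": 0.494
    }
  }],
  "ProcessingTime": 227, "Tracked": true
}
""")

def mood_metrics(emotion_result):
    """Re-derive NegativeMood, PositiveMood, Valence, and Expressiveness."""
    e = {k: v for k, v in emotion_result.items() if k != "Computed"}
    negative = max(e[k] for k in ("Anger", "Sadness", "Fear", "Disgust"))
    positive = max(e[k] for k in ("Joy", "Surprise"))
    # Expressiveness: the dominant intensity, or the runner-up when
    # Neutral is the dominant emotion.
    ordered = sorted(e.values(), reverse=True)
    expressiveness = ordered[1] if max(e, key=e.get) == "Neutral" else ordered[0]
    return negative, positive, (negative + positive) / 2, expressiveness

face = sample["FaceAnalysisResults"][0]
neg, pos, valence, expr = mood_metrics(face["EmotionResult"])
```

Here `neg` is 0.033 (Disgust), `pos` is 0.954 (Joy), and `valence` is their average, 0.4935, which the API reports rounded as 0.494.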

Image Frame Endpoint

Use the imageframe endpoint to extract age, gender, emotion, face recognition results, and more from an image sequence. The API response from the previous frame should be provided as an input parameter when processing the current frame, in order to avoid re-invoking the costly face detection process.

Request Headers

Name

Condition

Type

Range

Description

LicenseKey

Required

String

Contact us

A license key string provided by us.

Form Encoded Parameters

Name

Condition

Type

Description

ageGroupComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's age group in milliseconds. For instance, if set to 1000 an AgeGroupResult will only be produced every one second. Set to zero to compute an AgeGroupResult for every frame.

computeAgeGroup

Optional

Boolean

True or False

A flag indicating whether to compute the age group metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEmotions

Optional

Boolean

True or False

A flag indicating whether to compute the emotions metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEyeOpenness

Optional

Boolean

True or False

A flag indicating whether to compute the eye openness measure. Setting this value to False may speed up the algorithm. Default is True.

computeGender

Optional

Boolean

True or False

A flag indicating whether to compute the gender metrics. Setting this value to False may speed up the algorithm. Default is True.

computeIdentity

Optional

Boolean

True or False

A flag indicating whether to compute the face recognition metrics. Setting this value to False may speed up the algorithm. Default is False.

computeMetrics

Optional

Boolean

True or False

A flag indicating whether to compute the engagement and mood metrics. Setting this value to False may speed up the algorithm. Default is True.

emotionsComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's emotions in milliseconds. For instance, if set to 1000 an EmotionResult will only be produced every one second. Set to zero to compute an EmotionResult for every frame.

eyeOpennessComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's eye openness in milliseconds. For instance, if set to 1000 an EyeOpennessResult will only be produced every one second. Set to zero to compute an EyeOpennessResult for every frame.

faceAnalysisType

Optional

String

AnalyzeLargestDetectedFace or AnalyzeAllDetectedFaces

Determines whether the algorithm will compute metrics for all detected faces or only for the largest face. Setting this value to AnalyzeLargestDetectedFace may speed up the algorithm. Default is AnalyzeLargestDetectedFace.

faceDetectionInterval

Optional

Integer

Minimum of -1

A value that determines the time interval in milliseconds between applying the face detection process to the frames. Setting the value to -1 when using the AnalyzeLargestDetectedFace mode will result in face detection only being applied when the face being tracked is lost. Setting the value to -1 when using the AnalyzeAllDetectedFaces mode is equivalent to 0 (apply face detection on each frame).

genderComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's gender in milliseconds. For instance, if set to 1000 a GenderResult will only be produced every one second. Set to zero to compute a GenderResult for every frame.

identityComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's identity in milliseconds. For instance, if set to 1000 an IdentityResult will only be produced every one second. Set to zero to compute an IdentityResult for every frame.

imageFile

Required

Binary

The input image. Can be either a color or grayscale image.

metricComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's engagement and mood metrics in milliseconds. For instance, if set to 1000 a MetricResult will only be produced every one second. Set to zero to compute a MetricResult for every frame.

minimumFaceDetectionConfidence

Optional

Integer

Minimum of 1

The minimum confidence to accept a candidate face detection window. Takes an integer value with a minimum of 1. Default is 3.

minimumFaceWidth

Optional

Integer

1 to image width

Sets the minimum width of a face to be analyzed in pixels. Default is 20.

poseThreshold

Optional

Double

0 to PI/2

The maximum head pose angle to be processed, in radians. This is computed as the angle between the optical axis of the acquisition sensor and the off-axis head pose vector. Default: PI/2 radians.

previousFrameResult

Required

String

The JSON serialized image frame result from the previous analyzed frame. The concept of this endpoint is to provide the previous frame result as the initial seed for analyzing the current frame. Set to null if analyzing the first frame in the sequence.

smoothingWindowTimeSpan

Optional

Integer

Minimum of 0

The time span in milliseconds of previous recognition results to include in performing the smoothing of the current recognition result. Default: 250.

timestamp

Required

Integer

Minimum of 0

The timestamp of the image frame in milliseconds. The timestamp should be relative to the first frame in the sequence.
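The previousFrameResult chaining described above can be sketched as a small helper that assembles the per-frame form parameters. Only the two sequencing parameters from the table are shown; sending the request and merging in the remaining parameters is left to the caller.

```python
import json

def build_form(timestamp_ms, previous_result=None):
    """Form parameters for one imageframe call.

    previous_result is the deserialized response of the previous frame,
    or None for the first frame in the sequence (sent as "null").
    """
    return {
        "timestamp": str(timestamp_ms),
        "previousFrameResult": "null" if previous_result is None
                               else json.dumps(previous_result),
    }

# First frame: no seed yet, so previousFrameResult is "null".
form0 = build_form(0)
# Pretend the API returned this result for frame 0 ...
frame0_result = {"Timestamp": 0, "Tracked": True}
# ... and feed it in when processing the frame at t = 33 ms.
form1 = build_form(33, frame0_result)
```

Timestamps are relative to the first frame, so a 30 fps sequence would use roughly 0, 33, 67, 100, ... milliseconds.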

Below is a sample response and a description of each of the output variables.

A Sample Image Frame Response (JSON)
{
  "PreviousComputedAgeGroupTimestamp": 0,
  "PreviousComputedEmotionsTimestamp": 0,
  "PreviousComputedEyeOpennessTimestamp": 0,
  "PreviousComputedGenderTimestamp": 0,
  "PreviousComputedIdentityTimestamp": 0,
  "PreviousFaceDetectionTimestamp": 0,
  "FaceAnalysisSequenceResults": [
    {
      "MetricSequenceResult": {
        "Computed": true,
        "EmotionLift": 1,
        "Engagement": 0.879
      }
    }
  ],
  "Timestamp": 0,
  "FaceAnalysisResults": [
    {
      "AgeGroupResult": {
        "AgeGroup": "Adult",
        "Computed": true,
        "Confidence": 1
      },
      "EmotionResult": {
        "Anger": 0,
        "Computed": true,
        "Disgust": 0,
        "Fear": 0,
        "Joy": 1,
        "Neutral": 0,
        "Sadness": 0,
        "Surprise": 0
      },
      "EyeOpennessResult": {
        "Computed": true,
        "LeftEye": 0.967,
        "RightEye": 0.982
      },
      "FaceTrackerResult": {
        "FacePoints": [
          {
            "Type": "OuterRightEyeBrowCorner",
            "X": 38.097,
            "Y": 82.933
          },
          {
            "Type": "OuterMiddleRightEyeBrow",
            "X": 46.082,
            "Y": 79.325
          },
          ...
          {
            "Type": "InnerLowerRightLip",
            "X": 67.394,
            "Y": 155.989
          }
        ],
        "FaceRectangle": {
          "Height": 87,
          "Left": 38,
          "Top": 77,
          "Width": 93
        },
        "HeadPose": {
          "Pitch": -4.247,
          "Roll": 8.047,
          "Yaw": -5.88
        },
        "TrackingId": 42
      },
      "GenderResult": {
        "Computed": true,
        "Confidence": 0.962,
        "Gender": "Female"
      },
      "IdentityResult": {
        "Identity": 0,
        "Computed": false,
        "Confidence": 0
      },
      "MetricResult": {
        "Attention": 0.879,
        "Computed": true,
        "Expressiveness": 1,
        "NegativeMood": 0,
        "PositiveMood": 1,
        "Valence": 0.5
      }
    }
  ],
  "ProcessingTime": 125,
  "Tracked": true
}
Response Variables

Name

Type

Range

Description

FaceAnalysisResults

An array containing all of the facial analysis results for each detected face.

FaceAnalysisResult.AgeGroupResult

Encompasses the age recognition results.

FaceAnalysisResult.AgeGroupResult.AgeGroup

String

Child, Young Adult, Adult, and Senior

The predicted age group label.

FaceAnalysisResult.AgeGroupResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The age group prediction confidence.

FaceAnalysisResult.EmotionResult

Encompasses the emotion recognition results. The emotion recognition module produces an intensity measure for seven universal expressions. All expressions are normalized and sum to 1.

FaceAnalysisResult.EmotionResult.Anger

Double

0 (absent) and 1 (maximum intensity)

The anger intensity measurement.

FaceAnalysisResult.EmotionResult.Disgust

Double

0 (absent) and 1 (maximum intensity)

The disgust intensity measurement.

FaceAnalysisResult.EmotionResult.Fear

Double

0 (absent) and 1 (maximum intensity)

The fear intensity measurement.

FaceAnalysisResult.EmotionResult.Joy

Double

0 (absent) and 1 (maximum intensity)

The joy intensity measurement.

FaceAnalysisResult.EmotionResult.Neutral

Double

0 (absent) and 1 (maximum intensity)

The neutral intensity measurement.

FaceAnalysisResult.EmotionResult.Sadness

Double

0 (absent) and 1 (maximum intensity)

The sadness intensity measurement.

FaceAnalysisResult.EmotionResult.Surprise

Double

0 (absent) and 1 (maximum intensity)

The surprise intensity measurement.

FaceAnalysisResult.FaceTrackerResult

Contains the face tracking results including the head pose information.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle

The rectangle enclosing the largest face detected in the image.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Height

Integer

1 to the image height.

The height of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Left

Integer

0 to the image width-1.

The left-most x coordinate of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Top

Integer

0 to the image height-1.

The top-most y coordinate of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Width

Integer

1 to the image width.

The width of the rectangle in pixels.

FaceAnalysisResult.FaceTrackerResult.HeadPose

Contains the head pose measurement with respect to the acquisition sensor in Euler angles.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Pitch

Double

-90 to +90

The pitch angle in degrees.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Roll

Double

-180 to +180

The roll angle in degrees.

FaceAnalysisResult.FaceTrackerResult.HeadPose.Yaw

Double

-90 to +90

The yaw angle in degrees.

FaceAnalysisResult.FaceTrackerResult.TrackingId

Integer

A unique tracking id for each face that is persisted across frames. A new id is assigned to a face that enters the field of view.

FaceAnalysisResult.GenderResult

Encompasses the gender recognition results.

FaceAnalysisResult.GenderResult.Gender

String

Female and Male

The predicted gender label.

FaceAnalysisResult.GenderResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The gender prediction confidence.

FaceAnalysisResult.IdentityResult

Contains the face recognition results for the given face.

FaceAnalysisResult.IdentityResult.Identity

Integer

The identity id of the recognized face based on the provided gallery. Returns -1 if the face was not identified.

FaceAnalysisResult.IdentityResult.Computed

Boolean

True or False

A flag indicating whether the face recognition metrics were computed.

FaceAnalysisResult.IdentityResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The identity prediction confidence.

FaceAnalysisResult.MetricResult

Contains engagement and emotion metrics that are derived from the emotion recognition result and head pose information.

FaceAnalysisResult.MetricResult.Attention

Double

0 (no attention; the head pose vector is orthogonal to the sensor's optical axis) and 1 (full attention; the head pose vector is parallel to the sensor's optical axis)

The attention measurement. This value indicates how close the subject's head is facing towards the acquisition sensor.

FaceAnalysisResult.MetricResult.Computed

Boolean

True or False

A flag indicating whether the metric result was computed.

FaceAnalysisResult.MetricResult.Expressiveness

Double

0 to 1

Also referred to as the "interaction" metric. Equals the intensity of the dominant emotion for the frame, or of the second most dominant emotion when Neutral is dominant.

FaceAnalysisResult.MetricResult.NegativeMood

Double

0 to 1

Calculated as the maximum negative emotion intensity. The negative emotions are Anger, Sadness, Fear, and Disgust.

FaceAnalysisResult.MetricResult.PositiveMood

Double

0 to 1

Calculated as the maximum positive emotion intensity. The positive emotions are Joy and Surprise.

FaceAnalysisResult.MetricResult.Valence

Double

0 to 0.5

Calculated as the average of the maximum negative emotion intensity (includes Anger, Sadness, Fear, and Disgust) and the maximum positive emotion intensity (includes Joy and Surprise).

ProcessingTime

Integer

The processing time in milliseconds.

Tracked

Boolean

True or False

A flag indicating whether a face was tracked in the image.

Video Endpoint

Use the video endpoint to process video files. All common video formats, including .avi, .flv, .mpg, .mov, and .mp4, are supported. Below are the request headers and form encoded parameters for this endpoint.

Request Headers

Name

Condition

Type

Range

Description

LicenseKey

Required

String

Contact us

A license key string provided by us.

Form Encoded Parameters

Name

Condition

Type

Description

ageGroupComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's age group in milliseconds. For instance, if set to 1000 an AgeGroupResult will only be produced every one second. Set to zero to compute an AgeGroupResult for every frame.

computeAgeGroup

Optional

Boolean

True or False

A flag indicating whether to compute the age group metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEmotions

Optional

Boolean

True or False

A flag indicating whether to compute the emotions metrics. Setting this value to False may speed up the algorithm. Default is True.

computeEyeOpenness

Optional

Boolean

True or False

A flag indicating whether to compute the eye openness measure. Setting this value to False may speed up the algorithm. Default is True.

computeGender

Optional

Boolean

True or False

A flag indicating whether to compute the gender metrics. Setting this value to False may speed up the algorithm. Default is True.

computeIdentity

Optional

Boolean

True or False

A flag indicating whether to compute the face recognition metrics. Setting this value to False may speed up the algorithm. Default is False.

computeMetrics

Optional

Boolean

True or False

A flag indicating whether to compute the engagement and mood metrics. Setting this value to False may speed up the algorithm. Default is True.

emotionsComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's emotions in milliseconds. For instance, if set to 1000 an EmotionResult will only be produced every one second. Set to zero to compute an EmotionResult for every frame.

eyeOpennessComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's eye openness in milliseconds. For instance, if set to 1000 an EyeOpennessResult will only be produced every one second. Set to zero to compute an EyeOpennessResult for every frame.

faceAnalysisType

Optional

String

AnalyzeLargestDetectedFace or AnalyzeAllDetectedFaces

Determines whether the algorithm will compute metrics for all detected faces or only for the largest face. Setting this value to AnalyzeLargestDetectedFace may speed up the algorithm. Default is AnalyzeLargestDetectedFace.

faceDetectionInterval

Optional

Integer

Minimum of -1

A value that determines the time interval in milliseconds between applying the face detection process to the frames. Setting the value to -1 when using the AnalyzeLargestDetectedFace mode will result in face detection only being applied when the face being tracked is lost. Setting the value to -1 when using the AnalyzeAllDetectedFaces mode is equivalent to 0 (apply face detection on each frame).

genderComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's gender in milliseconds. For instance, if set to 1000 a GenderResult will only be produced every one second. Set to zero to compute a GenderResult for every frame.

identityComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's identity in milliseconds. For instance, if set to 1000 an IdentityResult will only be produced every one second. Set to zero to compute an IdentityResult for every frame.

metricComputeInterval

Optional

Integer

Minimum of 0

This value determines the processing interval for computing a subject's engagement and mood metrics in milliseconds. For instance, if set to 1000 a MetricResult will only be produced every one second. Set to zero to compute a MetricResult for every frame.

minimumFaceDetectionConfidence

Optional

Integer

Minimum of 1

The minimum confidence to accept a candidate face detection window. Takes an integer value with a minimum of 1. Default is 3.

minimumFaceWidth

Optional

Integer

1 to image width

Sets the minimum width of a face to be analyzed in pixels. Default is 20.

poseThreshold

Optional

Double

0 to PI/2

The maximum head pose angle to be processed, in radians. This is computed as the angle between the optical axis of the acquisition sensor and the off-axis head pose vector. Default: 0.3491 radians.

processingFrameRate

Optional

Double

0 to Double.Max

The number of frames to process per second of video. For example, if an input video has a frame rate of 30 fps and processingFrameRate is set to 15, only 15 of the 30 frames in each second will be processed, and the response will contain results for only half of the frames. This parameter is useful when the user does not need results for every frame, as it speeds up processing. Default: 0 (processes all frames in the video).

viewedMediaDuration

Optional

Double

0 to Double.Max

This parameter synchronizes the result of a streamed webcam video with the stimuli presented to the end user. It is useful when buffering of the stimuli (for example, due to a poor internet connection) causes the recorded video of the end user watching the stimuli to fall out of sync with the stimuli's timestamps. viewedMediaDuration sets the known duration of the stimuli in seconds, and the result of the recorded video is compressed to match it. Default: 0 (result duration matches the input video).

outputFrameRate

Optional

Double

0 to Double.Max

Resamples the result to the specified frame rate in frames per second. This parameter is useful when the frame rate of the stimuli presented to the end user differs from the frame rate of the recorded web cam video of the end user watching the stimuli. The outputFrameRate parameter sets the resampling frame rate of the request response to the specified frame rate of the stimuli video. Default: 0 (no resampling).

smoothingWindowTimeSpan

Optional

Integer

Minimum of 0

The time span in milliseconds of previous recognition results to include in performing the smoothing of the current recognition result. Default: 250.

videoFile

Required

Binary

The input video file.
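The effect of processingFrameRate on the result timeline can be worked through numerically. This is an illustrative sketch, not part of the API; the server's exact frame-selection rule may differ.

```python
def processed_timestamps(input_fps, processing_fps, n_frames):
    """Timestamps (ms) of the frames that are actually analyzed.

    A processing_fps of 0 means every frame is processed (the default).
    """
    step = max(1, round(input_fps / processing_fps)) if processing_fps else 1
    frame_ms = 1000.0 / input_fps
    return [round(i * frame_ms, 3) for i in range(0, n_frames, step)]

all_frames = processed_timestamps(30, 0, 6)    # default: every frame
half_rate  = processed_timestamps(30, 15, 6)   # every second frame
```

For a 30 fps input, `all_frames` is spaced ~33.333 ms apart (matching the Timestamp values in the sample response below), while `half_rate` keeps only every second frame, spaced ~66.667 ms apart.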

Below is a sample response and the description of each of the output variables.

A Sample Video Response (JSON)
{
  "VideoFrameResults": [
    {
      "FaceAnalysisSequenceResults": [
        {
          "MetricSequenceResult": {
            "Computed": true,
            "EmotionLift": 0.14,
            "Engagement": 0.522
          }
        }
      ],
      "Timestamp": 0.0,
      "FaceAnalysisResults": [
        {
          "AgeGroupResult": {
            "AgeGroup": "Adult",
            "Computed": true,
            "Confidence": 1.0
          },
          "EmotionResult": {
            "Anger": 0.14,
            "Computed": true,
            "Disgust": 0.106,
            "Fear": 0.029,
            "Joy": 0.003,
            "Neutral": 0.672,
            "Sadness": 0.046,
            "Surprise": 0.005
          },
          "FaceTrackerResult": {
            "FaceRectangle": {
              "Height": 246,
              "Left": 148,
              "Top": 5,
              "Width": 294
            },
            "HeadPose": {
              "Pitch": 14.089,
              "Roll": 4.622,
              "Yaw": -0.642
            },
            "TrackingId": 42
          },
          "GenderResult": {
            "Computed": true,
            "Confidence": 0.868,
            "Gender": "Female"
          },
          "IdentityResult": {
            "Identity": -1,
            "Computed": false,
            "Confidence": 0.0
          },
          "MetricResult": {
            "Attention": 0.765,
            "Computed": true,
            "Expressiveness": 0.14,
            "NegativeMood": 0.14,
            "PositiveMood": 0.005,
            "Valence": 0.074
          }
        }
      ],
      "ProcessingTime": 120,
      "Tracked": true
    },
    {
      "FaceAnalysisSequenceResults": [
        {
          "MetricSequenceResult": {
            "Computed": true,
            "EmotionLift": 0.14,
            "Engagement": 0.522
          }
        }
      ],
      "Timestamp": 33.333,
      "FaceAnalysisResults": [
        {
          "AgeGroupResult": {
            "AgeGroup": "Adult",
            "Computed": true,
            "Confidence": 1.0
          },
          "EmotionResult": {
            "Anger": 0.14,
            "Computed": true,
            "Disgust": 0.106,
            "Fear": 0.029,
            "Joy": 0.003,
            "Neutral": 0.672,
            "Sadness": 0.046,
            "Surprise": 0.005
          },
          "FaceTrackerResult": {
            "FaceRectangle": {
              "Height": 246,
              "Left": 148,
              "Top": 5,
              "Width": 294
            },
            "HeadPose": {
              "Pitch": 14.089,
              "Roll": 4.622,
              "Yaw": -0.642
            },
            "TrackingId": 42
          },
          "GenderResult": {
            "Computed": true,
            "Confidence": 0.868,
            "Gender": "Female"
          },
          "IdentityResult": {
            "Identity": -1,
            "Computed": false,
            "Confidence": 0.0
          },
          "MetricResult": {
            "Attention": 0.765,
            "Computed": true,
            "Expressiveness": 0.14,
            "NegativeMood": 0.14,
            "PositiveMood": 0.005,
            "Valence": 0.074
          }
        }
      ],
      "ProcessingTime": 121,
      "Tracked": true
    }
  ]
}
Response Variables

Name

Type

Range

Description

VideoFrameResults

An array of video frame results, one for each frame processed in the video, encompassing all of the results for that frame.

VideoFrameResult.FaceAnalysisResults

An array containing all of the facial analysis results for each detected face.

VideoFrameResult.FaceAnalysisResult.AgeGroupResult

Encompasses the age recognition results.

VideoFrameResult.FaceAnalysisResult.AgeGroupResult.AgeGroup

String

Child, Young Adult, Adult, or Senior

The predicted age group label.

VideoFrameResult.FaceAnalysisResult.AgeGroupResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The age group prediction confidence.

VideoFrameResult.FaceAnalysisResult.EmotionResult

Encompasses the emotion recognition results. The emotion recognition module produces an intensity measure for seven universal expressions. All expressions are normalized and sum to 1.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Anger

Double

0 (absent) to 1 (maximum intensity)

The anger intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Disgust

Double

0 (absent) to 1 (maximum intensity)

The disgust intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Fear

Double

0 (absent) to 1 (maximum intensity)

The fear intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Joy

Double

0 (absent) to 1 (maximum intensity)

The joy intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Neutral

Double

0 (absent) to 1 (maximum intensity)

The neutral intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Sadness

Double

0 (absent) to 1 (maximum intensity)

The sadness intensity measurement.

VideoFrameResult.FaceAnalysisResult.EmotionResult.Surprise

Double

0 (absent) to 1 (maximum intensity)

The surprise intensity measurement.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult

Contains the face tracking results including the head pose information.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.FaceRectangle

The rectangle enclosing the detected face in the frame.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Height

Integer

1 to the image height.

The height of the rectangle in pixels.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Left

Integer

0 to the image width-1.

The left-most x coordinate of the rectangle in pixels.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Top

Integer

0 to the image height-1.

The top-most y coordinate of the rectangle in pixels.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.FaceRectangle.Width

Integer

1 to the image width.

The width of the rectangle in pixels.
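The rectangle is given as a top-left corner plus a size, so the opposite corner must be derived by the client. A minimal sketch, using the FaceRectangle values from the example response above:

```python
# FaceRectangle from the example response: top-left corner plus size, in pixels.
rect = {"Height": 246, "Left": 148, "Top": 5, "Width": 294}

# Derive the opposite corner (exclusive bounds, suitable for slicing/cropping).
right = rect["Left"] + rect["Width"]    # right-most x coordinate
bottom = rect["Top"] + rect["Height"]   # bottom-most y coordinate

print(right, bottom)  # 442 251
```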

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.HeadPose

Contains the head pose measurement with respect to the acquisition sensor in Euler angles.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.HeadPose.Pitch

Double

-90 to +90

The pitch angle in degrees.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.HeadPose.Roll

Double

-180 to +180

The roll angle in degrees.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.HeadPose.Yaw

Double

-90 to +90

The yaw angle in degrees.

VideoFrameResult.FaceAnalysisResult.FaceTrackerResult.TrackingId

Integer

A unique tracking id for each face, persisted across frames. A new id is assigned when a face enters the field of view (FOV).

VideoFrameResult.FaceAnalysisResult.GenderResult

Encompasses the gender recognition results.

VideoFrameResult.FaceAnalysisResult.GenderResult.Gender

String

Female or Male

The predicted gender label.

VideoFrameResult.FaceAnalysisResult.GenderResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The gender prediction confidence.

VideoFrameResult.FaceAnalysisResult.IdentityResult

Contains the face recognition results for the given face.

VideoFrameResult.FaceAnalysisResult.IdentityResult.Identity

Integer

The identity id of the recognized face based on the provided gallery. Returns -1 if the face was not identified.

VideoFrameResult.FaceAnalysisResult.IdentityResult.Computed

Boolean

True or False

A flag indicating whether the face recognition metrics were computed.

VideoFrameResult.FaceAnalysisResult.IdentityResult.Confidence

Double

0 (no confidence) to 1 (maximum confidence)

The identity prediction confidence.

VideoFrameResult.FaceAnalysisResult.MetricResult

Contains engagement and emotion metrics that are derived from the emotion recognition result and head pose information.

VideoFrameResult.FaceAnalysisResult.MetricResult.Attention

Double

0 (no attention; the head pose vector is orthogonal to the sensor's optical axis) to 1 (full attention; the head pose vector is parallel to the sensor's optical axis)

The attention measurement. This value indicates how closely the subject's head is oriented towards the acquisition sensor.

VideoFrameResult.FaceAnalysisResult.MetricResult.Computed

Boolean

True or False

A flag indicating whether the metric result was computed.

VideoFrameResult.FaceAnalysisResult.MetricResult.Expressiveness

Double

0 to 1

Also referred to as the "interaction" metric. Equals the intensity of the dominant emotion in the frame, or of the second most dominant emotion when Neutral is the dominant emotion.

VideoFrameResult.FaceAnalysisResult.MetricResult.NegativeMood

Double

0 to 1

Calculated as the maximum negative emotion intensity. The negative emotions are Anger, Sadness, Fear, and Disgust.

VideoFrameResult.FaceAnalysisResult.MetricResult.PositiveMood

Double

0 to 1

Calculated as the maximum positive emotion intensity. The positive emotions are Joy and Surprise.

VideoFrameResult.FaceAnalysisResult.MetricResult.Valence

Double

0 to 0.5

Calculated as the average of the maximum negative emotion intensity (includes Anger, Sadness, Fear, and Disgust) and the maximum positive emotion intensity (includes Joy and Surprise).

VideoFrameResult.FaceAnalysisResult.MetricSequenceResult

Contains sequence-dependent engagement and emotion metrics that are derived from a sequence of emotion recognition and head pose results.

VideoFrameResult.FaceAnalysisResult.MetricSequenceResult.EmotionLift

Double

0 to 1

Calculated as the average of the maximum emotion response of the current frame and that of the frame one second earlier. For frames within the first second of the video, the emotion lift equals the maximum emotion response of the current frame.

VideoFrameResult.FaceAnalysisResult.MetricSequenceResult.Engagement

Double

0 to 1

Calculated as the minimum of VideoFrameResult.FaceAnalysisResult.MetricResult.Attention and the average of VideoFrameResult.FaceAnalysisResult.MetricResult.Expressiveness and VideoFrameResult.FaceAnalysisResult.MetricSequenceResult.EmotionLift.
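The per-frame mood metrics above can be sketched directly from an EmotionResult. A minimal sketch using the emotion intensities from the example response; treating the "dominant emotion" by simple sorting is a straightforward reading of the definitions, and the computed values may differ slightly from the example output due to rounding in the service:

```python
# Emotion intensities from the example response (sum to ~1).
emotions = {"Anger": 0.14, "Disgust": 0.106, "Fear": 0.029,
            "Joy": 0.003, "Neutral": 0.672, "Sadness": 0.046,
            "Surprise": 0.005}

# NegativeMood: maximum intensity among the negative emotions.
negative_mood = max(emotions[e] for e in ("Anger", "Sadness", "Fear", "Disgust"))

# PositiveMood: maximum intensity among the positive emotions.
positive_mood = max(emotions[e] for e in ("Joy", "Surprise"))

# Valence: average of the two maxima above (hence the 0 to 0.5 range).
valence = (negative_mood + positive_mood) / 2

# Expressiveness: dominant intensity, or the runner-up when Neutral dominates.
ranked = sorted(emotions, key=emotions.get, reverse=True)
expressiveness = emotions[ranked[1] if ranked[0] == "Neutral" else ranked[0]]

print(negative_mood, positive_mood, round(valence, 4), expressiveness)
```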

VideoFrameResult.ProcessingTime

Integer

The processing time in milliseconds.

VideoFrameResult.Tracked

Boolean

True or False

A flag indicating whether a face was tracked in the image.

VideoFrameResult.Timestamp

Double

0 to Integer.Max

The time stamp of the video frame in milliseconds.
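Putting the schema together, a minimal sketch of walking a parsed response, grouping attention values by tracking id. The payload below is abbreviated from the example response above; in practice it would come from the endpoint's JSON reply (e.g. response.json() when using the requests library):

```python
import json

# Abbreviated payload following the response schema documented above.
payload = json.loads("""
{
  "VideoFrameResults": [
    {
      "Timestamp": 33.333,
      "Tracked": true,
      "FaceAnalysisResults": [
        {
          "FaceTrackerResult": {"TrackingId": 42},
          "MetricResult": {"Computed": true, "Attention": 0.765}
        }
      ]
    }
  ]
}
""")

# Collect attention per face, keyed by the persistent TrackingId.
attention_by_face = {}
for frame in payload["VideoFrameResults"]:
    if not frame["Tracked"]:
        continue  # no face was tracked in this frame
    for face in frame["FaceAnalysisResults"]:
        if face["MetricResult"]["Computed"]:
            tid = face["FaceTrackerResult"]["TrackingId"]
            attention_by_face.setdefault(tid, []).append(
                face["MetricResult"]["Attention"])

print(attention_by_face)  # {42: [0.765]}
```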