Evaluation Metrics¶
Evaluation Metrics let you define individual evaluation criteria across different measurement types. You create Evaluation Metrics in the Evaluation Forms section using these measurement types.
You can access the Evaluation Metrics by going to Contact Center AI > Quality Management > Configure > Evaluation Forms > Evaluation Metrics.
The Evaluation Metrics page has the following options:
- Name: Shows the name of the Evaluation Metric.
- Metric Type: Shows the selected Evaluation Metric Type (Measurement Type).
- Evaluation Forms: Shows the Evaluation Forms used to configure and assign the evaluation metric to different channels and queues.
- Edit: Lets you edit or update an existing Evaluation Metric.
- Delete: Lets you select and delete any Evaluation Metric shown on the Evaluation Forms page.
- Search: Provides a quick search to find Evaluation Metrics by name.
Add New Evaluation Metrics¶
You can create a new Evaluation Metric by going to Contact Center AI > Quality Management > Configure > Evaluation Metrics > New Evaluation Metrics.
Steps to create a new Evaluation Metric:
1. Click the New Evaluation Metric button in the upper-right corner to configure the most commonly used evaluation metrics. The following screen appears, allowing you to select a type of evaluation metrics measurement.

2. Select the type of Evaluation Metrics Measurement: By Question, By Speech, By Playbook Adherence, or By Dialog Task.

The following tables describe the Evaluation Metrics Measurement Types:

| Evaluation Metrics Measurement Types | Description |
| --- | --- |
| By Question: Configures the metric and expected responses based on a specific question. | |
| Name | Enter a name for future reference of the metric. |
| Question | Enter the question against which the adherence check is done. It serves as a reference for the supervisor during audit and interaction evaluation. |
| Adherence Type | Provides two types of adherence to choose from. |
| Answer | Enter the expected answers relevant to your question (a few different utterances). Generative AI suggestions offer similar utterances with the same meaning, reducing setup time; you can enter or select more than one expected answer and delete added answers. If the adherence type is Static, define a similarity percentage for the metric based on the defined use case and attribute (see the sketch after this table). |
| Agent Attribute (Optional) | An evaluation metric can be assigned to only one agent attribute. |
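
The Static adherence check described above can be thought of as a text-similarity comparison between what the agent said and the configured expected answers. The sketch below is a minimal illustration of that idea only; the helper name, inputs, and the string-similarity scoring are assumptions for illustration and are not the product's actual implementation.

```python
# Minimal sketch of a "By Question" check with a Static similarity threshold.
# All names and the scoring method are illustrative assumptions, not the
# product's actual implementation.
from difflib import SequenceMatcher

def question_metric_qualifies(agent_utterances, expected_answers, similarity_pct):
    """Qualify the metric if any agent utterance matches any expected answer
    at or above the configured similarity percentage."""
    for utterance in agent_utterances:
        for answer in expected_answers:
            score = SequenceMatcher(None, utterance.lower(), answer.lower()).ratio() * 100
            if score >= similarity_pct:
                return True
    return False

# Expected answers configured for the metric (illustrative).
expected = [
    "Thank you for calling, how may I help you today?",
    "How can I assist you today?",
]
# Utterance taken from the agent's side of the transcript (illustrative).
spoken = ["hi, thanks for calling, how can i assist you today?"]
print(question_metric_qualifies(spoken, expected, similarity_pct=60))  # True for these inputs
```
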
| Evaluation Metrics Measurement Types | Description |
| --- | --- |
| By Speech: Configures a metric based on speech attributes like dead air and speaking rate. | |
| Name | Enter a name for future reference of the metric. |
| Speech Type | Provides the Speech Type options to select. Cross Talk Metric Qualification: an occurrence is evaluated as Cross Talk only when it lasts at least the configured Cross Talk duration. If the number of such Cross Talk instances is less than the number of instances configured for that Evaluation Form, the metric is qualified; if the number of instances exceeds that limit, the metric fails and the agent is penalized (see the sketch after this table). |
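
The Cross Talk qualification rule above reduces to a count-and-compare check. The sketch below illustrates that rule under stated assumptions: overlapping-speech durations are assumed to be already detected per call, and the function and parameter names are hypothetical, not part of the product.

```python
# Illustrative sketch of the Cross Talk qualification rule described above.
# The function and parameter names are hypothetical; the product applies this
# logic internally from the Evaluation Form configuration.

def cross_talk_metric_qualifies(overlap_durations_sec, min_duration_sec, max_instances):
    """An overlap counts as Cross Talk only if it lasts at least the configured
    Cross Talk duration; the metric qualifies while the count of such
    instances stays below the configured number of instances."""
    instances = sum(1 for d in overlap_durations_sec if d >= min_duration_sec)
    return instances < max_instances

# Example: three overlapping-speech segments detected in a call (seconds).
overlaps = [0.8, 2.5, 3.1]
# Only the 2.5s and 3.1s overlaps meet a 2.0s Cross Talk duration -> 2 instances,
# which is below a limit of 3, so the metric qualifies.
print(cross_talk_metric_qualifies(overlaps, min_duration_sec=2.0, max_instances=3))
```
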
| Evaluation Metrics Measurement Types | Description |
| --- | --- |
| By Dialog Task: Configures a metric based on adherence to the execution of dialog tasks. | |
| Name | Enter a name for future reference of the metric. |
| Select Dialog Task | Select a Dialog Task from the drop-down list. |
| Count Type | Provides two options based on the Count Type selected. |
| Agent Attribute (Optional) | An evaluation metric can be assigned to only one agent attribute. |

| Evaluation Metrics Measurement Types | Description |
| --- | --- |
| By Playbook Adherence: Configures a metric based on adherence to a playbook or a specific playbook step. | |
| Name | Enter a name for future reference of the metric. |
| Playbook Name | Select the Playbook from the drop-down list against which the metric evaluates adherence. |
| Adherence Type | Choose either Entire Playbook or Steps as the scope of adherence. If you select Steps, additional options appear. |
| Agent Attribute (Optional) | An evaluation metric can be assigned to only one agent attribute. |

Edit Evaluation Metrics¶
Steps to edit existing Evaluation Metrics:
1. Right-click to select any of the existing Evaluation Metrics (Name). The following screen appears.

2. Click Edit. The Evaluation Metrics dialog box appears, allowing you to update the required fields.

3. Edit the fields that you want to update.

   Note: All the fields are editable except the Evaluation Metrics Measurement Type and Agent Attribute (Optional).

4. Click Update to save the changes.