
Adding new labels to existing taxonomies

User permissions required: 'View Sources' AND 'Review and label'

 

If you have a pre-existing, mature taxonomy with many reviewed verbatims, adding a new label requires some additional training to bring it in line with the rest of the labels in the taxonomy.

 

When adding a new label to a well-trained taxonomy, you need to make sure to apply it to previously reviewed verbatims wherever it is relevant to them.

If you do not, the model will effectively have been taught that the new label should not apply to those verbatims, and it will struggle to predict the new label confidently.

The more reviewed examples there are in the dataset, the more training adding a new label will require (unless the label covers an entirely new concept that does not appear in older data but does appear frequently in more recent data).
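
To see why, it helps to remember that a multi-label model treats every label that was not applied to a reviewed verbatim as a negative example for that label. The snippet below is a minimal Python sketch of that idea only - it is not how the platform actually builds its training data, and the taxonomy, verbatims and helper function are all invented for illustration.

    # Minimal sketch (not the platform's training code): labels that were not
    # applied to a reviewed verbatim act as implicit negatives for that label.
    taxonomy = ["Claim > Confirmation", "Claim > Payment"]

    reviewed = [
        {"text": "Please confirm receipt of my claim", "labels": {"Claim > Confirmation"}},
        {"text": "When will my claim be paid?", "labels": {"Claim > Payment"}},
    ]

    def training_targets(reviewed, taxonomy):
        # Each taxonomy label is either a positive (applied) or an implicit
        # negative (not applied) for every reviewed verbatim.
        return [[1 if label in v["labels"] else 0 for label in taxonomy] for v in reviewed]

    print(training_targets(reviewed, taxonomy))   # [[1, 0], [0, 1]]

    # Add a new label to the taxonomy without revisiting old reviewed verbatims:
    taxonomy.append("Claim > Confirmation > Payment")
    print(training_targets(reviewed, taxonomy))   # [[1, 0, 0], [0, 1, 0]]
    # Every previously reviewed verbatim is now a negative for the new label,
    # even where it should actually apply.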


Key steps:

 

Create the new label when you find an example where it should apply


Find other examples where it should apply using a few different methods:

 

  1. You can search for key terms or phrases using the search function in Discover to find similar instances - this way you can apply the label in bulk if there are lots of similar examples in the search results
  2. Alternatively, you can search for key terms or phrases in Explore - this is potentially a better method, as you can filter to 'Reviewed' verbatims, and searching in Explore returns an approximate count of the number of verbatims that match your search terms (a rough offline version of this kind of filtered search is sketched after this list)
  3. You can also select labels that you think might often appear alongside your new label, and review the pinned examples for those labels to find examples where your new label should be applied
  4. Once you have a few pinned examples, see if the label starts to get predicted in 'Label' mode - if it does, add more examples using this mode
  5. Lastly, if you're labelling in a sentiment-enabled dataset and your new label is typically either positive or negative, you can also filter by positive or negative sentiment when looking at reviewed examples (though at present you cannot combine a text search with the 'Reviewed' filter AND a sentiment filter)
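
If you prefer to do a first pass offline, the rough Python sketch below shows the same kind of filtered search against a CSV export of reviewed verbatims. The file name, column names and key phrases are all assumptions - check the schema of your own export - and the output is just a shortlist of candidates to find and label in Explore.

    # Rough offline sketch: shortlist reviewed verbatims matching key phrases
    # for the new label. Column names below ('id', 'text') are assumptions.
    import csv

    KEY_PHRASES = ["payment confirmed", "proof of payment"]  # illustrative terms

    def shortlist(path):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                text = row["text"].lower()
                if any(phrase in text for phrase in KEY_PHRASES):
                    yield row["id"], row["text"]

    for verbatim_id, text in shortlist("reviewed_verbatims.csv"):  # assumed file name
        print(verbatim_id, "-", text[:80])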


Then use 'Missed label' to find more verbatims where the platform thinks the new label should have been applied:

 

  • Once you have labelled quite a few examples using the methods above and the model has had time to retrain, use the 'Missed label' functionality in Explore by selecting your label and then selecting 'Missed label' from the dropdown menu
  • This will show you reviewed verbatims where the model thinks the selected label may have been missed (a conceptual sketch of what this mode is doing follows this list)
  • In these instances, the model will show the label as a suggestion (as shown in the example below)
  • Apply the label to every verbatim where the model has correctly identified that it should have been applied
  • Keep training on this page until you have labelled all of the correct examples and this mode no longer shows you examples where the label should actually apply
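
Conceptually, 'Missed label' surfaces reviewed verbatims where the model's confidence for the selected label is high even though the label was never applied. The Python below illustrates that idea only, with invented data and an arbitrary threshold - it is not the platform's actual logic.

    # Conceptual sketch of what 'Missed label' surfaces. The data and the
    # confidence threshold are invented for illustration.
    NEW_LABEL = "Claim > Confirmation > Payment"
    THRESHOLD = 0.5  # arbitrary cut-off for this sketch

    reviewed = [
        {"id": "v1", "applied": {"Claim > Payment"}, "confidence": {NEW_LABEL: 0.83}},
        {"id": "v2", "applied": {NEW_LABEL}, "confidence": {NEW_LABEL: 0.91}},
        {"id": "v3", "applied": set(), "confidence": {NEW_LABEL: 0.12}},
    ]

    missed = [
        v["id"]
        for v in reviewed
        if v["confidence"].get(NEW_LABEL, 0.0) >= THRESHOLD
        and NEW_LABEL not in v["applied"]
    ]
    print(missed)  # ['v1'] - a candidate to re-review and label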


 

[Image: an example verbatim where the model correctly suggests that 'Claim > Confirmation > Payment' has been missed]

 

Then check how the new label performs on the Validation page (once the model has had time to retrain and calculate the new validation statistics) and see if more training is required.
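
The Validation page calculates these statistics for you. Purely to illustrate the kind of check involved, the Python below hand-rolls precision and recall for a single label at a fixed confidence threshold; all inputs are made up, and the platform's own metrics should be treated as authoritative.

    # Hand-rolled illustration of per-label precision and recall at a fixed
    # threshold. All inputs are invented; Validation computes the real figures.
    def precision_recall(truth, scores, threshold=0.5):
        predicted = [score >= threshold for score in scores]
        tp = sum(p and t for p, t in zip(predicted, truth))
        fp = sum(p and not t for p, t in zip(predicted, truth))
        fn = sum(not p and t for p, t in zip(predicted, truth))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return precision, recall

    # truth: whether the new label was applied; scores: model confidence
    truth = [True, True, False, True, False]
    scores = [0.9, 0.4, 0.2, 0.7, 0.6]
    print(precision_recall(truth, scores))  # (0.666..., 0.666...)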

