QA
Sometimes you have a model and you want to test its accuracy by comparing its predictions to human annotations. This is a simple process that can be done using our platform.
First, your dataset needs to be ready to import. Each annotation must either be in its own column, or all annotations must be placed in a single column separated by a pipe character ("|"). Here are two examples of the single-column case (see the sketch after the examples):
- For classification projects: happy|proud|joyful
- For projects with entities: if the sentence is "it's 11am in Spain, 5am in Mexico", the annotations should look like this: time:11am|country:Spain|time:5am|country:Mexico
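As an illustration, a dataset with a single annotations column could be assembled like this. This is a minimal sketch in Python; the column names "text" and "annotations" and the output file name are assumptions for the example, not a fixed requirement of the platform.

```python
# Minimal sketch: build an annotated CSV ready for import.
# Column names ("text", "annotations") and the file name are assumptions.
import pandas as pd

rows = [
    # Classification example: multiple labels joined with a pipe.
    {"text": "What a wonderful day!",
     "annotations": "happy|proud|joyful"},
    # Entity example: label:value pairs joined with a pipe.
    {"text": "it's 11am in Spain, 5am in Mexico",
     "annotations": "time:11am|country:Spain|time:5am|country:Mexico"},
]

df = pd.DataFrame(rows)
df.to_csv("annotated_dataset.csv", index=False)
```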
Once your dataset is ready, upload it and select the annotations you want to be shown for the project. For further information about how to upload an annotated dataset, please go to "Datasets" > "Create a Dataset".
Now the dataset is ready for the annotator to revise. They will see the annotations provided by the dataset, but they can still change the label or tag the task as they see fit. If they make no changes, the task is considered completed using the annotations from the dataset.
Finally, you will be able to download those annotations and compare them with the ones produced by the model.
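One way to make that comparison for a classification project is a short script like the one below. This is a sketch under the assumption that both the downloaded annotations and the model predictions are CSV files sharing a "text" column and an "annotations" column; the file and column names are hypothetical and should be adapted to your actual export.

```python
# Minimal sketch: compare reviewed human annotations with model predictions.
# File and column names are assumptions; adapt them to your files.
import pandas as pd

human = pd.read_csv("reviewed_annotations.csv")  # downloaded from the platform
model = pd.read_csv("model_predictions.csv")     # produced by your model

# Align the two files on the text column and count exact label matches.
merged = human.merge(model, on="text", suffixes=("_human", "_model"))
matches = (merged["annotations_human"] == merged["annotations_model"]).sum()
accuracy = matches / len(merged)

print(f"Agreement between model and human annotations: {accuracy:.2%}")
```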