After you build the model, you can evaluate it by using accuracy measures such as precision, recall, and F-score. You can view the model evaluation report in the application, or you can download the report to a local directory. You can also view the test results for each record.
By using in-depth model analysis, you can determine whether the model that you
created produces the results that you expect and reaches your accuracy threshold. By
viewing record-by-record test results, you can fine-tune the training data to make your
model more accurate when you rebuild it.
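For reference, these measures follow the standard definitions: precision is the fraction of predicted entities that are correct, recall is the fraction of actual entities that the model found, and the F-score is their harmonic mean. The following Python sketch is illustrative only and is not part of the application; the function name and the counts are hypothetical.

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Return precision, recall, and F-score computed from raw counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical counts: 80 true positives, 10 false positives, 20 false negatives.
p, r, f = precision_recall_f1(80, 10, 20)
print(f"precision={p:.3f} recall={r:.3f} F-score={f:.3f}")
# precision=0.889 recall=0.800 F-score=0.842
```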
To download the model evaluation report, perform the following actions:
1. In the Model analysis step, after the model finishes building, click Download report.
2. Save the Model Analysis Report archive file to a local directory.
3. Unpack the archive file.
The archive file contains the following .csv files. (A sketch of inspecting these files programmatically follows the list.)
- test_CRF_id_number – Contains all test records. For each test record, you can view the result that you predicted (manual outcome), the result that the model predicted (machine outcome), and whether these results match.
- test_CRF_SCORE_SHEET_id_number – Contains accuracy measures for each entity in the model, for example, the number of true positives, precision, recall, and F-score.
- test_DATA_SHEET_id_number – Contains all testing and training records.
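If you prefer to analyze the report outside the application, you can unpack the archive and read the CSV files with a short script. The following Python sketch is a minimal example under stated assumptions: the archive file name, the glob pattern, and the "manual outcome" and "machine outcome" column headers are hypothetical and may differ in your report.

```python
import csv
import zipfile
from pathlib import Path

# Hypothetical names: the real archive name and CSV headers depend on
# your model's ID number and product version.
archive = Path("ModelAnalysisReport.zip")
extract_dir = Path("model_analysis")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(extract_dir)

# Pick out the per-record results file (test_CRF_ followed by the ID number).
test_file = next(extract_dir.glob("test_CRF_[0-9]*.csv"))

matches = total = 0
with open(test_file, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total += 1
        # The "manual outcome" and "machine outcome" header names are assumptions.
        if row["manual outcome"] == row["machine outcome"]:
            matches += 1

if total:
    print(f"{matches} of {total} test records match ({matches / total:.1%})")
```

The same approach works for the other two files in the archive; only the column headers differ.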
To view the summary results in the Analytics Center:
1. Click the Expand icon next to the model name.
2. On the Category summary tab, view the number of true positives, precision, recall, and F-score results for each entity type. (A sketch that reads the same measures from the downloaded score sheet follows these steps.)
3. On the Test results tab, for each test record, view the result that you predicted (actual), the result that the model predicted (predicted), and whether these results match.
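The Category summary tab shows the same measures that the score sheet in the downloaded archive contains, so you can reproduce the summary offline. The following sketch assumes the archive was unpacked as shown earlier; the column headers ("entity", "true positives", and so on) are assumptions and may differ in your report.

```python
import csv
from pathlib import Path

# Hypothetical directory and column headers; the score sheet is documented
# to hold true positives, precision, recall, and F-score for each entity.
score_sheet = next(Path("model_analysis").glob("test_CRF_SCORE_SHEET_*.csv"))

with open(score_sheet, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(f'{row["entity"]}: TP={row["true positives"]}, '
              f'precision={row["precision"]}, recall={row["recall"]}, '
              f'F-score={row["F-score"]}')
```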