Overview
This article explains how to interpret the Prediction Accuracy gauge and Confusion Matrix on your Prediction Map. These outputs help you evaluate how well the model distinguishes between mineralized and barren areas and decide what to do if results are not reliable.
Prediction Accuracy Gauge
When you first open your results, you’ll see a Prediction Accuracy display in the form of a circular gauge, or odometer, showing:
Accuracy Score – Percentage of correct predictions
Label – Qualitative status based on accuracy range
The labels are applied as follows:
0–75% = Underfitted
76–90% = Optimal
91–100% = Overfitted
An Optimal score means the model is generalizing well based on your entire data set and input settings.
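For illustration only, here is a minimal Python sketch (not part of DORA) showing how an Accuracy Score maps to the label ranges above; the 82% score is a hypothetical example.

def accuracy_label(accuracy_pct):
    # Label ranges taken from the gauge description above
    if accuracy_pct <= 75:
        return "Underfitted"
    elif accuracy_pct <= 90:
        return "Optimal"
    else:
        return "Overfitted"

print(accuracy_label(82))  # hypothetical score, prints "Optimal"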
To explore the results further, click the arrow in the upper right to view detailed model outputs.
What is a Confusion Matrix?
Below the gauge, you’ll find the Confusion Matrix. A Confusion Matrix shows how well the model's predictions match actual outcomes from your entire dataset.
Your Learning Data, which was configured in Step 3: Set Up Learning Data, is the foundation for this output. It helps the model learn to distinguish between mineralized and unmineralized zones, which is reflected in the matrix results.
This matrix contains four key outcomes:
True Positive (TP): The model correctly predicts a mineralized location.
True Negative (TN): The model correctly predicts a barren location.
False Positive (FP): The model incorrectly predicts a mineralized location when it’s actually barren.
False Negative (FN): The model incorrectly predicts a barren location when it’s actually mineralized.
These outcomes are arranged in a grid format, making it easy to see how many of each type the model is producing.
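For example, with hypothetical counts from 100 learning locations (for illustration only, not real results), the grid might look like this, with actual classes as rows and predicted classes as columns:

                     Predicted Mineralized    Predicted Barren
Actual Mineralized   TP = 40                  FN = 10
Actual Barren        FP = 5                   TN = 45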
Precision measures how many of the locations predicted as mineralized actually are.
How Precision is calculated: True Positive / (True Positive + False Positive)
Recall measures how many of the actual mineralized locations the model successfully identified.
How Recall is calculated: True Positive / (True Positive + False Negative)
A high Precision score means the model avoids false positives.
A high Recall score means it finds most of the true mineralized areas.
The F1 Score is a balance of both Precision and Recall, helping you judge overall reliability.
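As an illustration only (not DORA's internal code), the short Python sketch below computes Precision, Recall, and the standard F1 Score (the harmonic mean of Precision and Recall) from the hypothetical counts in the example grid above.

# Hypothetical counts from the example confusion matrix above
tp, fp, fn, tn = 40, 5, 10, 45

precision = tp / (tp + fp)   # 40 / 45 ≈ 0.89
recall = tp / (tp + fn)      # 40 / 50 = 0.80
f1 = 2 * precision * recall / (precision + recall)   # ≈ 0.84 (harmonic mean)

print(f"Precision: {precision:.2f}, Recall: {recall:.2f}, F1: {f1:.2f}")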
How to Interpret
Ideally, True Positives and True Negatives should be high, while False Positives and False Negatives should be as low as possible.
High True Positives confirm that the model identifies mineralized areas correctly.
High True Negatives confirm that barren areas are ruled out reliably.
High False Positives mean barren ground is flagged incorrectly as prospective, leading to wasted effort.
High False Negatives mean potential mineralized ground is missed.
It’s also important to check for overfitting (when a model is too complex and fits the training data so closely that it performs poorly on new data) and underfitting (when a model is too simple and misses key patterns). A balanced model should generalize well across unseen data.
We prioritize maximizing True Negatives in our modeling (over True Positives) to rule out barren areas more accurately, ensuring mineral systems are not overlooked. Following up on a target that turns out to be barren (a False Positive) is a more pragmatic exploration outcome than missing a potential discovery (a False Negative).
What to Do if Results Are Not Optimal
If you see a high rate of False Positives (low Precision score) and/or False Negatives (low Recall score), it means the model is not providing reliable predictions. This is typically reflected in a low F1 Score, which combines both Precision and Recall.
In this case, you may need to adjust your inputs, parameters, or model settings to improve performance.
Try the following adjustments, in this order:
Revise model parameters
In Step 5: Build Predictive Model, adjust the cluster size and minimum points per cluster.
Explore Advanced Settings for more control over algorithm performance.
Review your features
In Step 2: Select Input Features, confirm that only relevant geoscience layers are included. Remove noisy or unrelated inputs.
Revise your target thresholds
In Step 3: Set Up Learning Data, revise target thresholds.
Adjust the AOI resolution
In Step 1: Select AOI, modify the height and width of your Area of Interest to better match geological context and data coverage.
Review Learning Data files
Check your Learning Points shapefile (Step 3: Set Up Learning Data) to ensure that data is accurate and covers the AOI.
Learn More
Interpret DORA’s other Result Graphs:
Create a DORA Prediction Map:
Still Have Questions?
Reach out to your dedicated DORA contact or email support@VRIFY.com for more information.