
Prediction Accuracy (Confusion Matrix)

Learn how to read and interpret the Prediction Accuracy gauge and Confusion Matrix in your DORA Prediction Map.


Overview

This article explains how to interpret the Prediction Accuracy gauge and Confusion Matrix on your Prediction Map. These outputs help you evaluate how well the model distinguishes between mineralized and barren areas, and what to do if results are not reliable.


Prediction Accuracy Gauge

When you first open your results, you’ll see a Prediction Accuracy display in the form of a circular, speedometer-style gauge showing:

  • Accuracy Score – Percentage of correct predictions

  • Label – Qualitative status based on accuracy range

[Image: Prediction Accuracy gauge]

Each accuracy range maps to a qualitative label. An Optimal score means the model is generalizing well based on your entire dataset and input settings.

To explore the results further, click the arrow in the upper right to view detailed model outputs.

[Image: Prediction Accuracy gauge with arrow]


What is a Confusion Matrix?

Below the gauge, you’ll find the Confusion Matrix. A Confusion Matrix shows how well the model's predictions match actual outcomes from your entire dataset.

Your Learning Data, which was configured in Step 3: Set Up Learning Data, is the foundation for this output. It helps the model learn to distinguish between mineralized and unmineralized zones, which is reflected in the matrix results.

This matrix contains four key outcomes:

  • True Positive (TP): The model correctly predicts a mineralized location.

  • True Negative (TN): The model correctly predicts a barren location.

  • False Positive (FP): The model incorrectly predicts a mineralized location when it’s actually barren.

  • False Negative (FN): The model incorrectly predicts a barren location when it’s actually mineralized.

These outcomes are arranged in a grid format, making it easy to see how many of each type the model is producing.
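As a minimal sketch (not DORA's implementation), the four outcomes can be tallied by comparing actual and predicted labels. The labels below are made up for illustration, with 1 = mineralized and 0 = barren:

```python
# Hypothetical labels for eight locations (1 = mineralized, 0 = barren).
actual    = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 0, 1, 0, 1, 0, 1, 0]

# Count each confusion-matrix outcome by comparing the two lists pairwise.
tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)  # correctly mineralized
tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)  # correctly barren
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # barren flagged as mineralized
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # mineralized missed

print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")  # prints: TP=3  TN=3  FP=1  FN=1
```

Arranged as a 2×2 grid, the rows are the actual classes and the columns the predicted classes, so each cell holds one of these four counts.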

Precision measures how many of the locations predicted as mineralized actually are mineralized.

How Precision is calculated: True Positive / (True Positive + False Positive)

Recall measures how many of the actual mineralized locations the model successfully identified.

How Recall is calculated: True Positive / (True Positive + False Negative)

  • A high Precision score means the model avoids false positives.

  • A high Recall score means it finds most of the true mineralized areas.

  • The F1 Score combines Precision and Recall (their harmonic mean), helping you judge overall reliability.
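The formulas above can be worked through with hypothetical counts (these numbers are illustrative only, not from any real model run):

```python
# Hypothetical confusion-matrix counts.
tp, tn, fp, fn = 3, 3, 1, 1

precision = tp / (tp + fp)                           # 3 / 4 = 0.75
recall    = tp / (tp + fn)                           # 3 / 4 = 0.75
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
accuracy  = (tp + tn) / (tp + tn + fp + fn)          # gauge-style overall score

print(f"Precision={precision:.2f}  Recall={recall:.2f}  "
      f"F1={f1:.2f}  Accuracy={accuracy:.2f}")
# prints: Precision=0.75  Recall=0.75  F1=0.75  Accuracy=0.75
```

Because F1 is a harmonic mean, it drops sharply when either Precision or Recall is low, which is why it is a useful single indicator of overall reliability.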


How to Interpret

Ideally, True Positives and True Negatives should be high, while False Positives and False Negatives should be as low as possible.

  • High True Positives confirm that the model identifies mineralized areas correctly.

  • High True Negatives confirm that barren areas are ruled out reliably.

  • High False Positives mean barren ground is flagged incorrectly as prospective, leading to wasted effort.

  • High False Negatives mean potential mineralized ground is missed.

It’s also important to check for overfitting (when a model is too complex and performs poorly on new data) and underfitting (when a model is too simple and misses key patterns). A balanced model should generalize well across unseen data.

We prioritize maximizing True Negatives in our modeling (over True Positives) to rule out barren areas more accurately, ensuring mineral systems are not overlooked. Following up on targets that yield poor results is a more pragmatic exploration approach than missing out on potential discoveries.


What to Do if Results Are Not Optimal

If you see a high rate of False Positives (low Precision score) and/or False Negatives (low Recall score), it means the model is not providing reliable predictions. This is typically reflected in a low F1 Score, which combines both Precision and Recall.

In this case, you may need to adjust your inputs, parameters, or model settings to improve performance.

Try the following adjustments, in this order:

  1. Revise model parameters

  2. Review your features

  3. Revise your target thresholds

  4. Adjust the AOI resolution

    • In Step 1: Select AOI, modify the height and width of your Area of Interest to better match geological context and data coverage.

  5. Review Learning Data files




Still Have Questions?

Reach out to your dedicated DORA contact or email support@VRIFY.com for more information.
