Overview
In this article, you’ll learn how to read and interpret the Feature Importance graphs on a DORA Prediction Map. These graphs help you understand which features (such as geology, geophysics, or geochemistry) influenced the model’s predictions, and what to do if the model appears to be focusing on the wrong inputs.
What is Feature Importance?
Feature Importance graphs (built from SHAP values) explain how input features impact the AI model’s predictions. They show how much each feature contributed to the model’s decision-making.
In DORA, these features might include geological structures, geochemical element concentrations, geophysical anomalies, and more.
Each feature is assigned a Feature Importance percentage, indicating how strongly it pushed each pixel toward a higher or lower chance of being mineralized.
The Feature Importance graphs help you:
See which features the model relied on most.
See how low and high values of each feature change the chances of mineralization.
Validate whether the model’s focus aligns with your exploration understanding.
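To make the idea concrete, here is a minimal sketch of how SHAP-style feature importances can be computed for a prospectivity-type classifier. It uses the open-source shap Python library with synthetic data and hypothetical feature names (mag_anomaly, au_ppm, fault_distance); it illustrates the general technique, not DORA’s internal implementation.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for an AOI feature table: one row per pixel,
# one column per input feature (names are hypothetical).
rng = np.random.default_rng(0)
feature_names = ["mag_anomaly", "au_ppm", "fault_distance"]
X = rng.normal(size=(500, 3))
# Hypothetical labels: 1 = mineralized learning point, 0 = unmineralized.
y = (X[:, 1] + 0.5 * X[:, 0] - X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# One SHAP value per feature per pixel: positive values push the pixel
# toward "mineralized", negative values push it away.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Rank features by mean absolute SHAP value -- the ordering used on the
# Y-axis of a Feature Importance graph.
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(feature_names, importance), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```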
How to Interpret
The Feature Importance graphs help you understand which features influenced DORA’s predictions, and how.
Here’s how to read them:
Feature Ranking (Y-axis): Features are listed from most important (top) to least important (bottom), based on how much they influenced the model’s predictions.
SHAP Value (X-axis): Shows how much a feature pushed the prediction away from a neutral 50% guess.
Right = pushed the prediction toward mineralization
Left = pushed it away from mineralization
Color of Data Points: Each data point represents one pixel in your AOI; its color reflects that pixel’s raw value for the feature.
Dark Pink/Red = high input value
Blue = low input value
The height of the histogram (or the thickness of the beeswarm) represents data density: how many pixels share similar SHAP values.
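Continuing from the sketch in the previous section (reusing shap_values, X, and feature_names), the beeswarm view itself can be drawn with shap.summary_plot, which lays out the same elements described above: feature ranking on the Y-axis, SHAP value on the X-axis, and dot color from the raw feature value.

```python
import matplotlib.pyplot as plt
import shap

# Beeswarm summary plot: one dot per pixel per feature.
# X position = SHAP value (push toward or away from mineralization),
# color      = the pixel's raw feature value (red = high, blue = low).
shap.summary_plot(shap_values, X, feature_names=feature_names, show=False)
plt.tight_layout()
plt.show()
```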
For example:
If dark pink or red points for a feature are clustered on the right side of the SHAP plot, it means high values of that feature increased the probability of mineralization.
If they’re clustered on the left, it means high values decreased the probability.
In contrast, if blue points (representing low values) are clustered on the right, then low values of that feature increased the probability of mineralization, suggesting an inverse relationship.
If blue points are concentrated on the left, low values decreased the probability.
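One quick way to summarize these color patterns numerically (again reusing the arrays from the earlier sketch) is the sign of the correlation between each feature’s raw values and its SHAP values: positive means high values push toward mineralization, negative suggests an inverse relationship. This is an illustrative check, not a DORA output.

```python
import numpy as np

# Positive correlation: red (high-value) dots sit on the right of the plot.
# Negative correlation: low values increase the prediction (inverse relationship).
for j, name in enumerate(feature_names):
    r = np.corrcoef(X[:, j], shap_values[:, j])[0, 1]
    direction = "increase" if r > 0 else "decrease"
    print(f"{name}: high values tend to {direction} predicted prospectivity (r = {r:+.2f})")
```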
DORA's baseline prediction for a pixel is 50%, the equivalent of a coin toss, or a neutral guess. The Feature Importance graphs show how features push that prediction higher or lower:
A SHAP value of +0.3 might move a pixel’s probability from 50% → 80%.
A SHAP value of –0.2 might lower it to 30%.
The more consistently a feature pushes predictions away from 50%, the more important it is considered. If all data points for a feature are tightly clustered near zero, that feature didn’t have much influence on the model.
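The worked numbers above describe contributions directly in probability terms. Generic SHAP explainers typically work in log-odds and convert back to a probability at the end, but the additive idea is the same: the baseline plus a pixel’s per-feature pushes reproduces that pixel’s final prediction. A sketch, reusing the explainer from the earlier example:

```python
import numpy as np

# Additivity check for one pixel: baseline (expected value) plus the sum of
# its SHAP values equals the model's raw log-odds output; a sigmoid maps
# that back to the 0-100% probability shown on the prediction map.
i = 0  # first pixel
base = float(np.ravel(explainer.expected_value)[0])
raw = base + shap_values[i].sum()
prob = 1.0 / (1.0 + np.exp(-raw))  # sigmoid: log-odds -> probability

print(f"reconstructed probability: {prob:.2f}")
print(f"model probability:         {model.predict_proba(X[i:i+1])[0, 1]:.2f}")
```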
What To Do If Results Aren’t Optimal
If the graph is showing unexpected results, your model may need adjustment. Some warning signs include:
A feature you expected to be important (like a key geophysical anomaly) shows low importance
A feature that seems unrelated to your deposit model ranks high
This may mean the model is relying on noisy or irrelevant data, or your input features may not be well aligned with your exploration goals.
⚠️ Avoid Confirmation Bias: Unexpected results may signal the need to adjust model parameters — or they may reveal surprising insights. Think about this critically and reach out to the VRIFY team to help validate.
Try these steps:
Revisit your selected features.
Go back to Step 2: Select Input Features and remove any features that are not geologically relevant or add features you think are missing.
Review your training data.
Ensure your learning points from Step 3: Set Up Learning Data accurately represent what you consider mineralized or unmineralized.
Check for data quality issues.
Inaccurate, inconsistent, or sparse data can skew the model’s understanding of what matters. Make sure your key input layers are complete and aligned with the AOI.
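If you want a quick sanity check before re-running the model, something like the following can flag sparse or constant layers in an exported feature table. The file name aoi_features.csv and the export step are hypothetical; the checks themselves are generic pandas.

```python
import pandas as pd

# Hypothetical export of the AOI feature table: one row per pixel,
# one column per input layer.
features = pd.read_csv("aoi_features.csv")

coverage = features.notna().mean()   # fraction of pixels with data, per layer
distinct = features.nunique()        # number of distinct values, per layer

for col in features.columns:
    if coverage[col] < 0.95:
        print(f"{col}: only {coverage[col]:.0%} of pixels have data")
    if distinct[col] <= 1:
        print(f"{col}: constant or empty layer -- consider removing it")
```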
Learn More
Interpret DORA’s other Result Graphs
Create a DORA Prediction Map
Still Have Questions?
Reach out to your dedicated DORA contact or email support@VRIFY.com for more information.

