How is Data Quality Quantified and Monitored?

Data Badness is the quantification of input data quality

Written by Mark de Geus

Each digital process model requires good quality data to run, and the models check this data quality while performing their calculations and predictions. Every input variable to a model is continuously checked against its historical performance and the model's own tolerances. These checks are tracked and stored by brains.VOS.

Data quality can be quantified by a numerical metric, which we call "badness" (expressed as a percentage), or it may simply be reported as a status: "OK", "Warning" or "Error". When a badness metric is calculated, corresponding status metrics are also produced, with thresholds determining which badness values map to which status. Poor data quality can prevent a model from performing its calculations.
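To illustrate how a badness percentage maps to a status, here is a minimal sketch in Python. The threshold values (10% and 30%) and the function name are hypothetical examples for illustration only; actual thresholds are configured per variable in brains.VOS.

```python
# Hypothetical thresholds; real values are configured per variable.
WARNING_THRESHOLD = 10.0  # badness % at or above which status becomes "Warning"
ERROR_THRESHOLD = 30.0    # badness % at or above which status becomes "Error"


def badness_to_status(badness_pct: float) -> str:
    """Map a badness percentage to its corresponding status."""
    if badness_pct >= ERROR_THRESHOLD:
        return "Error"
    if badness_pct >= WARNING_THRESHOLD:
        return "Warning"
    return "OK"


print(badness_to_status(2.5))   # OK
print(badness_to_status(15.0))  # Warning
print(badness_to_status(45.0))  # Error
```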

Below is an example where the bed pressure (blue line) initially drifted far outside the normal operating range (red and green lines), triggering a data quality alert. As the bed pressure returned to the normal operating range, the torque flatlined and then spiked, indicating abnormal operation.
Note: The brains.VOS output for the predicted state of these monitored variables (yellow line) stopped during the period of high data badness.
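As a rough illustration of why predicted output stops under high data badness, the sketch below gates a prediction on the per-variable statuses. The function and argument names are placeholders, not the brains.VOS implementation.

```python
from typing import Optional


def predict_if_data_ok(model, inputs: dict, statuses: dict) -> Optional[dict]:
    """Run the model only when no input variable is in the "Error" state.

    `model`, `inputs`, and `statuses` are placeholder objects for this sketch.
    """
    if any(status == "Error" for status in statuses.values()):
        # Skip this prediction cycle; downstream consumers receive no new
        # output, analogous to the predicted-state trace stopping above.
        return None
    return model.predict(inputs)
```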
