The boundary between Quantum and Newtonian data quality is a grey area, and it becomes clear only when you prioritize business value. Just as you choose one physics model over the other to address a real-world problem, you must choose the right level of control over the usage and quality of your data. Analyzing the weight of every grain of sand in detail would cost you far too much time and money when building a house. The grains only have to fall within a certain weight tolerance, and rough estimates are "good enough" - so simple, cost-effective filters will do the job just fine.
Another example, closer to data management than the weight of sand, is good-enough parenting. If you protect your child from every bad decision (such as insisting on wearing slippers to go and play in the rain), they may end up overly dependent on your judgment and never learn why that choice of footwear is not such a great idea.
So what do parents do? They carefully choose when to get involved and when to step back and let the child learn from their own mistakes. The goal is to raise an effective, independent thinker while still maintaining reasonable control.
But there is another side to it: resource costs (as in the sand example). A parent typically has a million things to do, and being pedantic is not productive in every situation. You need to use your time and money effectively, and so you must accept some risk and some impurities for the sake of progress.
Therefore... to maximize the value of your data, your oversight and management of it must be good enough - fit for purpose. With too little management, you will endure higher risks and remediation costs. With too much control, you will slow the business down, incur unnecessary costs and lose your credibility.
Each organization requires a different data management operating model for much the same reason that each child requires their own parenting style.