Photo: © Andis Rea
The American Statistical Association (ASA) has come forward with an act of public humility that few members of the public will pay attention to. It has now admitted that over-reliance on statistics may be dangerous for our health. In an interview titled “Time to say goodbye to ‘statistically significant’ and embrace uncertainty, say statisticians,” we learn, for example, that “relying on statistical significance alone often results in weak science” and that, contrary to the illusion many have maintained about statistical evidence, “pure objectivity can never be achieved.”
In a world that is preparing to integrate artificial intelligence (AI) into every level of institutional decision-making, this could indicate a methodological breakthrough with far-reaching effects. AI both uses and produces statistics to make the decisions we so willingly delegate to it. By acknowledging that uncertainty is more certain than supposed statistical truth, we may begin to reclaim our own decision-making responsibilities, based on factors other than numbers alone.
Here is today’s 3D definition:
Statistically significant:
Indicating that a certain representation of quantitative data may serve to justify ideas or initiatives that we fail to understand or, in some cases, refuse to understand...
But the issue strikes even deeper into our civilizational values, as statistician Nicole Lazar, the article’s interviewee, acknowledges in a remark of vast cultural significance: “Categorization and categorical thinking are the fundamental problems, not the p-value in and of itself.” When we apply mathematical reasoning to human problems, our dependence on both language and the pragmatics of human activity forces us to call things and ideas by names we invent and to relate them to each other by grouping them in categories, or what psychologists call “cognitive boundaries.” This has never been truer than in the digital civilization that we now depend on.
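Lazar’s point about categorical thinking can be illustrated with a minimal sketch (the studies and numbers below are hypothetical, invented for illustration, not drawn from the article): two experiments with nearly identical evidence land on opposite sides of the conventional 0.05 cutoff, so the categorical label flips even though the underlying data barely differ.

```python
import math

def p_value_from_z(z):
    """Two-sided p-value for a z statistic, via the normal survival function."""
    return math.erfc(abs(z) / math.sqrt(2))

# Two hypothetical studies with nearly identical test statistics.
for label, z in [("Study A", 1.97), ("Study B", 1.95)]:
    p = p_value_from_z(z)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{label}: z = {z:.2f}, p = {p:.4f} -> {verdict}")
```

Study A (p ≈ 0.049) is declared “significant” and Study B (p ≈ 0.051) is not, though the evidence in each is practically the same: the problem lies in the binary category, not in the p-value itself.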
Read more...
Source: Fair Observer