# Evaluation of Reported Statistical Inferences

## Kale and Protein Powder Smoothies!

You may find yourself being offered a kale or perhaps a protein powder smoothie with a spiel, based on some study supporting its use, about some claimed benefit – and you are beginning to find that your usual way out, 'Honey, this is so good I can do with just a little', isn't working! Well, here is a set of tabs with calculators that may help! The influence of statistical inferences, we know, extends beyond such prosaic examples. The New York Times featured an article in August 2018 debunking some of the reported studies supporting testing for vitamin D deficiencies and the recommendation of large supplemental doses of vitamin D. Some of these vitamin D claims, among others, were reported as not holding up on replication in controlled trials (see Young, Karr, and Deming, Significance, 2011).

Someone or other is always pointing to a published study to justify a point of view or the need for a change in what we do or how we live. There are many such studies, often reported in top-notch journals, with results inconsistent across studies and often inconsistent within them. It is in the interest of increasing the credibility of science, and of safeguarding a general public living with its overt and covert influence, to filter good science from bad. Some inferences are good, even when counter-intuitive or seemingly inconsistent, and are likely to withstand scrutiny. Others may represent marginal effects in the aggregate, not entirely useful for individual choices or decisions, and are often non-reproducible.

In this tab we have a number of calculators to evaluate reported statistical inferences. The first calculator provides an inverted simulation for continuous data, survival data, and discrete response data. It is pedagogical: it illustrates why we, as individuals, need to be wary of reported signals detected in studies using stochastic data, even when those aggregate signals are of large magnitude. The calculator is based on simulations in three limited contexts, one for each type of data. A longer commentary here uses this calculator in a discussion of ecological fallacies in aggregate statistics from cohort studies. The other calculators allow you to enter data from the study you are evaluating to determine the validity of its reported analyses.
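The ecological-fallacy point can be made with a few lines of simulation. The sketch below is not the calculator's actual method; it is a minimal illustration, under assumed parameters (a hypothetical 0.3-standard-deviation treatment effect on a normally distributed continuous endpoint), of how a clear aggregate signal can coexist with a large fraction of individuals moving against it:

```python
import random
import statistics

random.seed(42)

# Hypothetical continuous endpoint: treatment shifts the mean by 0.3 units,
# which is modest relative to the individual standard deviation of 1.0.
n = 10_000
control = [random.gauss(0.0, 1.0) for _ in range(n)]
treated = [random.gauss(0.3, 1.0) for _ in range(n)]

control_mean = statistics.mean(control)
mean_diff = statistics.mean(treated) - control_mean

# The aggregate signal says "treatment helps", yet many treated
# individuals still fall below the control-group mean.
below_control_mean = sum(x < control_mean for x in treated) / n

print(f"aggregate mean difference: {mean_diff:.2f}")
print(f"treated subjects below the control mean: {below_control_mean:.0%}")
```

Roughly a third of treated subjects end up below the control average even though the group-level comparison favors treatment, which is why an aggregate inference need not translate into a reliable individual choice.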

The second calculator does a post-hoc statistical power and effect calculation. It reports the proportion of subjects in the group or intervention deemed inferior who show an effect in the direction opposing the aggregate inferiority assessment, for studies involving continuous, survival, and binomial endpoints. A fourth spreadsheet in this calculator evaluates reported correlation coefficients. The third calculator evaluates likely publication biases affecting reported data. The fourth calculator computes the expected p-value in prospective, designed studies. The fifth calculator assesses dichotomous responses based on improvements crossing thresholds. The details of the calculators have been published in this hyperlinked article – Srinivasan, S., 2018. "Evaluation of Reported Statistical Inferences." Journal of Mathematics and System Science 8 (5): 140-52.
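To give a feel for the "proportion opposing the aggregate" quantity, here is a back-of-the-envelope version under a simplifying normality assumption. This is not the published calculator's exact computation: it simply evaluates P(individual difference has the opposite sign) = Φ(−|δ|/σ) for individual treatment differences assumed normal with mean δ and standard deviation σ (both hypothetical inputs):

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def proportion_opposing(mean_diff: float, sd_diff: float) -> float:
    """Proportion of individual differences whose sign opposes the
    aggregate mean difference, assuming the differences are normally
    distributed (an illustrative assumption, not the paper's method)."""
    return normal_cdf(-abs(mean_diff) / sd_diff)

# Hypothetical study: aggregate difference of 0.3 units, SD of 1.0.
p = proportion_opposing(0.3, 1.0)
print(f"{p:.1%} of subjects move against the aggregate direction")
```

Even a statistically convincing aggregate difference of 0.3 standard deviations leaves well over a third of individual differences pointing the other way, which is the kind of post-hoc check this calculator formalizes for continuous, survival, and binomial endpoints.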

My name in Chinese in the panel to the left is courtesy of a participant of Chinese origin at a 4H fair in Somerset, NJ. The panel to the right is a representation of the hexagrams from the Chinese book of changes, the I Ching. The Wiki page from which this panel is excerpted has an interesting discussion of the I Ching and the 17th-century philosopher and mathematician Leibniz!