An article (https://thinkprogress.org/san-bernardino-school-shooting-women-murdered-work-408a6572abf3) claims that women are more likely than men to be murdered at work. Here is their claim:
‘In 2014, 19 percent of women who died at work were murdered, just a percentage point behind roadway incidents. By contrast, 8 percent of male fatalities were homicides. While more men die at work— in 2014, 4,454 versus 367 women — women are much more likely to be killed. It was the leading cause of death for women at work in 2012, accounting for more than a quarter of all workplace fatalities, up sharply from 8 percent in 2011.’
Now, I do not question their percentages, but there is a problem here, and it illustrates why understanding statistics is important both when making claims and when evaluating the claims of others. Most people will simply read the line ‘women are much more likely to be killed’ as the basic conclusion (the rest of the article underlines that this is the thesis it is promoting). Yet it is not true.
First, from World Bank data (http://data.worldbank.org/indicator/SL.TLF.TOTL.FE.ZS), we see that roughly 45.8% of the US workforce was female in 2014.
Second, 19% of the 367 women who died at work in 2014 were murdered, or about 70 women, whereas 8% of the 4,454 men were murdered, or about 356 men. In other words, roughly 5.09 times as many men as women were murdered at work in 2014. Therefore, unless there were at least 5.09 times as many male workers as female workers, men had the higher chance of being murdered at work! And in fact, given the 45.8% figure above, there were only about 1.18 times as many men in the workforce in 2014 as there were women (54.2% / 45.8% ≈ 1.18).
This means men were roughly 4.3 times as likely as women to be murdered while at work in 2014 (5.09 / 1.18 ≈ 4.3)!
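The arithmetic above is easy to check. Here is a short sketch that reproduces it from the figures quoted in the article (the 2014 fatality counts, the homicide percentages, and the World Bank workforce share); all inputs are taken as given, not independently verified.

```python
# Figures as quoted in the article (2014, US).
male_deaths = 4454
female_deaths = 367
male_homicide_share = 0.08     # 8% of male workplace deaths were homicides
female_homicide_share = 0.19   # 19% of female workplace deaths were homicides
female_workforce_share = 0.458 # World Bank: ~45.8% of the workforce was female

# Absolute numbers of workplace murders (rounded as in the article).
male_murders = round(male_deaths * male_homicide_share)       # 356
female_murders = round(female_deaths * female_homicide_share) # 70

# How many male workers per female worker the workforce share implies.
worker_ratio = (1 - female_workforce_share) / female_workforce_share

# Relative per-worker murder risk, men vs. women.
murder_ratio = male_murders / female_murders
relative_risk = murder_ratio / worker_ratio

print(round(murder_ratio, 2))   # ~5.09 times as many male victims
print(round(worker_ratio, 2))   # ~1.18 times as many male workers
print(round(relative_risk, 1))  # ~4.3 times the per-worker risk for men
```

The key point the script makes explicit: comparing raw victim counts (356 vs. 70) only tells you about relative risk once you divide by the number of people exposed in each group.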
Unfortunately, claims like these will be accepted by a large body of readers without question, and then cited as legitimate analysis.
There are many ways in which statistics can be used to prop up incorrect or misleading hypotheses. Often this is done by carefully selecting which variables to use, or how to operationally define them, so that a correct statistical analysis has no particular applicability to the question at hand. A good example is economic commentary, where economists love to talk about things like the stock market’s value. That value has little to no impact on the actual life of the average citizen; it mostly affects the business elite. It is not exactly a statistical example, but it shows how the financial health of the nation can be operationally defined in a way that makes it meaningless in terms of the benefits seen by the average citizen.
In this article, however, matters are worse, because the authors have made a claim that is simply not backed up by the statistics they cite.
One should, therefore, always be careful about accepting these kinds of claims. If you do not follow the argument, consider asking someone with more maths experience to check it out.