There are plenty of amazingly valuable things which statistics and probability bring to our lives, from advances in health (drug testing, policy choices, treatment allocation, ...) to modelling (weather & climate, molecular dynamics, ...) to large-scale data processing (physics experiments, online marketing, ...) to...

What I'm trying to say is that the world is crammed full of fabulous and crucial uses of the mathematics of statistics and probability. But it's also full of people who don't know how to use that mathematics properly. Many of them journalists.

This is a guide for anyone out there who wants to really understand what's going on. (Alternatively, I might recommend this page to journalists as a good last-minute checklist before publishing a story with a % symbol in it somewhere.)

The First Problem: Saying What You Mean

...

Simpson's Paradox

Wiki...
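Just to give a flavour of the reversal involved, here's a quick Python sketch using the much-quoted kidney-stone treatment figures (the exact numbers matter less than the pattern): treatment A does better than B in both the mild and the severe group, yet worse once the two groups are pooled.

    # Treatment A beats B within each severity group...
    groups = {
        "mild":   {"A": (81, 87),   "B": (234, 270)},   # (successes, patients)
        "severe": {"A": (192, 263), "B": (55, 80)},
    }

    for name, g in groups.items():
        for t in ("A", "B"):
            successes, patients = g[t]
            print(name, t, round(successes / patients, 2))

    # ...but loses once the groups are pooled, because B was mostly given
    # the easier cases.
    for t in ("A", "B"):
        successes = sum(groups[g][t][0] for g in groups)
        patients  = sum(groups[g][t][1] for g in groups)
        print("pooled", t, round(successes / patients, 2))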

Ecological Fallacy

Wiki...
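A toy simulation (entirely synthetic data) shows how badly this can go wrong: in each group below, the individual-level relationship between x and y is strongly negative, but the group averages line up in the opposite direction, so a correlation computed from aggregate figures gets even the sign wrong.

    import numpy as np

    rng = np.random.default_rng(0)
    group_means, within_r = [], []
    for centre in (0, 5, 10):                    # three groups with rising means
        x = centre + rng.normal(size=200)
        y = centre - 0.8 * (x - centre) + 0.5 * rng.normal(size=200)
        within_r.append(round(np.corrcoef(x, y)[0, 1], 2))
        group_means.append((x.mean(), y.mean()))

    print("within-group r:", within_r)            # all around -0.85
    gx, gy = zip(*group_means)
    print("between-group (ecological) r:", round(np.corrcoef(gx, gy)[0, 1], 2))  # ~ +1.0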

False Positives and the Base Rate Fallacy

Wiki, Wiki...
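The standard toy example (numbers invented for illustration): a test that is "99% accurate" for a condition affecting 1 person in 1,000 still produces mostly false positives, simply because almost everyone tested doesn't have the condition.

    prevalence  = 0.001     # 1 in 1,000 people actually have the condition
    sensitivity = 0.99      # P(test positive | condition)
    specificity = 0.99      # P(test negative | no condition)

    # Bayes' theorem: P(condition | positive test)
    p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    print(round(sensitivity * prevalence / p_positive, 2))   # ~0.09, about 9%

So only around one positive result in eleven actually corresponds to the condition, despite the "99% accurate" headline.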

Sampling Error

Wiki, your friends have more friends on average than you...
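Even a perfectly conducted poll has an irreducible wobble just from who happens to end up in the sample. A little simulation (with a made-up "true" support of 52%) shows how much 1,000-person polls of the same population disagree with each other:

    import numpy as np

    rng = np.random.default_rng(1)
    true_support = 0.52
    polls = rng.binomial(1000, true_support, size=10_000) / 1000   # 10,000 repeat polls
    print("standard deviation of the estimates:", round(polls.std(), 3))   # ~0.016
    print("95% of polls land between:", np.round(np.percentile(polls, [2.5, 97.5]), 3))

A swing of two or three points between successive polls is therefore entirely consistent with nothing having changed at all.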

Independence of Samples and Clustering

Bad Science...
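To see why treating clustered observations as independent is so dangerous, here's a toy simulation (no real data): two arms with no true difference at all, but measurements come in clusters, say patients within clinics, that share a clinic-level effect. A naive t-test that pretends every measurement is independent declares a "significant" difference far more often than the advertised 5% of the time.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    clusters_per_arm, per_cluster, n_sims = 5, 20, 2000

    def simulate_arm():
        cluster_effects = rng.normal(0, 1, clusters_per_arm)          # shared within a cluster
        noise = rng.normal(0, 1, (clusters_per_arm, per_cluster))     # individual variation
        return (cluster_effects[:, None] + noise).ravel()

    false_positives = sum(stats.ttest_ind(simulate_arm(), simulate_arm()).pvalue < 0.05
                          for _ in range(n_sims))
    print(false_positives / n_sims)   # far above 0.05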

p-Values

...
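The single most important thing to hold onto: a p-value is the probability of seeing data at least this extreme if the null hypothesis were true, not the probability that the null hypothesis is true. A minimal sketch, using a made-up coin-flipping result and scipy's exact binomial test:

    from scipy import stats

    heads = 61   # suppose 61 heads turned up in 100 flips of a suspect coin
    p = stats.binomtest(heads, n=100, p=0.5).pvalue
    print(round(p, 3))   # ~0.035: a fair coin throws up something this lopsided
                         # only about 3.5% of the time, which is NOT the same as
                         # "there is a 3.5% chance that the coin is fair"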

Statistical Significance, Small Trials and Meta-Analyses

Bad Science...
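Small trials are individually underpowered, which is exactly why meta-analysis matters. A sketch with invented numbers: five little trials, each estimating much the same effect, each one individually "not significant", combined here by a simple fixed-effect (inverse-variance) pooling:

    import math
    from scipy import stats

    effects = [0.30, 0.25, 0.35, 0.28, 0.32]   # hypothetical effect estimates
    errors  = [0.18, 0.19, 0.20, 0.18, 0.17]   # their standard errors

    def two_sided_p(z):
        return 2 * stats.norm.sf(abs(z))

    print([round(two_sided_p(d / se), 3) for d, se in zip(effects, errors)])  # all > 0.05

    weights   = [1 / se ** 2 for se in errors]
    pooled    = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    print(round(two_sided_p(pooled / pooled_se), 5))   # ~0.0002, convincingly small

"Not significant" five times over is not at all the same thing as "no effect".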

Many-Tailed Tests

Bad Science...
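The one- versus two-tailed distinction sounds like a technicality, but it halves the p-value, which is very convenient if your result is hovering just the wrong side of 0.05. The same (made-up) test statistic, reported both ways:

    from scipy import stats

    z = 1.8   # hypothetical test statistic
    print("two-tailed p:", round(2 * stats.norm.sf(z), 3))   # ~0.072, "not significant"
    print("one-tailed p:", round(stats.norm.sf(z), 3))       # ~0.036, "significant!"

A one-tailed test is only honest if the direction of the effect was genuinely fixed before the data were seen.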

Error Bars and Confidence Intervals

...
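"95% confidence" is a statement about the procedure, not about any particular interval. A quick simulation from a known (made-up) population: build the usual mean ± 1.96 standard errors interval over and over, and count how often it actually contains the true mean.

    import numpy as np

    rng = np.random.default_rng(3)
    true_mean, n, covered = 10.0, 30, 0
    for _ in range(10_000):
        sample = rng.normal(true_mean, 2.0, n)
        se = sample.std(ddof=1) / np.sqrt(n)
        covered += sample.mean() - 1.96 * se <= true_mean <= sample.mean() + 1.96 * se
    print(covered / 10_000)   # close to 0.95 (a touch under, since n = 30 is small
                              # and we used 1.96 rather than the t-distribution value)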

Averages and Quartiles

...
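"Average" hides a multitude of sins, especially for skewed quantities like incomes. With a handful of invented salaries, one big earner drags the mean far above what the typical person takes home, while the median and quartiles barely notice:

    import numpy as np

    salaries = [21, 23, 24, 26, 27, 29, 31, 34, 38, 250]    # in £1000s, made up
    print("mean:     ", np.mean(salaries))                  # 50.3
    print("median:   ", np.median(salaries))                # 28.0
    print("quartiles:", np.percentile(salaries, [25, 75]))  # [24.5, 33.25]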

Selective Reporting and Multiple Comparisons

xkcd, Wiki
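This is the green-jelly-beans problem from the xkcd strip: run enough comparisons and something will come up "significant" purely by chance. Testing 20 hypotheses at the 5% level when nothing whatsoever is going on gives at least one hit with probability 1 - 0.95^20 ≈ 0.64, and a quick simulation agrees:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    runs_with_a_false_hit = 0
    for _ in range(2_000):
        # 20 comparisons, both groups always drawn from the SAME distribution
        pvals = [stats.ttest_ind(rng.normal(size=30), rng.normal(size=30)).pvalue
                 for _ in range(20)]
        runs_with_a_false_hit += min(pvals) < 0.05
    print(runs_with_a_false_hit / 2_000)   # ~0.64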

Hypotheses Suggested by the Data

Wiki, ...
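Closely related: if the data themselves suggested the hypothesis, you can't then "confirm" it on the same data. A toy simulation: scan ten subgroups of pure noise, pick the most striking-looking one, and test it, first on the same data and then on a fresh sample.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    hits_same, hits_fresh = 0, 0
    for _ in range(2_000):
        data = rng.normal(size=(10, 40))                    # ten subgroups of pure noise
        best = int(np.argmax(np.abs(data.mean(axis=1))))    # the most extreme-looking one
        hits_same  += stats.ttest_1samp(data[best], 0).pvalue < 0.05
        hits_fresh += stats.ttest_1samp(rng.normal(size=40), 0).pvalue < 0.05
    print(hits_same / 2_000, hits_fresh / 2_000)   # far above 0.05 versus ~0.05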

Confirmation Bias

...

Hoyle's Fallacy

Wiki...

Berkson's Paradox

Wiki...
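A toy simulation of the selection effect at the heart of it: two qualities that are completely independent in the population as a whole become negatively correlated the moment you restrict attention to a selected group, here anyone whose combined score clears some bar.

    import numpy as np

    rng = np.random.default_rng(6)
    talent = rng.normal(size=100_000)
    charm  = rng.normal(size=100_000)
    print("everyone: r =", round(np.corrcoef(talent, charm)[0, 1], 2))   # ~0.0

    selected = talent + charm > 1.5            # only the "successful" make the cut
    print("selected: r =", round(np.corrcoef(talent[selected], charm[selected])[0, 1], 2))
    # clearly negative, even though the two traits have nothing to do with each other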

Gambler's Fallacy

Wiki...
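A fair coin has no memory: after five heads in a row, the next flip is still 50:50, however strongly it feels like a tail is "due". Easy to check by brute force:

    import random

    random.seed(7)
    flips = [random.randint(0, 1) for _ in range(200_000)]       # 1 = heads, 0 = tails
    after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                    if all(flips[i:i + 5])]                      # flips following 5 heads in a row
    print(round(sum(after_streak) / len(after_streak), 3))       # ~0.5, not less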

Graphs

...

Monty-Hall Style Problems

...
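The original puzzle is easiest to settle by simulation: the host, who knows where the car is, always opens a goat door you didn't pick, and you then either stick or switch. Sticking wins a third of the time; switching wins two-thirds.

    import random

    random.seed(8)
    trials, stick_wins, switch_wins = 100_000, 0, 0
    for _ in range(trials):
        car, pick = random.randrange(3), random.randrange(3)
        opened = random.choice([d for d in range(3) if d not in (pick, car)])
        switch = next(d for d in range(3) if d not in (pick, opened))
        stick_wins  += pick == car
        switch_wins += switch == car
    print(stick_wins / trials, switch_wins / trials)   # ~0.333 vs ~0.667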

N mistakes you really shouldn't make with statistics

Or, a checklist before you publish that article
