“Conwood” - good name considering.
Oddly, I feel like most people should be treating shocking events in the news as good news (or at least as having a silver lining). The fact that something is shocking implies that the event is an outlier, and that, on the whole, things are pretty good. When people no longer find it remarkable, that's when you know things are getting bad.
Doesn’t the interpretation of a data point as an outlier (and our ability to detect an outlier) also depend on sample size? If you only have a handful of data points and a lot of variation, it’s hard to apply the guideline they give at the end. Throwing data out when you don’t have much data to begin with could also lead to misleading conclusions.
the key to understanding
Understanding what? Life, the universe and everything?
The key to understanding exactly what to which they hold the key to understanding.
I’m sorry, isn’t this the Tautology Club thread?
Except that the media constantly self-selects the shocking events they feel are relevant, so readers never learn about the vast majority of shocking events.
When you understand, then you will know.
This is a good point. But I don't think we are advocating throwing out data when there is a small sample. I think we are saying that you have to know your underlying data, and understand what might be generating the observations. It could well be that with a small sample set you still have outliers. The real lesson I hope people take from our book is that knowing the data and thinking about it will help you craft better, more reliable analysis.
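To make the sample-size point concrete, here is a minimal Python sketch (my own illustration, not from the book) of why mechanical outlier rules break down on small samples. With a population standard deviation, no point in a sample of n can ever have a z-score larger than sqrt(n - 1), so with five points a classic three-sigma rule is mathematically incapable of flagging anything, no matter how extreme the value is:

```python
import numpy as np

def zscore_outliers(data, threshold=3.0):
    """Flag points more than `threshold` population std devs from the mean."""
    data = np.asarray(data, dtype=float)
    z = (data - data.mean()) / data.std()  # population std (ddof=0)
    return data[np.abs(z) > threshold]

# Five points, one wildly extreme. The extreme value inflates both the
# mean and the std, so its own z-score stays below 2 -- never flagged.
small = [10, 12, 9, 11, 100]
print(zscore_outliers(small))  # empty: the rule cannot fire at n=5

# The same extreme value among twenty ordinary points is flagged easily.
large = [10, 12, 9, 11, 10, 13, 8, 11, 12, 9,
         10, 11, 12, 9, 10, 13, 11, 8, 12, 100]
print(zscore_outliers(large))  # flags 100
```

The point is not that the rule is wrong, only that its behavior depends heavily on n, which is why knowing how the data were generated matters more than any fixed cutoff.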
This topic was automatically closed after 5 days. New replies are no longer allowed.