Economics research considered unreplicable


#1

[Read the post]


#2

It will be interesting to see whether they continue this testing (after psychology and econ) into the hard sciences, to see if those fare better. My hypothesis is that they will.


#3

Oh that’s so weird Cory. I just bought and watched the DVD for The Black Hole by Disney the other day. How does the image relate to the article? Never mind, thanks for the image!


#4

Nothing coherent comes out of economics “research”?

An even better header image would be “shit in, shit out” - any suggestions?


#5

The paper isn’t just “economics = voodoo.” They make a slew of concrete suggestions for improvement.

EDIT: Oops, didn’t mean to reply to @renke. Soz!


#6

As if I would RTFA before commenting :stuck_out_tongue:


#7

I suspect we’ll be unpleasantly surprised, depending on the journal and the science. Look at all the medical assertions, especially those related to diet, that are frequently overturned because previous studies used poor methodologies, underpowered sample sizes, etc. Though there’s also some overlap between “unrepeatable” and “repeatable but meaningless.”


#8

The Black-Schole?


#9

So shocking to hear that economics is rarely practised scientifically!

/s

A professor once pointed out that whereas monarchs of old surrounded themselves with astrologers, the rulers of today tend to surround themselves with economists.

I can’t imagine what he was getting at…


#10

If they are failing papers for not uploading their data, I suspect things will be rather a lot worse for physics and astronomy, because of the impractically huge datasets.


#11

My fiancée is an economist; her datasets frequently weigh in at around 100 GB, and are proprietary and delivered to her in physical formats (things like partial zip files spread across a bunch of DVDs). So even if she could host the data herself, she wouldn’t be legally allowed to in many cases.


#12

I found the problem. One important part of science is reproducibility, data with NDAs attached is doubleplusungood.


#13

I’d tend to agree. I know health data is very tightly controlled (I think the Dept. of HHS oversees that), and a lot of the data that exists on things like credit defaults is owned by the companies that extend the credit; the data itself is a revenue stream for them.


#14

Okay, did I read that right? In 66% of cases they couldn’t replicate the results using the code and data provided by the authors? Like, the study says, “We ran this code on this data and got this result,” and then they ran that code on that data and got a different result? Then, when they got the authors to help them run the code the right way, the success rate only went up to 49%?

To put this another way: if an economist gives you code that says “echo Hello World,” there’s roughly a 50% chance that when you run it, it will print “People are driven by self interest” on your screen?


#15

I’m not surprised. Remember the Reinhart/Rogoff study? The one used to justify austerity across half of Europe with its 90% debt-threshold figure? Ineptly used Excel.
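For anyone who missed it: the Reinhart/Rogoff bug was an AVERAGE formula whose cell range stopped several rows short, silently dropping countries from the mean. Here’s a toy sketch of the mechanism; the growth numbers below are made up purely for illustration, not the paper’s actual data.

```python
# Toy sketch of a spreadsheet range bug: an average over the wrong
# cell range silently drops rows. Figures are invented for illustration.

# Average GDP growth (%) for 15 hypothetical high-debt country-years.
growth = [2.6, 2.5, 2.2, 2.4, 2.3,           # the five rows the bad range misses
          -7.6, 0.7, -1.7, 1.0, -2.4,
          -2.0, 0.3, 2.5, 2.7, 2.6]

# Conceptually =AVERAGE(L30:L49): the full range.
correct_avg = sum(growth) / len(growth)

# Conceptually =AVERAGE(L30:L44): the range stops five rows short.
truncated = growth[5:]
truncated_avg = sum(truncated) / len(truncated)

print(f"full range:      {correct_avg:+.2f}%")
print(f"truncated range: {truncated_avg:+.2f}%")
```

With these invented numbers the full-range mean is positive while the truncated one is negative, so the headline conclusion flips sign from one formula edit, which is roughly what happened when the original spreadsheet was re-checked.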


#16

[quote=“doctorow, post:1, topic:67367”]The most common cause of our inability to replicate findings is that authors do not provide files to the journal replication archives, which constitutes approximately half of our failed replication attempts (21 of 38 papers, 55%).[/quote]Seems specious. This is less like “we can’t replicate the findings” and more like “the authors won’t let us try to replicate the findings”.


#17

or “the authors used the only data available, which is proprietary and/or protected by privacy laws.” Probably not an exclusive or.


#18

This doesn’t surprise me; we’re swimming in an ocean of data, and human beings are pattern-recognition and bias monsters. I’ve always found it odd how economics as a discipline is mostly post-hoc rationalization of prior events with little predictive power, yet it earned the title of “the dismal science.” Dismal, maybe. Science… maybe.


#19

But we’re talking about published studies. The data they used and their methods have to be available; otherwise, how could they be peer reviewed?


#20

Reviewers can get access (on request) to confidential data that went into a study but is never subsequently made public. Even when the data is non-confidential, it can simply be impractical to “publish” it (if, say, it’s many gigabytes), even though it’s theoretically available for access.