IMHO the Dark Forest theory has its own problems, mostly related to its starting assumptions and oversimplified worldview.
It basically assumes that growth continues exponentially and extrapolates from there. Sort of like asking “I killed a fly when I was 6, knowing that flies lay about 150 eggs and take 10 days to reach sexual maturity, how many flies would there be in the world today if I had not killed that fly?” Then Liu does the game theory and decides that the only reasonable thing to do is obsessively kill every fly in the world before they have a chance to cover the earth in a mass of flies thousands of miles thick.
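Just to show how quickly that naive extrapolation leaves reality behind, here’s a rough back-of-the-envelope sketch. The 150 eggs and 10-day generation time are from the analogy above; the zero-mortality and 50%-female assumptions are mine:

```python
# Naive, zero-mortality fly extrapolation -- the same kind of unchecked
# exponential growth the Dark Forest logic assumes for civilizations.
EGGS_PER_FEMALE = 150      # eggs laid per female per generation (from the analogy)
FEMALE_FRACTION = 0.5      # assumed: half the offspring are female
DAYS_PER_GENERATION = 10   # time to reach sexual maturity (from the analogy)

def naive_fly_population(years: float) -> float:
    """Female-line descendants of one fly after `years` of unchecked growth."""
    generations = int(years * 365 / DAYS_PER_GENERATION)
    growth_per_generation = EGGS_PER_FEMALE * FEMALE_FRACTION  # 75x per generation
    return growth_per_generation ** generations

# After one year (~36 generations) this gives ~3e67 flies, far more than
# the ~1e57 atoms in the entire solar system.
print(f"{naive_fly_population(1.0):.2e}")
```

The extrapolation hits physical impossibility within a single year, which is exactly why “growth continues exponentially forever” is a shaky premise to build a galactic survival strategy on.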
And then of course the logical conclusion is that you can’t kill all of the flies by normal means, so the only way to prevent them from killing everything is to completely incinerate the earth with nukes and go live on the moon. But then moon flies appear and you have to do it again… The whole trilogy breaks down into reductio ad absurdum, but the author tries to play it straight the whole time.
I am reminded of other iteration problems, for example The Bottle Imp by Robert Louis Stevenson:
“The bottle must be sold, for cash, at a loss, i.e. for less than its owner originally paid, and cannot be thrown or given away, or else it will magically return to him. All of these rules must be explained by each seller to each purchaser. If an owner of the bottle dies without having sold it in the prescribed manner, that person’s soul will burn for eternity in Hell.”
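What makes it an iteration problem: prices are discrete and must strictly fall, so by backward induction no price should ever be acceptable to a rational buyer, yet in the story the bottle keeps changing hands. A minimal sketch of that unraveling (the whole-cent pricing and strictly-rational-buyer framing here are my assumptions, not Stevenson’s):

```python
# Backward-induction reading of The Bottle Imp: a buyer should only accept
# the bottle at a price they can later resell at a strictly lower "safe" price.
def safe_prices(max_price_cents: int) -> set[int]:
    """Prices (in whole cents) at which a strictly rational buyer would accept the bottle."""
    safe: set[int] = set()
    for price in range(1, max_price_cents + 1):
        # At 1 cent there is no lower resale price, so it is never safe;
        # that makes 2 cents unsafe, and so on up the chain.
        if any(lower in safe for lower in range(1, price)):
            safe.add(price)
    return safe

print(safe_prices(100))  # -> set(): no price is ever safe
```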
I don’t think your description has much fidelity to the dark forest theory—i.e. that silence is an optimal civilization-level survival strategy on the interstellar stage. And whether you like the books as a narrative is largely irrelevant to the theory itself. There was much I disliked, and much I liked in the trilogy, myself.
Hadn’t heard of the short story; that was an interesting read on the wiki. It sort of reminds me of a movie. Hmmm… let me see if I can find it. Yeah, I thought I had it right: Brewster’s Millions.
It’s basically the same survival strategy as moving into a bunker, closing the door, and never ever opening it again. Liu comes to this conclusion because his starting assumption is that all other species are strictly xenocidal. The bunker strategy does make sense if literally every single person you meet will immediately try to shoot you dead.
If those aren’t the conditions then the bunker strategy doesn’t make sense.
Liu assumes extreme xenocide because of his starting assumption that unchecked exponential growth means any species you don’t immediately kill will consume all of the resources in the galaxy in short order.
I agree. The point is that you shouldn’t expect too much from a radio survey, and you definitely shouldn’t take a null response from a radio survey as evidence proving “welp, there ain’t nuthin’ out there.”
Better critics than us have said the same thing, but that’s almost ad hominem. Just because an idiot says something isn’t proof that it’s false - in fact, Donald Trump said something true once. Probably.
All the study really says (as opposed to what journalists - including writers of press releases - read into it) is that those using Drake and Fermi to come up with large numbers of advanced technological civilizations need to tone down their guesses. (1) They’re only guesses, and (2) no one here yet knows what a good guess might be. The study didn’t conclude that we’re alone. It merely said that we’re somewhat more likely to be alone than some optimists have estimated in the past using the Drake Equation and the Fermi Paradox.
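To make the “they’re only guesses” point concrete: the Drake Equation is just a product of seven loosely constrained factors, N = R* · fp · ne · fl · fi · fc · L, so reasonable-sounding guesses swing the answer by many orders of magnitude. A toy sketch, with the caveat that the optimistic/pessimistic values below are my illustrative placeholders and not figures from the study:

```python
from math import prod

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake Equation: expected number of communicating civilizations in the galaxy."""
    return prod([r_star, f_p, n_e, f_l, f_i, f_c, lifetime])

# Placeholder guesses only -- the whole point is that nobody knows the right values.
optimistic  = drake(r_star=3, f_p=1.0, n_e=0.5, f_l=1.0,  f_i=0.5,  f_c=0.5, lifetime=1e6)
pessimistic = drake(r_star=1, f_p=0.2, n_e=0.1, f_l=1e-3, f_i=1e-3, f_c=0.1, lifetime=1e3)

print(f"optimistic  N ~ {optimistic:,.0f}")  # ~375,000 civilizations
print(f"pessimistic N ~ {pessimistic:.1e}")  # ~2e-06, i.e. effectively none
```

Same equation, same galaxy, answers about eleven orders of magnitude apart, which is the sense in which a null radio survey can’t settle the question either way.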