Meet the people who've volunteered to die on Mars

That’s part of the issue, but it’s not like nobody’s been spending money to develop aerospace technology over the last 50 years—just look at the U.S. defense budget! For most forms of technology, the rate of innovation tends to plateau as the technology matures. The most revolutionary advances in powered flight all happened within 50 years or so of Kitty Hawk; most of the progress since then has been incremental improvement of existing technologies.

There’s also the matter of diminishing returns. For example, if you want a train that can safely travel twice as fast, you’ll have to spend more than twice as much to build it.
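To put a rough number on that intuition: for anything moving through air at speed, drag force grows roughly with the square of velocity, so the power needed to overcome it grows with the cube. A minimal sketch of that scaling in Python (the parameter values are illustrative placeholders, not real train figures):

```python
# Why "twice as fast" costs far more than twice as much: drag force
# scales with speed squared, so drag power scales with speed cubed.
# All parameter values here are illustrative, not real rolling stock.

def drag_power_watts(speed_ms, air_density=1.2, drag_coeff=0.8, frontal_area_m2=10.0):
    """Power (W) needed to overcome aerodynamic drag at speed_ms (m/s)."""
    drag_force = 0.5 * air_density * drag_coeff * frontal_area_m2 * speed_ms ** 2
    return drag_force * speed_ms  # power = force * velocity

slow = drag_power_watts(50.0)   # ~180 km/h
fast = drag_power_watts(100.0)  # ~360 km/h
print(fast / slow)              # 8.0: doubling speed needs ~8x the power
```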

The face on Mars: Kanye!

…you’re joking, right?

Actually, it doesn’t.


This is generally very true, and as a programmer I’m especially aware of it. The history of commercial software development has in many ways been the history of realizing that you get better programs faster if you move fast, make mistakes, and fix them on the fly than if you try to design a perfect program on paper and rigidly implement the spec.

The problem is, when you uncover a major bug during beta test, it doesn’t generally kill the entire dev team. It’s okay to iterate minor details on the ground on Mars, but the core systems need to be absolutely rock-solid from day 1 or everybody dies. Public beta testing doesn’t work when you’re nine months from resupply.


That’s why you use more teams.


As mentioned previously, one of the biggest challenges for this project is collecting the billions of dollars’ worth of resources to make it happen.

Ethics of killing human explorers aside, what do you think will happen to the funding for this venture after the first few teams quickly die for lack of planning and/or proper safeguards? How many astronauts can you send to their doom before governments start stepping in to shut you down?

If it’s long-term success we’re concerned about then there’s no need to rush the mission to fit an artificial timetable.


I think I understand what you’re getting at. Please correct me if I’m wrong.

Technological progress can be described in terms of increasing the effect of human action: doing more work. You can do this by multiplying force, as with a classic simple machine, or by harnessing greater amounts of power. And it’s been power generation that’s really had the greatest impact in the last few centuries. So if we can achieve commercially viable fusion, that might have an impact comparable to past breakthroughs. But there’s nothing else comparable on the horizon, and we’ve yet to achieve that one.
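To make that distinction concrete, here’s a toy sketch (all numbers invented for illustration): a simple machine multiplies force but conserves work, while a bigger power source increases the total work you can do per unit time.

```python
# Route 1: multiplying force. A lever with mechanical advantage 5
# turns a 100 N push into 500 N, but the load moves 1/5 the distance,
# so the work done is unchanged (ignoring friction).
effort_force = 100.0     # newtons
advantage = 5.0
effort_distance = 1.0    # metres
load_force = effort_force * advantage
load_distance = effort_distance / advantage
print(effort_force * effort_distance, load_force * load_distance)  # 100.0 100.0

# Route 2: harnessing more power. Work = power * time, so a 10 kW
# engine does in one hour what a ~100 W human needs a hundred hours for.
human_watts, engine_watts = 100.0, 10_000.0
print(engine_watts / human_watts)  # 100.0: that much more work per hour
```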

If you have the supply of volunteers willing to take the risks, there are enough governments among the smaller African states that will be willing to cover for you in exchange for some technology transfer.

Three cheers for jurisdiction shopping!

True that. But you can always let the overly optimistic schedule slip. Look at Kickstarter; it works there.

The AI Revolution: The Road to Superintelligence

This works on smaller scales too. The movie Back to the Future came out in 1985, and “the past” took place in 1955. In the movie, when Michael J. Fox went back to 1955, he was caught off-guard by the newness of TVs, the prices of soda, the lack of love for shrill electric guitar, and the variation in slang. It was a different world, yes—but if the movie were made today and the past took place in 1985, the movie could have had much more fun with much bigger differences. The character would be in a time before personal computers, internet, or cell phones—today’s Marty McFly, a teenager born in the late 90s, would be much more out of place in 1985 than the movie’s Marty McFly was in 1955.

This is for the same reason we just discussed—the Law of Accelerating Returns. The average rate of advancement between 1985 and 2015 was higher than the rate between 1955 and 1985—because the former was a more advanced world—so much more change happened in the most recent 30 years than in the prior 30.

So—advances are getting bigger and bigger and happening more and more quickly. This suggests some pretty intense things about our future, right?

Not arguing with anything you said, FO, just continuing the convo.

Hopefully they do A/B testing: one group above ground, one group below; one with super-efficient air scrubbers, one without; etc.
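A back-of-the-envelope sketch of what comparing two such groups might look like (every group, rate, and sample size below is made up for the joke):

```python
import random

random.seed(42)

def surviving_habitats(n, daily_failure_rate, days=30):
    """Count how many of n habitats get through `days` days failure-free."""
    return sum(
        all(random.random() > daily_failure_rate for _ in range(days))
        for _ in range(n)
    )

above = surviving_habitats(100, daily_failure_rate=0.010)  # group A: above ground
below = surviving_habitats(100, daily_failure_rate=0.002)  # group B: below ground
print(f"above ground: {above}/100 intact, below ground: {below}/100 intact")
```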


One with air, one without? 😛 😃
(Could be part of the reality show: who gets voted out from group A to group B…)


I wonder if your reaction to that passage is similar to mine.

My first reaction is a subjective, aesthetic one: the present moment, 2015, is more like 1985 than 1985 was like 1955. When I was in my teens in the 80s, I found music from the 1950s almost alien; my stepson in college, and his friends, listen to a mix of music from the 90s and later without ever treating 90s music as nostalgic. It’s similar with other sorts of entertainment. I’d expect that a lightly comic movie comparing daily life in the 1980s to the 2010s would inevitably make a lot of jokes about smartphones and tablet computers and the Internet… and then quickly run out of steam. If it were an edgier comedy, there’d be a bit about how same-sex relationships have become more generally accepted.

And even that only goes so far. In the 80s we had Walkmans, handheld electronic games, and even personal computers. The very devices we tend to cite as evidence of the rapid advance of technology fit neatly into cultural – and often literal – pockets that already held gadgets thirty years ago. Someone from 1985 would be quite impressed with an iPad, but it wouldn’t be alien.

When I was in college, my favorite professor asked the students, out of idle curiosity, when we thought the “postmodern” era began; a few people said “the mid-Sixties”, and everyone murmured assent. It struck me at that moment, and it remains my feeling, that when I look at depictions of daily life in the US in the middle to late 60s, it seems fundamentally familiar, like a picture coming into focus. The details are different, but the fundamental patterns seem familiar and seem to have remained stable since.

Beyond that subjective, aesthetic sense, I’ve been inclined to argue that social and political change has slowed dramatically in recent decades. One of my greatest fears is that the modern US ruling class is one of the few in world history to accomplish the fundamental goal of every ruling class – effectively defeating the productive class and guaranteeing stagnation until this civilization collapses – and that neoliberalism was the killing blow.

More narrowly, to respond to that article: it claims that we’ve actually accomplished considerable advances in AI, but that’s a matter of shifting the goalposts. The early promoters of “hard AI” in the 60s pretty much thought we’d have robots like R2-D2 by now. One book I read suggested that there was something of a debate among DARPA-funded computer scientists over whether they should concentrate on creating AI that could outright replace human workers, or on technologies to “augment” human workers; the latter prevailed, since “hard AI” didn’t appear to be getting anywhere. The “augmentation” vision went back as far as the 1940s, with ideas such as portable devices that could contain an entire library, and led directly to the Xerox PARC lab’s early experiments with graphical interfaces in the 1970s – the basic functional elements of which are present in almost all GUIs now.

Contemporary speculation about superhuman AI seems to regard it as if it will happen almost automatically once the number of calculations per second gets high enough – the very name “the singularity” seems to reference the inevitability of a black hole forming when enough matter is packed densely enough. But black holes are the wrong analogy; a better one might be the origin of life on Earth, a circumstance we can only theorize about but assume must have been contingent on an enormous number of variables.

And that still wouldn’t make it viable to colonize a planet that makes Antarctica seem like Hawaii.


Lol, you are correct.

It does, when used properly.
(It’s virtually never used properly.)

