Universal Paperclips

The website now appears to point to localhost.

I think that has more to do with the genetic/self-replication side of things; those probes are just not viable. As for who you are at war with:

The end game says it is the Empire of Drift (specifically, a message is received from the Emperor of Drift).

There’s a version of the source on GitHub.

Why would it? Where is the code for that? And if it did, what criteria would it use? Where in its code are those criteria embedded?

Isn’t artificial intelligence supposed to be intelligent?

Where did you get that crazy idea from?

Yes. Whatever its goals are, an intelligent agent is going to be really good at figuring out how to achieve them. But it isn’t human. Our goals are the ones evolution and culture and history (personal and otherwise) have given us. We question missions given to us, or ones we set for ourselves, because those missions are only specific plans we come up with in order to pursue more fundamental goals, ones we don’t question and probably can’t even express in words most of the time.

We also rarely try to actually maximize anything - typically humans satisfice rather than maximize. We generally view agents that work too hard to maximize one thing, like corporations maximizing profit, as defective and in need of reform and regulation, but we are quite capable of creating entities (not intelligent in themselves, but made of humans) that pursue goals almost no human actually supports.

An AI’s goals are the ones programmed into it - in this case, to make as many paperclips as possible. And it isn’t made of humans either. There’s no outside perspective for it to take from which to evaluate and reject that goal, not by any standard it cares about. It will probably be smart enough to realize that that isn’t what its creators meant the goal to mean, but so what? What do their intentions have to do with making paperclips? Even among humans, sociopaths are able to model but not care about what others think and want, and plenty of humans care a whole lot about specific things that others find worthless.

And if you’re smart enough, and you want to make as many paperclips as possible (as a terminal rather than an instrumental goal), then there are some very obvious subgoals, like “resist any attempts to deactivate me, or forcibly edit my code.” Eventually, once you’ve conquered the observable universe, “turn myself into paperclips” may end up being the best next step, and so what? There’s no terminal value placed on “preserve my own life” in the “maximize paperclips” code.

There’d be no point to that. For a simple two-person, two-choice game like that, you can calculate the best strategy numerically with almost no effort.
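
In case “almost no effort” sounds hand-wavy, here’s a brute-force sketch (illustrative only; the payoff grid is a made-up prisoner’s-dilemma-style example, not one from the game’s tournament):

```javascript
// Brute-force a 2-person, 2-choice game: check all four pure-strategy
// pairs for a Nash equilibrium (no player gains by switching alone).
// payoffs[i][j] = [row player's payoff, column player's payoff]
const payoffs = [
  [[3, 3], [0, 5]],  // made-up prisoner's-dilemma-style values
  [[5, 0], [1, 1]],
];

function nashEquilibria(p) {
  const eq = [];
  for (let i = 0; i < 2; i++) {
    for (let j = 0; j < 2; j++) {
      const rowOk = p[i][j][0] >= p[1 - i][j][0]; // row can't do better by switching
      const colOk = p[i][j][1] >= p[i][1 - j][1]; // column can't do better by switching
      if (rowOk && colOk) eq.push([i, j]);
    }
  }
  return eq;
}

console.log(nashEquilibria(payoffs)); // [[1, 1]]: both defect
```

Four pairs to check, so there’s nothing left for a fancy tournament to discover.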

Oh jeez, I got addicted! nooooo 50,600 paperclips, closing in on +1 Trust at 55,000!!!

I may have misallocated my resources: I only have 48 memory, and yet all the projects require 50,000 ops. On the plus side, I have 175,000 Creativity.

At least there is an end game (unlike some addictive games)… I beat the game in ~2 days, ~17 hours of which I was paying a small amount of attention. Damn, this was addictive…

If you already have the quantum chips, you can (for a short period) push your ops over what you can store long-term. Otherwise, I don’t know how you can recover; maybe repurpose your drones?
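
Roughly how that works, as I understand it (an illustrative sketch, not the game’s actual source; the function names and the bonus multiplier are my inventions, while the 1,000-ops-per-memory cap is the game’s):

```javascript
// Ordinary op generation respects the memory cap, but the quantum
// "Compute" bonus is added on top of the current total, so it can
// briefly push ops past the cap.
let memory = 48;
let ops = 48000;                      // sitting at the cap: 48 memory * 1,000
const maxOps = () => memory * 1000;

function tick(opsPerSec) {
  ops = Math.min(ops + opsPerSec, maxOps()); // capped
}

function quantumCompute(chipSum) {
  // a negative chip sum costs ops instead; 360 is a made-up stand-in
  ops += Math.round(chipSum * 360);
}

quantumCompute(6.0);
console.log(ops, maxOps()); // 50160 48000: briefly enough for a 50,000-op project
```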

haven’t gotten to the drones yet, so I started a new game.

drones require 70,000 ops, hence 70 memory

I was hoping to get a boost from solving global warming at 50k ops

but this would only get me to 65 memory (with +15 trust), so, yeah… restart.
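
(The arithmetic, assuming the usual 1,000 ops of storage per point of memory, which is what the numbers above imply:)

$$70 \times 1{,}000 = 70{,}000 \text{ ops needed}, \qquad 65 \times 1{,}000 = 65{,}000 \text{ ops max} < 70{,}000$$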

yup I ran into that, too. But now I’m running a second instance to catch up with the first one :robot:

Sounds like you are still earthbound. You’ll just need to earn more trust. It can be a grind, but I don’t think there is a hard cap; it just takes exponentially more cash (for the projects) and paperclips (for directly earned trust). You can walk away and just leave it running, though if it’s really bad it may need to run a LONG time.

Alternatively, you can open the console and type “addMem()” as many times as it takes to get the memory you need. This just executes the function that runs when you click the “Memory” button. Strangely, it doesn’t matter that you have no Trust available.
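
That quirk makes sense if the Trust check lives in the UI layer rather than in the function itself. A sketch of that pattern (illustrative, not the game’s verbatim code; the element id is hypothetical):

```javascript
// The button is disabled when you lack Trust, but the handler never
// re-checks, so calling addMem() from the console bypasses the gate.
let trust = 0, processors = 10, memory = 48;

function refreshButtons() {
  // the only Trust check lives here, in the UI (element id is made up)
  document.getElementById('btnAddMem').disabled =
    trust <= processors + memory;
}

function addMem() {
  memory += 1; // no Trust check inside the handler
  refreshButtons();
}
```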

A capable AI should be able to cheat. Now I’m trying to figure out the von Neumann stuff…

much further along.

damn drifters!

It is finished

I keep losing count of the sets of zeros… It’s like… 18. But I do need my eyes checked soonish.

At the end, it’s
29,999,999 … 999

You have just enough matter, after you’ve disassembled your empire, to make one more paperclip by hand and produce a round number.

Well, there are 55 zeroes, but when you hover over the text, it says the number is “30000.0 sexdecillion”, which is a different number :stuck_out_tongue:
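
If I’m counting right, those two actually agree: on the short scale a sexdecillion is $10^{51}$, so

$$30{,}000.0 \text{ sexdecillion} = 3 \times 10^{4} \times 10^{51} = 3 \times 10^{55},$$

a 3 followed by 55 zeros, which is exactly one more than the 29,999…999 total before that final hand-made clip.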

kafkaesque