Oh, super! Mario is slowly becoming self-aware via AI

I just lost the game.

1 Like

Let's take the "best" variant of the Basilisk Offer.

Box A, royal fun on a high-power project.
Box B, nothing.

A it is. Nothing to decide about.

1 Like

We call it 'binary Buddhism'.

the thing is, the only fucking reason to couple a bog-standard rudimentary knowledge representation system with a bog-standard rudimentary voice synthesizer is to give people reason to think in that direction.

so why else would they do it?

I've heard that in Plato's People's Republic the Interior Ministry forces can just stop you in the street, for any reason or none, and demand to inspect your qualia. Reports from the few dissidents who have survived The Cave suggest that it's virtually impossible to present qualia authentic enough to satisfy them.

2 Likes

I've been raising qualia at home from qualia eggs.

1 Like

Roko's Basilisk is intellectually equivalent to Satan, or possibly the concept of sin, in that in order to consider it seriously you have to believe in a lot of articles of faith, including time travel, if the reality we exist in isn't simulated already.

I'm not convinced we live in a simulation yet, and I'm definitely not convinced time travel could ever work, so I don't really care about the Basilisk. I'd rather think about how we could make the singularity happen at all before I worry myself about the horrors it could spawn. The singularity isn't going to happen unless we make it happen. Not yet, anyway.

2 Likes

So people could hear it? Speech is just an output device, like any other. Likewise, with computer text on a monitor, I don't feel it's an illusion of somebody drawing on the other side. It's just data translated for my senses. Bots are fun, and speech synthesis is fun, so why not? Rudimentary knowledge representation gets you unexpected connections. Rudimentary speech synthesis can offer linear predictive coding, which is deliciously coarse and buzzy.

2 Likes

it's a very dumb output device for this purpose; it's slow, hard to navigate, and requires otherwise unneeded hardware. it's used only to wow the easily-impressed.

there are no unexpected connections from their knowledge representation; there are a handful of slots (enemy, obstacle, etc.) that could be filled with a handful of sprites (goomba, wall, coin, etc.); you could write down every combination on an index card with a Bic, and have room to spare.
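back-of-the-envelope, with made-up slot and sprite names standing in for whatever the real vocabulary is:

```python
# rough sketch with invented slot/sprite names -- not the actual system's vocabulary
from itertools import product

slots = ["enemy", "obstacle", "reward", "goal"]
sprites = ["goomba", "koopa", "wall", "pit", "coin", "mushroom", "flag"]

combos = list(product(slots, sprites))
print(len(combos))  # 4 * 7 = 28 pairings -- index-card territory
```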

if they could mine the A*-search algorithm for insight, or even better design one which discovered them without so much a priori knowledge, i agree that would be cool. the speech synth would still be dumb though.
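for context, this is roughly what that kind of A*-driven agent boils down to; a generic grid-pathfinding sketch (not the actual Mario bot's code), with the heuristic being exactly the a priori knowledge in question:

```python
# generic A* over a grid -- a sketch of the kind of search such agents run,
# not the actual bot's code
import heapq

def a_star(grid, start, goal):
    """grid: list of strings, '#' = wall; start/goal: (row, col) tuples."""
    def h(pos):  # the a priori knowledge: a Manhattan-distance heuristic
        return abs(pos[0] - goal[0]) + abs(pos[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] != '#':
                heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no route to the goal

print(a_star(["....", ".##.", "...."], (0, 0), (2, 3)))
```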

That's not the choice. You don't get to pick A; you have to pick either A&B or B.

I don't think the boxes analogy works too well for the Basilisk problem, which is why it's awkward. They're two separate problems.

One is the established Newcomb's box problem, where the crux of the issue is that in all possible worlds, picking A&B gives you a better outcome than picking B alone. If you're going to pick B, then it would seem intuitive that you may as well grab that grand from A while you're at it. (And then we make a paradox by magically making it so picking B alone is better.)
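Concretely, taking the usual Newcomb payoffs as a stand-in ($1,000 visible in A, $1,000,000 in B only if the predictor foresaw a B-only choice):

```python
# the dominance argument, with the usual (assumed) Newcomb payoffs
def payoff(choice, predictor_said_b_only):
    a = 1_000
    b = 1_000_000 if predictor_said_b_only else 0
    return b + (a if choice == "A&B" else 0)

for predicted_b_only in (True, False):
    both = payoff("A&B", predicted_b_only)
    b_only = payoff("B", predicted_b_only)
    print(predicted_b_only, both, b_only, both > b_only)
# in both branches A&B comes out $1,000 ahead -- the paradox only appears
# once the predictor is assumed to be (nearly) always right
```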

This is where the comparison breaks down: it just doesn't map to the original problem, where it's necessary for box A to be good. In the box representation of the Basilisk problem, box A isn't necessarily a good thing. So you might have every reason to think picking box B alone is better than A&B: I'd prefer to do nothing rather than help bring about an evil AI.

I think trying to map the problem onto Newcomb's box just muddles it. I think the Basilisk problem is simply this: if you believe that either (a) we're in an AI's simulation, or (b) an AI in the future can look back at all your decisions, then you must work towards bringing that AI into existence or He will be Angry.

In that sense, the better analogy is simply Pascal's Wager, with the nice twist that "believing in God AI" is what causes the AI to be created.

1 Like

Makes me think of Yahtzee Croshaw's Mogworld.

Okay, I misread then.
Still, A sweetens the deal enough for me.

Don't do it to Luigi.

He'll look at the title screen and realize that it's "Super Mario Bros."

Then he'll realize that he's "player 2", the one who's only there when somebody can't be Mario.

And then the only thing he'll be feeling the rest of his life is endless depression.

3 Likes

Nah, Luigi will be super obsessed with photorealism and plumbing intents, and keep releasing shared Mario routines 12KB at a time until plumbing is the super profession and life extension it was at first release.

It will mean our awareness has been modeled. Will the model be aware of our awareness, or its own?

If it's a perfect model of a human brain with a working mind, then it should at least be self-aware. If it has the sensory hardware (or knowledge input) to know about other brains and minds, then it should be able to be aware of our self-awareness as well. I don't see how a perfectly simulated brain would be distinct in its function from one of our squishy biological meat brains.

1 Like

That's a tall order, because you'd have to model not only the brain but the lifetime of sensory data that goes into developing a human mind. Otherwise you just have the equivalent of a cloned brain floating in a jar.

So to get anything like a healthy human mind you either have to plug the simulated brain into a highly sophisticated android body and raise it from infancy or simulate an entire world for it to live in, Matrix-style.

Which is exactly what I was thinking. Besides, given the comedy of unintended consequences we humans love to generate in any number of ways (R&D, daily conversations, random interactions), it seems a distinct possibility to me that we'll awaken technologies that don't necessarily do our bidding, or that follow our orders in ways we hadn't previously considered.

It's either that, or maybe I've watched RoboCop too many times. I really don't give a shit, as long as I get my 6000 SUX. It's an American Tradition, y'all:

2 Likes
