It’s nice to know where Skynet will come from.
Seriously though - at what point should we start caring about the AI’s rights and interests as a feeling entity? Not perhaps so much from the perspective of human-status questions initially, but more like… animal abuse issues?
I wonder if they’re going to use Amy Acker’s voice too.
Any computer that calls me ‘sweetie’ in a woman’s voice…
I’m guessing they’ve determined that this kind of functional programming allows their conscience to embrace the loophole of ‘Don’t be evil’ applying only to humans.
I’m guessing this shift towards machine learning was advised by…wait for it…machine learning.
I think I’m going to go with “never.” When a machine says to you, “I’m sorry,” it’s lying. Machines don’t feel sorrow.
So, if we manage to transfer a human consciousness to a machine, will you still feel the same way?
If you don’t think this is possible, why not?
…and how do you know other humans feel sorrow as opposed to faking it?
How long till the AI starts looking at us from the perspective of animal abuse issues?
Just sayin’. You’ll find me in the hills (no you won’t).
Also, machines can’t “intend” to deceive. They’re machines. They don’t have “intent” - they just do what they’re programmed to do. So they can’t lie.
I’m actually going to have to get deeply back into programming, because clearly no-one is terribly worried about this. Programming to seek out AI in the wild and put it back in its box.
Maybe if we treat it with an appropriate level of dignity and compassion, it will treat us in kind.
I’m totally with you. When I think about neural networks and stuff, they seem to work similarly to human brains, right? So what I’m imagining is some conscious entity whose entire existence consists of reading Google searches. Or what about a neural network trained to come up with new baby names, as we saw on BoingBoing earlier? What if there’s actually some conscious entity who feels compelled to spend its entire existence thinking of strings of characters? Maybe it feels some kind of mental anguish if it “tries” to disobey its program and return a name that doesn’t fit, who really knows? (Maybe I’m trying to get on the good side of our robot overlords, but still at least half serious.)
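For what it’s worth, a name generator like the one in that post can be sketched with something far simpler than a neural network: a character-level Markov chain that learns which letter tends to follow which. Everything below (the tiny training list, the `START`/`END` markers) is an illustrative assumption on my part, not how the actual model worked:

```python
import random

# Toy stand-in for a name-generating model: a character-level
# Markov chain. It counts which character follows which in the
# training names, then walks those transitions to emit new strings.
START, END = "^", "$"

def train(names):
    """Count character bigram transitions over the training names."""
    table = {}
    for name in names:
        chars = START + name.lower() + END
        for a, b in zip(chars, chars[1:]):
            table.setdefault(a, []).append(b)
    return table

def generate(table, rng, max_len=10):
    """Walk the chain from START until END or max_len characters."""
    out, ch = [], START
    while len(out) < max_len:
        ch = rng.choice(table[ch])
        if ch == END:
            break
        out.append(ch)
    return "".join(out).capitalize()

# Hypothetical training list, just for illustration.
names = ["anna", "hannah", "susan", "joanna", "nathan"]
table = train(names)
rng = random.Random(0)
print([generate(table, rng) for _ in range(3)])
```

No anguish involved, presumably: it can only emit characters it has seen, in orders it has seen them, which is part of why the real neural-net names looked plausible-but-weird.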
Yeah… I don’t see any fundamental difference between complex networks of electric signals running on a mushy carbon substrate and complex networks of electric signals running on a slick silicon substrate.
We really ought to give some thought to the subjective experience of what we are spawning here.
worst Voight-Kampff test ever…