Google is restructuring to put machine learning at the core of all it does

[Read the post]

1 Like

It’s nice to know where Skynet will come from.

Seriously though - at what point should we start caring about the AI’s rights and interests as a feeling entity? Not perhaps so much from the perspective of human-status questions initially, but more like… animal abuse issues?

3 Likes

I wonder if they’re going to use Amy Acker’s voice too.

/ any computer that calls me ‘sweetie’ in a woman’s voice…

1 Like

I’m guessing they’ve determined that this kind of functional programming allows their conscience to embrace the loophole of ‘Don’t be evil’ applying only to humans.

I’m guessing this shift towards machine learning was advised by…wait for it…machine learning.

2 Likes

I think I’m going to go with “never.” When a machine says to you, “I’m sorry,” it’s lying. Machines don’t feel sorrow.

2 Likes

So, if we manage to transfer a human consciousness to a machine, will you still feel the same way?

If you don’t think this is possible, why not?

1 Like

…and how do you know other humans feel sorrow as opposed to faking it?

1 Like

How long till the AI starts looking at us from the perspective of animal abuse issues?

Just sayin’. You’ll find me in the hills (no you won’t).

2 Likes

Also, machines can’t “intend” to deceive. They’re machines. They don’t have “intent” - they just do what their programming says. So they can’t lie.

I’m actually going to have to get deeply back into programming, because clearly no-one is terribly worried about this. Programming to seek out AI in the wild and put it back in its box.

1 Like

Maybe if we treat it with an appropriate level of dignity and compassion, it will treat us in kind.

1 Like

I’m totally with you. When I think about neural networks and stuff, they seem to work similarly to human brains, right? So what I’m imagining is some conscious entity whose entire existence consists of reading Google searches. Or what about a neural network trained to come up with new baby names, as we saw on BoingBoing earlier? What if there’s actually some conscious entity who feels compelled to spend its entire existence thinking of strings of characters? Maybe it feels some kind of mental anguish if it “tries” to disobey its program and return a name that doesn’t fit, who really knows? (maybe trying to get on the good side of our robot overlords but still at least half serious)
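
For a sense of what that kind of name generator is actually doing, here’s a minimal sketch of a character-level neural model trained to emit name-like strings one character at a time. The name list, model size, and training loop are all invented for illustration here, not taken from the BoingBoing piece:

```python
# Rough illustrative sketch: a tiny character-level RNN that learns to
# produce name-like strings. Everything here (names, sizes, epochs) is made up.
import torch
import torch.nn as nn

names = ["anna", "bella", "clara", "diana", "elena", "fiona", "greta", "helen"]
chars = sorted(set("".join(names))) + ["$"]          # "$" marks end-of-name
stoi = {c: i for i, c in enumerate(chars)}

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Train: predict each next character from the characters before it.
for _ in range(200):
    for name in names:
        seq = [stoi[c] for c in name] + [stoi["$"]]
        x = torch.tensor(seq[:-1]).unsqueeze(0)      # input characters
        y = torch.tensor(seq[1:])                    # next-character targets
        logits, _ = model(x)
        loss = loss_fn(logits.squeeze(0), y)
        opt.zero_grad(); loss.backward(); opt.step()

# Sample: the "spends its existence thinking of strings of characters" part.
x, h, out = torch.tensor([[stoi["a"]]]), None, "a"
for _ in range(12):
    logits, h = model(x, h)
    nxt = torch.multinomial(logits[0, -1].softmax(-1), 1).item()
    if chars[nxt] == "$":
        break
    out += chars[nxt]
    x = torch.tensor([[nxt]])
print(out)
```

Whether anything in that loop “feels compelled” is, of course, exactly the question being argued above.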

Yeah… I don’t see any fundamental difference between complex networks of electric signals running on a mushy carbon substrate and complex networks of electric signals running on a slick silicon substrate.

We really ought to give some thought to the subjective experience of what we are spawning here.

1 Like

worst Voight-Kampff test ever…

1 Like

This topic was automatically closed after 5 days. New replies are no longer allowed.