Is it OK to torture a robot?

Actually your further exposition confirmed my interpretation :smile:

I agree that the very different nature of human/animal experience compared to (current/known) computing probably marks a very useful dividing line for ethics and moral obligation.

I would then muddy that line by asking how much it matters how we choose to treat entities that appear to be experiencing something analogous to human experience.

In two hypothetical virtual reality games there are bystanders to the action who can be “killed”. In one game they disappear immediately, never having reacted to the player’s presence. In the second game they run screaming or cower in “terror” as the player performs “dangerous” virtual actions. Their “deaths” leave virtual blood and bodies strewn. The illusion of an entity having experienced its last moments is better realized in game two.

How do we judge a player who engages in acts of wanton destruction in game one? Game two?

Are we not somehow tempted to judge the player more harshly who terrified and slaughtered the virtual civilians that appeared to have experience of that death?

  • For all we know, robots may never experience in the way that we do. Does that mean there is no harm in keeping a dungeon full of robots restrained and partially dissected, unable to function as they would, simply for one’s own pleasure?

No crime is committed, but what if that’s your next door neighbour’s basement? Is that okay?


EDIT - I realize that I’m asking about crime simulation.

Is there a point at which we won’t tolerate the simulation of a criminal act, despite no human or animal suffering? Is the act of virtual torture a crime?

We seem to have already criminalized virtual sex-crimes, but murder and torture are currently big sellers. Will that remain okay into the age of realistic VR? Advanced robotic androids? Will virtual sex-crimes remain illegal? Who is harmed?

  • I think that’s the question coming up for me - even if the “victim” never has any conscious experience, we sometimes criminalize virtual crime. Will we extend it to crimes of violence? Will we prohibit sex robots while condoning sadism robots (as we do with 3D virtual beings)?

Okay. Whatever.

No one said that. Acknowledging that things outside of engineering exist is not saying that engineers are bad. It’s just acknowledging that other human beings who are not engineers exist and are worth treating with basic human dignity.

By asking ethical and moral questions, no one is questioning engineering as a field. No one. You are the one making this false choice. You, no one else.


Sure… and I think that the question doesn’t make sense unless we’re dealing with real, autonomous sentience. But it’s a question that has been dealt with in a variety of places (mostly science fiction): whether it will ever be a real moral and ethical problem we have to wrestle with in the future. In fact, that torture question could be a vector for understanding sentience in the first place, yeah?


The word “torture”, if properly used, indicates that the thing that’s happening is Not Okay. It’s like asking “is it okay to murder?” or “is it okay to rape?” Of course it isn’t.

So it comes down to the question of whether the action is or is not torture, and that comes down to whether the subject has experiences or just programmed responses to stimuli.

We’ve made a lot of wrong assumptions in the past about the inability of animals (and even humans of other ethnicities) to experience pain and trauma, mostly to justify behaviors that should be considered torture. I think if AI emerges to the extent that we can’t analyze code and tell how it works, we need to assume there’s a mind there rather than a simulation of one, and that (unless it tells us otherwise) it does experience trauma and it is possible to torture a robot.


Brutal exploitation of sentient beings should be a problem.


Ethicists are the most annoying kind. Especially when let loose around research.

“Having fun”?

That’s called a “workshop”.

More than okay. It’s a place to have fun with the machines and with making new machines in a two-man team. Hacking and getting drunk together and then more hacking. And perhaps trying to make a cyborg, without any interference from philosophers and their ilk.

Hope not. Too many things are a crime already. Given the nature of virtual simulations, enforcement will be hard and spotty, and way too many innocents will be prosecuted.

The land of the virtual should be lawless. As long as no other person is directly involved, there’s no reason why a government and its jackbooted thugs should enter that realm.

It’s not questioning per se. It’s annoying noise from the peanut gallery, usually from those who don’t even know which end of a soldering iron is the hot one. Sadly, grant commissions and similar purse-holders sometimes listen to those know-nothings.

Easy remedy. Engineer the equivalent of a limbic system so the bots experience it as pleasure.

Problem solved.


Please explain the image. Some of us don’t interpret nonverbal communication.

It’s just that…


Why can’t people talk in words instead of in stupid animations that are difficult to interpret?

What would you tell a blind man instead of this crap?


Fine, in response to your original comment: Your solution is just so… cold, and I personally cannot think of a response to it that will make any sense to you.


Maybe cold, but certainly functional and highly efficient.

As I said, problem solved.

Edit: If you want a warmer solution, I can find a way to install heaters.

…edit: minor typo



I admit I still don’t understand.
[puzzled]


That’s what you get when you are feeling instead of thinking.

Too few people use their brains these days. That’s why we get appeals to emotion in all sorts of advertising, while useful decision-making information is hard to come by.

It’s funny how often the machine ethics problem is punted down the line to “whenever we build a conscious machine”. When did we determine that consciousness is a hard, real system in the way that we perceive it to be?

I mean, déjà vu is probably not time travel, and hallucinations are probably not trans-dimensional beings. I think the idea that consciousness is at least a partially suggested experience, or even an entirely illusory one, is much simpler and more in line with what we know about brain function.

As long as we keep assuming that consciousness is as we imagine it to be, we’ll keep making machines that don’t live up to what we think of ourselves.


You’re very quick to completely dismiss perspectives other than your own.
