Yeah, the post title assumes its conclusion. The first question is whether it is even possible to “torture” a robot, not “Is it OK to torture a robot?”
It’s easy when they don’t make sense.
Try to make it more difficult. And less emotional. “Feeling” is not an argument. Nor a reaction worth consideration.
Oh geez. The video has NOTHING to do with torturing a robot. The developers are testing the robot’s ability to maintain balance while it is experiencing an unexpected perturbation. What good is a robot that falls over every time something bumps it? How hard of a push can the robot handle? The video shows that the software engineers have done a good job writing the control software: the robot maintains balance even when it has been pushed very hard.

The same goes for the robot picking up boxes. If you purchased robots to work in your warehouse, wouldn’t you expect the robot to be able to handle slippery boxes? What happens if the robot drops the box? Is it acceptable for the robot to simply stand there and do nothing because something unexpected happened? What would a person do? They would pick up the dropped box, put it where it belongs, then continue their task. That is what is being tested: how does the control software handle an unexpected input?
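For what it’s worth, the push-recovery test can be caricatured in a few lines of code. The sketch below is NOT Boston Dynamics’ actual controller; it is a toy PD control loop on a linearized inverted pendulum, with all dynamics, gains, and numbers invented purely to illustrate what “handling a perturbation” means in control terms:

```python
def simulate_push_recovery(kp=60.0, kd=12.0, push=1.5, steps=2000, dt=0.005):
    """Toy PD controller righting an inverted pendulum after a shove.

    The pendulum stands in for a balancing robot; the impulsive 'push'
    is the unexpected perturbation the control software must reject.
    """
    theta, omega = 0.0, 0.0   # lean angle (rad) and angular velocity (rad/s)
    g_over_l = 9.81           # gravity term for a hypothetical 1 m pendulum
    omega += push             # the shove: a sudden jump in angular velocity
    for _ in range(steps):
        torque = -kp * theta - kd * omega   # PD control law
        alpha = g_over_l * theta + torque   # linearized pendulum dynamics
        omega += alpha * dt                 # semi-implicit Euler integration
        theta += omega * dt
    return theta

theta_final = simulate_push_recovery()
print(abs(theta_final) < 0.01)  # True: the controller recovered upright
```

The point is simply that, to the control software, “the robot got shoved” is just a sudden spike in the state it has to drive back toward zero. Nothing in the loop is the robot “suffering”.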
Of course. But tell it to the oh-it-looks-so-wrong it-makes-me-feel-bad gang…
REALLY? We had no idea!?!?
(or, rather, “duh!”)
they may not make sense TO YOU, but that doesn’t mean they are wrong, or don’t ring true, or don’t make sense to others. You just don’t understand them. It’s not that you disagree; it’s that you completely dismiss others as not being worth trying to understand if you don’t agree with them. It’s rude and frankly hurtful to others.
Like this comment here. This isn’t about how this makes us feel; it’s a wider question about robots and the role they may or may not fulfill in the future. We may never be able to make a robot/android that passes the test for being a sentient creature. But if we do, these questions actually do matter. And as I said (but of course, you completely ignored, because you couldn’t care less), science fiction deals with these questions regularly already.
But once again, it would be helpful if, instead of demeaning people who disagree with you by completely dismissing anything they have to say that you don’t like or understand, you made an effort to understand them.
Well I’ve gotta ask you in return: Does this scenario bother you because you’re worried about the state of mind of a person who does that sort of thing for kicks? Or are you concerned for the feelings of the robots? Because I can get behind you on the one, but to me the other borders on absurdity.
Yeah, if somebody has converted their basement into a dungeon where they play out their torture fantasies, then I’m going to be a little bit concerned that they might one day want to graduate beyond fantasy. But I’m supposed to feel bad for a robot because it can’t “function as it would”? A toaster doomed to never again cook bread? A hair dryer with the element taken out so that it only blows cold air? A calculator used only to spell “boobies”? Poor things.
Oh right, “entities that appear to be experiencing something analogous to human experience”. How about a mannequin with a tape player in the back, with a cassette loaded in of someone crying and pleading “please stop”. That would have the appearance, in a vague analogue, of having a human experience. Am I supposed to feel bad for it? What if I swap out the cassette for one that says “oh yeah, hit me harder you bad boy”? Is that OK now? What if I swap out the cassette player for a tickle-me-elmo voice synth chip and call my mannequin a robot? Does that make it worse somehow?
To me, no. A microprocessor is a microprocessor. We may develop more advanced and humanlike software for the microprocessor to run but it’s still just a microprocessor. If you want me to care for a robot you need to show me something more than programmed responses. And yeah, if you develop a machine that cries when you kick it then I’ll be loath to kick it, because I’m an emotional being with a well developed sense of empathy and frankly it would make me uncomfortable. But the idea that you’d criminalize the act of making a machine simulate crying is still weird and foolish to me.
You’re scaring me now. This is the sort of puritanical thinking that ends up with children put into therapy for playing cops and robbers, using their fingers to “shoot” one another. The sort of thinking that leads to Christian fundamentalists clamouring for Black Sabbath to be banned. Should we lock up the cast and crew of Schindler’s List for “simulating genocide”?
Play and fantasy are a crucial part of learning and development. My mum was probably a bit taken aback to see me playing GTA, renting out a street-whore and then running her down with a car to get my money back. But that’s an “experience” that has stuck with me, it opened my eyes to the potential there is in this world for depravity and I still think on it from time to time. I’d argue that for most people such exploration is not desensitizing, in fact it encourages sensitivity. In a world where violent computer games are increasingly available, violence continues to decline.
It’s admirable that the thought of violence makes you uncomfortable. But I don’t think you’re being rational about this.
When?
Yeah, who?
It should. But the appearance of sentience isn’t sentience. You can put a cute face on a calculator but it’s still just a calculator.
Usually when you, personally, can’t think of a response, it means the other person has made a better argument than you. The decent thing to do would be to admit that, instead of obfuscating with gifs. Or, you could explain how programming a robot to feel pleasure is “cold”? You seem confused.
Not to speak for @shaddack but it seems not only did you contradict yourself here but in fact this very much is about feelings.
It’s very easy to trigger empathy and sympathy in people using software. Ultimately software runs on hardware and is thus about things. Things which by nature are created, replicable and ultimately disposable/expendable.
Engineers must perform under a standard of professional behavior that requires adherence to the highest principles of ethical conduct.
Forget it, @anon61221983 … It’s Robot Town.
I propose something, and I get an animated gif. After a while of this question-and-gif-“answer” ping-pong, I get told that my proposal is “cold”, whatever that means.
It’s difficult to not dismiss a dismissive animation.
And I proposed a way to solve a concern. Only to be dismissed and not even told why.
When I do my half, it turns out that the people are less than cooperative about doing their half.
Most of the advice is sound, but the thing was written by an incorrigible idealist. In the real world one has to cut corners here and there to get paid or to get ahead.
Simulated murder! Somebody call the cops!
The tendency toward animated gifs here really is annoying.
That, and given the existence of people who either cannot see them at all, or cannot read the expressions or recognize the faces shown, it is also ableist.
Some people should apparently check their privileges.
Is a pretty good point.
You seem to think that everyone else is as judgmental about people like you as you are about people who are not like you.
Most of us aren’t. We simply recognize that there is more than one way to be a worthwhile human being.
I’m not saying that people shouldn’t be able to simulate murder in virtual environments. It’s very much legal, and enormously popular today. I’m just wondering if our laws are going to react to the growing fidelity of our murder simulations.
The citizens of Los Santos aren’t suffering in any real way, that much is clear. Perhaps by Grand Theft Auto 9 the verisimilitude will be much more jarring for sensitive onlookers. I’m still not proposing that we prevent people from wreaking virtual havoc, just wondering when the Helen Lovejoys of the world will find virtual murder too much.
Virtual sex is already restricted by law to behaviour that’s legal in meatspace. No being suffers in any video-game unless we consider the player a victim, and yet this victimless crime has real punishment. I’m wondering when or if virtual violence will end up in the same situation, or if this uneven regulation will stand.
It probably depends on whether or not somebody comes up with a sick enough (to mainstream sensibilities) game that gets press. Then we’ll find out how the virtual 2nd amendment holds up…
I just tend to be annoyed by those who want to show off how “good” they are, how they “think about issues”, while knowing nothing about the underlying reality. And quite often they decide about money flow, or even make laws. Or at least attempt to shape the discourse.
Okay with me, as long as it does not get in the way of engineering/R&D.
…which it rather often does, as way too many “ethicists” have no problem with prosthetic technology, but it suddenly becomes Wrong when the same tech is intended to enhance the capabilities of “healthy” people. And they get annoying with questions like what it is to be human, which don’t have answers anyway, at least not based on anything but crazy handwaving. Of course, philosophers have mortgages too, so it is somewhat understandable; nevertheless, they are acting as roadblocks and should be steamrolled over. Which is why we need lots of small labs where individuals or little teams can do the job with cheap, affordable instrumentation and locally synthesized pharmaceuticals (where needed), outside the reach and influence of such old farts with an agenda and/or a warm chair on some committee.
…while the military is usually already working on it, but in secrecy, with civilians benefiting only after way too many years of delay…
Hopefully not at all. Or at least hopefully they will be as easy to ignore as downloading a file and running an installer.
Who have the freedom to not look.
As long as they won’t do more than just scream in powerless concern, it’s okay. It becomes worse when they start getting actual power.
Which is wrong. Politicos have no business sticking their grubby paws and snotty noses into what happens in people’s heads, and by extension into their writings and animated 3D models. Thoughtcrime is thoughtcrime.
Which is badly wrong.
Given the past behavior of various do-gooders, I expect it to at least locally be a matter of “when”. Then they wonder why they are so despised.
Deregulate the virtual and deregulate it NOW.
I am actually thinking about a peripheral: a configurable chassis that could be adapted to almost any kind of gun or rifle, with a pneumatic cylinder to simulate recoil and a set of sensors for position/orientation reading. For immersive VR games or training scenarios. It could be quite popular as an alternative to having to go to shooting ranges and spend money on ammo. And it is likely to piss off quite a range of hoplophobes and do-gooders. If most of the mechanics is 3D-printable, and the electronics uses webcam tracking of light points, or those cheap 6 or 9 degree-of-freedom sensor boards for quadcopters, or a laser pointer and a webcam, or anything else that costs a few dollars from AliExpress, attempting to restrict its availability could be a popcorn-worthy exercise in futility.
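The orientation-tracking half of that idea is mostly a sensor-fusion problem. As a rough illustration only, here is a minimal complementary filter in Python; the gains, update rate, and scenario are all made up, and a real build would feed in live gyro/accelerometer readings from the IMU board instead of the simulated values below:

```python
import math

def accel_to_pitch(ax, az):
    # Pitch angle (radians) recovered from the measured gravity vector.
    return math.atan2(ax, az)

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    # Blend fast-but-drifting gyro integration with noisy-but-stable
    # accelerometer tilt; alpha weights the gyro path.
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Simulated scenario: the chassis is held at a constant 0.1 rad pitch,
# so the gyro reports no rotation and the accelerometer sees tilted gravity.
true_pitch = 0.1
ax = 9.81 * math.sin(true_pitch)
az = 9.81 * math.cos(true_pitch)

pitch = 0.0  # filter starts with no knowledge of the orientation
for _ in range(300):  # 3 s of updates at a 100 Hz sample rate
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_pitch=accel_to_pitch(ax, az),
                                 dt=0.01)
print(f"estimated pitch: {pitch:.4f} rad")  # converges toward the true 0.1 rad
```

That is roughly all a cheap 6-DOF board needs in firmware to report a usable aim angle; the webcam/light-point variant would replace the accelerometer term with an optical position fix.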
“What do you feel when shooting virtual people?”
“Recoil.”
And?
They are humans, so they often don’t do it anyway.