Thanks. However, you’re misreading what I wrote, which was:
I’d guess that some of them who would be empathetic in other situations just declined the invitation to feel empathy for a pencil. I’d hope I would have (and I do have empathy).
I didn’t say that I hoped I would have felt empathy for the pencil. I said the opposite, that I’d hope I would have declined the invitation to do so (and thereby not fallen for the prof’s ruse).
And the reason I said so was by way of hoping, in turn, that a lack of empathy isn’t the only reason that students don’t gasp when the prof breaks the pencil’s back (ha). I find hope for the future in my faith that some of us will continue to resist the temptation to feel empathy for nonhuman (or mammal, or whatever) trickery.
Related: I don’t have a lot of experience with chatbots yet, but as for exhibiting empathy toward physical machines, I’ve seen firsthand just how powerful that impulse is, even for the people building the machines — and these are people who definitely know better. I was recently involved with a project that has some very cute robots that interact with the general public, and it was a constant struggle to keep everyone focused on questions of machine safety, potential pinch hazards, etc., rather than worrying about whether the various tests we were performing might hurt the little guys or just seem too mean. And these robots couldn’t even speak English or protest their treatment. I don’t know what happens when, in the future, we have robots that emulate human reactions to being poked or prodded. Maybe we’ll have to hire a bunch of sociopaths to do the testing?
… of course Turing didn’t either — computers of the ’40s and ’50s were glorified calculators
I don’t know if Turing thought of women as machines, or whatever Stross is trying to say, but I can’t help noticing that the scenario involves literally putting somebody in a closet
It fits well with other pioneering ideas about consciousness, like how Sigmund Freud discovered that people get horny and Joseph Campbell discovered that stories have plots.
The group describes how they pulled questions from the Stack Overflow website, posed them to ChatGPT, and then measured the accuracy of its responses.
That sounds pretty softball, but it still failed.
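For concreteness, the kind of evaluation loop that study describes might look roughly like this. This is a minimal sketch only: the `ask_model` stub, the keyword-based grading rule, and the sample questions are all illustrative stand-ins, not the study’s actual method or data.

```python
# Sketch of a question-answering accuracy measurement, loosely in the
# spirit of the study described above. Everything here is illustrative.

def ask_model(question: str) -> str:
    # Hypothetical stand-in for a real chatbot API call.
    canned = {
        "How do I reverse a list in Python?": "Use list.reverse() or slicing [::-1].",
        "What does 'git rebase' do?": "It re-applies commits on top of another base.",
    }
    return canned.get(question, "I don't know.")

def grade(model_answer: str, keywords: list[str]) -> bool:
    # Crude correctness check: does the answer mention the expected keywords?
    return all(k.lower() in model_answer.lower() for k in keywords)

def measure_accuracy(questions: list[tuple[str, list[str]]]) -> float:
    # Fraction of questions whose answers pass the grading rule.
    correct = sum(grade(ask_model(q), kws) for q, kws in questions)
    return correct / len(questions)

questions = [
    ("How do I reverse a list in Python?", ["reverse"]),
    ("What does 'git rebase' do?", ["commits"]),
    ("How do I exit Vim?", [":q"]),
]
print(measure_accuracy(questions))  # 2 of 3 answered correctly
```

Real evaluations of this kind hinge on the grading rule — keyword matching is the crudest possible choice, and published studies typically use human raters or test suites instead.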
Stupid AI shitmongers started by claiming that no one should get into computer science or programming, except as a tech-priest to service the AI. All existing programmers should learn journalism.
Then AI was going to be the miracle assistant that’s like “demigod mode” for programmers.
But it’s always going to be like pair programming with an idiot who never learns from experience: more of a time sink than a help, plus you’re shackled to some company’s tool.
And you know what: there’s nothing magical about programming. It’s going to be the same for any task that requires more than an expert-system decision tree.
The sooner this bubble of rancid baloney pops the better!