You can call me AI


I should do more testing, but in my experience, it's even worse than that. I found that, when prompted for code to do a specific thing, the result would often consist of boilerplate code with nothing in place of the asked-for capability.

I mean actually nothing, not even code that was wrong. Sometimes there'd be a comment like // Do the prompt thing here.
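To illustrate (a made-up example, not an actual transcript): ask for, say, a function that removes duplicates from a list, and what comes back is just the shell:

    // Hypothetical reconstruction of the kind of output described above:
    // the asked-for capability is simply absent, replaced by a placeholder comment.
    function removeDuplicates(items: string[]): string[] {
      // Do the prompt thing here
      return items;
    }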

From what I've heard, GitHub's Copilot might be more useful, though what I've actually seen is essentially a high-quality autocomplete. Helpful, but not earthshaking.

As I say, though, I should try some more actual testing. YMMV.


A fancy autocomplete could be useful, but not when it sends your drafts to Microsoft.


That's going to be a problem for a lot of places with security responsibilities.


"Quick! Slap some shitty shiny AI on our product!"

Powered by buzzwords!

Today, Red Hat announced a developer preview of Red Hat Enterprise Linux AI (RHEL AI), a foundation model platform to seamlessly develop, test and run best-of-breed, open source Granite generative AI models to power enterprise applications.


It doesn't matter if it's shit. We'll polish it!


Who stole whose property? The content creators might want a word here.

Companies once again trying to claim ownership of stuff they can't own if they want to keep Section 230 safe-harbor protection. (Not that I'm cheering for the AI scrapers.)


OpenAI co-founder Wojciech Zaremba: "We have a Slack channel dedicated to the welfare of Artificial Intelligence."


MS Word sure is a strange beast. The company pretty much perfected the word processor by the early 1990s, and the vast majority of changes they've crammed in since have ranged from the unnecessary to the outright infuriating (did anyone like Clippy?).

But the company can't seem to leave well enough alone, so now we've got bots sharing our private business documents, correspondence, and horny fanfic with Satya Nadella…


No, Big Tech has distracted the world from the existential risk of Valley Bro con-artist billionaires.


[Embedded Tumblr thread screenshot (users saint-soap, gayweedanimal, bunjywunjy, sadcena): "Google tells you to kill yourself"]

(Turing was very gay and had a sex-segregated education and then work life: he basically didn't know women, and his alienation is palpable. But today's techbros don't have any such excuse, and the emphasis on femme-coded AI is… telling.)


"Only half his students had empathy" is another way to look at that.


Hmm, maybe a few of them who didn't gasp did have empathy?

I'd guess that some of them who would be empathetic in other situations just declined the invitation to feel empathy for a pencil. I'd hope I would have declined, too (and I do have empathy).


Would you also hope and expect to have empathy for an "AI" algorithm like a Google chatbot? That question, and its societal implications, were probably the point of the demonstration. Having empathy for something that seems (very superficially) lifelike is absolutely part of human nature for well-balanced individuals, but having such empathy for commercial products that absolutely are not worthy of your or anyone else's empathy may get us into trouble.

[GIF: "yeah yeah we know" (Seth Meyers, Late Night with Seth Meyers)]


My apologies if that came off as mansplaining. I was just fixated on your statement about how you "hoped" you would have felt empathy for the pencil even though you know it didn't need your empathy, and I wanted to explore that reaction. I would have no doubt that you are an empathetic person even if you didn't care about the pencil.