OpenAI abruptly fires CEO Sam Altman for lying

It would be interesting to know, among the various tech CEOs now peeved, how much of the split comes from Altman being on the side of rash commercialization, which is where the money is (Nadella, in particular, has been front and center in making Microsoft the de facto owner of OpenAI to the degree its organizational structure allows); and how much is a tribal thing: when it's for-profit boards rammed full of C-levels overseeing people they view as peers and colleagues, you almost never see a firing that nuclear unless the cadaver dogs or the forensic accountants leave them absolutely no choice. Even the reasonably lurid sexual-impropriety cases often get the opportunity to resign rather than be pushed out, and the garden-variety difference-of-strategy fuckups routinely get retained to oversee the transition. Altman, by contrast, got fired like one of the little people, which probably doesn't play well among those who identify culturally with him.

There was money involved (how could there not be?), but if TESCREAL ideology played a part, it’ll get very weird.

[GIF: weighing options, "Are you sure?"]

Oh, but I hear he’s a mensch, so… how can that be true! /bitter SSSSSS


Yikes. Any fans of Yudkowsky are going to be nutters.

I just don’t know if I buy, on the information we have thus far, that there are two sides and only two sides. I’m not exactly for Microsoft getting full control over OpenAI’s products and opening up the commercial streams on every front… AND I don’t identify as a TESCREAL nutter. But it seems like everyone in tech is already deciding that this is what happened here.

OK, agree, and reverse that. Sama was clearly trying to commercialize, but this was a very personal firing in my opinion. Unless other, really damaging behaviors emerge, no responsible board fires its CEO with such a lack of care for the corporate reputation and its partners unless the firing is a personal grievance connected to an internal power play. This should be embarrassing to everyone involved, and sama has a real grievance here; likely legal repercussions, too. Of course, if they really did just invent AGI and sama indicated an intent to monetize it, that might cause an AGI-doomer board to act without caution. But I'd think even in that case it would be an argument best worked out behind closed doors. This reminds everybody of Jobs, of course, but perhaps another example is Gary Gygax at TSR back in the '80s.

As far as those allegations, this post covers it pretty well:

They seem to have some troubling connections there, my friend… Not sure if that’s the best “evidence” to post…

this is not the best link, still looking around

I guess your experience doesn't include cooking the books, putting the company at risk of large settlements, doing something that could get it in trouble with the cops or the SEC, or lying to the board.

Things that the company might not want to advertise.

But I hear he’s a mensch! /s

Until some more information comes out about that, I’ll just stick to my original suspicions.

tl;dr:

So why would our tech barons, and their legions of anonymous libertarian groupies, create these depraved scenarios? They fear that a true AI will be too much like them. Like Rush Limbaugh claiming to have had a nightmare about being a “slave building a sphinx in a desert that looked like Obama,” privileged people who fear an AI rebellion always imagine it in exploitative terms that mirror their own ideologies. They fear their ethics being turned back on them.

As silly as the TESCREAL-informed doomerism is, I'll admit I prefer it to the more dour and blunt-force global-thermonuclear-war doomerism that most of us over age 50 grew up with. At least it involves techno-utopian wanking and esoteric philosophical debates along with the "ZOMG we're all gonna die!"

That said, the muse Clio has proved herself a prankster over the last decade, so global thermonuclear war it might be anyhow for the apocalypse. I doubt many of us anticipated another old-fashioned tanks/artillery/infantry war in the Eastern European Bloodlands either, but here we are.

Or like the “Slow AIs” they serve and profit from.

Stross thinks that corporations are “slow AIs” that show what happens when we build “machines” designed to optimize for one kind of growth above all moral or ethical considerations, and that these captains of industry are projecting their fears of the businesses they nominally command onto the computers around them.

Please. They seem upset that he was pushing commercialisation faster than they were comfortable with (if only in terms of the regulatory issues they suddenly had to confront). They also may have been annoyed that he was allegedly using his role and prominence in the company to build positions in other companies, some of them potential competitors (though why anyone on the board would be surprised by a co-founder of Y Combinator doing that is a mystery).

That, and the nature and timing of the termination announcement, indicate that none of this is really personal, and suggest that the threat of legal exposure re: shareholders/stakeholders (one that would outweigh a wrongful-termination suit by Altman) might have played a part in the decision. Whatever lack of "consistency and candidness in his communications" existed, they clearly felt it might have a substantive negative impact on the company.

In any case, the group most devastated by his firing is the lazy-minded tech press. They spent the months after Bankman-Fried was exposed as a crook trying to slot in Altman as their next ubiquitous “golden boy genius tech entrepreneur” – a non-grifty Sam 2.0. Now they have to start all over finding their next “Sam”. I suppose the fanbois trying to make “sama” happen (so fetch!) are also having a sad today.

Three days after the article was published, someone asked Yudkowsky on social media: “How many people are allowed to die to prevent AGI?” His response was: “There should be enough survivors on Earth in close contact to form a viable reproductive population, with room to spare, and they should have a sustainable food supply. So long as that’s true, there’s still a chance of reaching the stars someday.”

jeezuz!fucking!christ! nutjobs everywhere!

Sites like Less Wrong and Slate Star Codex tend to be a draw for the kind of privileged and credentialed white guys who in an earlier age would have gushed over Malthus and Galton. In the best-case scenarios for these sites, their founders (usually Californian Ideology types) continually have to kick out fascists and racists and misogynists who see validation for their toxic views in TESCREAL.

At least he acknowledges the need for an adequate food supply, which is something (along with bear spray) the AynCaps always forget in their fantasies.

[GIF: Sarah Silverman, "eww"]

FTA:
"We give too much of our collective power away to white, cis men with money and privilege, then wonder why industries in which they dominate, like tech and finance, are so filled with apparently soulless egotists who feel entitled to bend the world “to their will.” We bend over backward to protect perceived genius, even when those “geniuses” are being accused of terrible wrongs, because we want so much to believe that someone, some singular, brave, usually male individual will be smart enough and strong enough to save us all.

What we need to remember is that we, together, can save ourselves, and that all-powerful “tech bros” are only as powerful as we allow them to be"

Fuckin' A, comrade. Fuckin' A.