He does seem like the kind of person about whom, if someone told me his own mother hated him, it would be hard not to believe it.
Calling what machine learning algorithms do "reasoning" is almost as wrong as referring to them with the pronoun "whose."
This sounds ripe for a Little Britain or That Mitchell and Webb Look spoof.
Well, then that’s not what they really think or want, is it?
Oh sure. Find a way that makes me root for killer AI.
One of the requirements for special forces is the ability to follow orders.
I don’t know what the heck he means by "process of ultimate learning," but this is such a classic "if the government isn’t doing it, it couldn’t possibly exist." We’ve been studying its implications since someone first fantasized about a weird thing called a golem. We’ve been exploring ethics related to AI pretty thoroughly, just not in the form of a government-sanctioned blue-ribbon panel, probably the least useful of all forms of intellectual exploration.
I assume the proper kayak is there to rescue the rowers?
Most of our “AI” is basically chains of if statements combined with brute-force linear searches of bulk data. Like combining “strings” with “grep”. It’s as intelligent as a human with neurons hard-wired at birth.
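To make the caricature concrete, here’s a toy sketch of that description: hard-coded rules ("chains of if statements") plus a grep-style linear scan over a corpus. The function name, the rules, and the documents are all made up for illustration; real systems are obviously more elaborate, but this is the skeleton being mocked.

```python
# Toy "AI": a chain of if statements plus a brute-force linear search.
# All names and data here are invented for illustration.

DOCUMENTS = [
    "the cat sat on the mat",
    "neural nets beat humans at Go",
    "grep searches text line by line",
]

def toy_ai(query: str) -> str:
    # "Reasoning" as a chain of if statements.
    if not query:
        return "no input"
    if query.endswith("?"):
        query = query[:-1]
    # "Knowledge retrieval" as a linear scan of the bulk data.
    hits = [doc for doc in DOCUMENTS if query.lower() in doc]
    if hits:
        return hits[0]
    return "no match"

print(toy_ai("grep?"))  # -> grep searches text line by line
```

Strings-plus-grep, as advertised: no model, no learning, just pattern matching over whatever data you fed it.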
Thanks for the history lesson! I didn’t know that.
I find this passage particularly curious:
What will become of human consciousness if its own explanatory power is surpassed by AI, and societies are no longer able to interpret the world they inhabit in terms that are meaningful to them?
The Enlightenment is usually the development that gets blamed for annihilating people’s superstitious little hovels of ‘meaning’ and ‘society’ in favor of either epistemic nihilism and systematic doubt, or the dry, unsatisfying watchmaker deities of deism; along with the rationalized, bureaucratic mass standardization of formerly idiosyncratic organic community.
This argument is generally not without merit; though it is often advanced by the most disgustingly retrograde sorts, whose ‘meaning’ happens to involve especially atavistic epistemology and the allegedly god-ordained authority of old men in funny hats (often the same people advancing the argument, no doubt a huge coincidence).
Certainly the majority of people enthusiastic enough about antimodernism to be taking up arms about it are peeved about what it did to the purity of their premodern situations; and pine for the return of a social order that produces minds too small for doubt; but brimming with ‘meaning’, whatever that is.
I’m not sure whether to attribute this to the fact that Kissinger’s cultural context is heavily embedded in the rationalized, bureaucratic, technocratic entities of the late Enlightenment, so that he has no nostalgia for the romanticized past, combined with a cranky old guy’s dislike of what kids these days are up to; or just to a cranky old guy’s dislike of what kids these days are up to, with enough motivated reasoning to fill it out into something vaguely more respectable-sounding than that.
I would also be very curious to know if his horror of “retrieving and manipulating information” rather than “meaning” was present back when he was working on some of his better known projects:
One would think, for instance, that someone deploring brute, meaningless information retrieval might have had something to say about McNamara and his (very cutting-edge, if distinctly beta in practice) giant surveillance-network-guided airstrike plan; or McNamara’s general enthusiasm for systems analysis in government, private, and military applications.
Was he always of the opinion that good, old-fashioned Machiavellian intrigue was the path of humanists, and unnerved by the tech kids? Was that situation familiar and OK, but kids these days with their noise music and neural nets are just a bridge too far?
I’m not exactly sure what it would look like for ‘a nation’ to ponder something; but his proposal for a three-step process involving both ‘conceptualization’ and an absolute or ‘ultimate’ end state, where the assorted bits and pieces previously conceptualized are somewhat mystically unified, sounds pretty Hegelian to me.
It’s not strictly required (and many of the Hegelians who remain in philosophy are quite harmless); but it seems to be pretty common for Hegelians in politics to develop grand theories of history, posit the necessity of state-level actors to play vital parts in the trajectories of these theories, and subscribe to the position that you can’t achieve a synthesis without breaking a few eggs, so to speak.
The stuff the machine learning guys are doing is disgustingly Baconian by comparison: “Yeah, we just throw training sets at it until it starts generating useful outputs. No, nobody ever gains access to ‘meaning’ or some sort of underlying order. What are you talking about, anyway? But, yeah, you definitely can’t understand hard enough to beat it at Go; so there’s that.”
I hope I live long enough to find out that AI research leads to ultimately realizing that this is us. It’ll be good for a big hearty laugh. Great job guys, you created AI. Now we have all these things addicted to heroin and moping around talking like Henri the Cat.
I’m sure eventually AI autonomy will become a problem, but from my perspective, the immediate danger is that the people controlling what AIs do will be rich, power-mad, sociopathic a**holes like Kissinger.
After all, isn’t our nuclear arsenal more fearsome now that we have a drunken insecure fratboy at the wheel than when we had a more rational (but still hardly humanitarian) president?
My point was that you can’t pack up your moral obligations and hand them off to your sergeant. Humans are incredibly complex creatures, full of opposing motivations and emotions, sure. That said, if you are training death squads, or dropping napalm on Cambodian children, I’m not all that interested in your interior life. The majority of war crimes aren’t committed by sociopaths who joined up specifically to bayonet children, but by normal people who are ‘just following orders’. We don’t need to hand these people any more excuses.
“back in my day we didn’t abuse numbers like kids do today with their fancy computers”
Ironically, Dr. Strangelove (whom a few posters have referenced) is probably based not on Kissinger but on John von Neumann, without whom AI as we know it would probably not exist.
Maybe we need a thread for that.
You should see the portrait in his closet.
The world will be a better place when he’s finally rotting in a hole.
Yes and no. JvN was certainly no Nazi who had switched allegiances.
The bit where Dr. Strangelove explains how nuclear deterrence works — that’s pure Johnny.
Purely by chance, two of the Pythons, I think Idle and Palin, shared an elevator ride with our dear Henry in New York in the late 1970s.
They tried very, very hard not to burst into howls of laughter under the suspicious eyes of the security detail.