Check out how far we’ve come in just one year:
We’re looking at curated prompts from a public-facing company as it preps for commercialization. What about state actors? China, for example, employs a large share of the world’s AI researchers.
While transformer networks are going to lay waste to jobs across a variety of industries, they don’t bother me nearly as much as some of the Pandora’s boxes currently being opened in academia around autonomous systems. A great deal of work is going into chain-of-reasoning, world-model building, and other algorithms that can learn without human intervention. (See the “Dreamer” family of algorithms, for example.)
Some of this work is already being leveraged for so-called “slaughterbots.” Military-industrial contractors are working on this right now for Ukraine. (ref: War in Ukraine accelerates global drive toward killer robots)
To quote G. Hinton:
“If you want a system to be effective, you need to give it the ability to create its own subgoals. Now, the problem is, there’s a very general subgoal that helps with almost all goals: get more control. The research question is: how do you prevent them from ever wanting to take control? And nobody knows the answer.”
He then adds:
(Control, he noted, doesn’t have to be physical: “It could be just like how Trump could invade the Capitol, with words.”)
(ref: Why the Godfather of A.I. Fears What He’s Built | The New Yorker)
I will argue, as others have in the past, that the purpose of war is to change minds. Classically this has been done through force, but force is not the only way. Why launch a nuke, risking retaliation, when you can simply convince a population to dismantle itself at your behest?
Tools like these video generators, once combined with more autonomous control systems, have very real potential to rewrite our cultural notion of what is and isn’t real.
They have the power to change minds.