Sounds like he needs to ease up on the acid.
/Degree in Computational Biology
/Software Engineer for almost 30 years, so much for that degree.
World leaders ink AI safety pacts while Musk and Sunak engage in awkward bromance
[…]
Bringing to a close the UK’s AI Safety Summit yesterday was the unedifying spectacle of Prime Minister Rishi Sunak interviewing tycoon Elon Musk, the world’s richest man.
Beginning with nauseating sycophancy, Sunak read Musk a Bill Gates quote, saying there was no one in our time who had done more to push the boundaries of science and innovation than the Tesla CEO. It’s perhaps a shame Musk doesn’t seem to share the same respect for Gates. Perhaps not.
[…]
Sky News TV journalist Sam Coates declared “it was all just mad,” involving 40 minutes of “softball questions” in which Musk agreed to take questions from business leaders, but not journalists.
Coates said Sunak didn’t see Musk as a political force prone to voicing views on the Ukraine war and Gaza/Israel, with interests in the politically powerful Starlink satellite system, and failed to challenge him. Instead, Sunak focused on “selling Britain to this guy who is already the richest man in the world” in the hope Musk’s celebrity sheen might rub off on him.
“Half of Cruise’s 400 cars were in San Francisco when the driverless operations were stopped. Those vehicles were supported by a vast operations staff, with 1.5 workers per vehicle. The workers intervened to assist the company’s vehicles every 2.5 to five miles, according to two people familiar with its operations. In other words, they frequently had to do something to remotely control a car after receiving a cellular signal that it was having problems.
Very neato, but not even close to “working.””
If only someone could invent a system where they’d only have to devote one worker per vehicle…
And, hear me out, I’m, like, totally thinking outside the box here - what if this system would use vehicles that can seat more than just four or five people?!?
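For scale, a quick back-of-envelope sketch using only the figures quoted above (one assumption: that the reported 1.5 workers per vehicle applies across the whole 400-car fleet, which the excerpt doesn’t spell out):

```python
# Back-of-envelope from the quoted Cruise figures (staffing ratio assumed fleet-wide).
total_fleet = 400          # Cruise vehicles reported
sf_share = 0.5             # "half of Cruise's 400 cars were in San Francisco"
workers_per_vehicle = 1.5  # reported operations staffing ratio

sf_vehicles = total_fleet * sf_share
implied_staff = total_fleet * workers_per_vehicle

# Interventions reportedly happened every 2.5 to 5 miles.
for miles_between in (2.5, 5.0):
    print(f"one intervention per {miles_between} miles "
          f"= ~{100 / miles_between:.0f} interventions per 100 miles")

print(f"SF vehicles: {sf_vehicles:.0f}, implied support staff: {implied_staff:.0f}")
```

Which is roughly 600 people propping up 400 “driverless” cars, or 20–40 remote nudges per 100 miles driven.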
Juicero, now with AI!
Why didn’t they fire all those people along with the AI ethics ones?
Rearranging elements, even when there is a googol or more of elements to choose from, is still not the same as coming up with at least one original new element from… where? A misfiring neuron?
AI models are not very good at coming up with outputs outside of their training data.
I suppose they were going to get there eventually.
FTA (abstract):
Together our results highlight that the impressive ICL abilities of high-capacity sequence models may be more closely tied to the coverage of their pretraining data mixtures than inductive biases that create fundamental generalization capabilities.
How fortunate that they’ve phrased their results in a way that Google’s bosses are likely to overlook.
It’s a shame that, in today’s economy, there’s no money to be made in being right…
Who, me? Bitter? Nah…
Sounds good, but are they using cameras and tracking? Will I need CV dazzle for my car now for the inevitable expansion into Project Yellow Light, Project Not-So E-ZPa$$, and Project Po-Po Plate Info?