Amazon's AI-powered "Just Walk Out" checkout option turns out to be 1,000 workers watching you shop

As I’ve written before, hallucinations are a feature, not a bug. These models do not “know” anything. They are mathematical behemoths generating a best guess based on training data and labeling, and thus do not “know” what you are asking them to do. You simply cannot fix them. Hallucinations are not going away.

e/ Actually, in my view it’s not frightening but rather encouraging; hopefully they’ll all abandon the shit they can’t fix. But that’s wishful thinking, I guess.