You can call me AI

I’m struggling to put my finger on it, but I’m 99% convinced that NVidia is riding a wave based on exactly the wrong kind of fundamental algorithms for anything approaching what we would consider strong AI. Their hardware is architecturally incapable of running the right algorithms, at least with any efficiency, leading to an explosive O(N^d), d ≥ 2, cost to brute-force the only thing they can do in the general direction of what looks right. While numerous and inexpensive, NVidia’s graphics cards provide a decidedly finite number of monkeys, with diminishing marginal benefits.
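A back-of-the-envelope sketch of that diminishing-returns claim (pure arithmetic, nothing NVidia-specific; `N` and `d` are just the symbols from the paragraph above): if brute-force cost grows as N^d, then a compute budget C only buys a problem size of N = C^(1/d), so each doubling of hardware buys a smaller and smaller increase in N.

```python
# If brute-force cost scales as N**d, the largest feasible N for a given
# compute budget C is C**(1/d). Doubling the budget barely moves N.

def feasible_n(compute_budget: float, d: int) -> float:
    """Largest N whose N**d brute-force cost fits within the budget."""
    return compute_budget ** (1.0 / d)

for d in (2, 3):
    base = feasible_n(1.0, d)
    doubled = feasible_n(2.0, d)
    print(f"d={d}: doubling compute grows N by only {doubled / base:.2f}x")
# d=2: doubling compute grows N by only 1.41x
# d=3: doubling compute grows N by only 1.26x
```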

When I figure out the correct algorithm, I’ll let you know… :thinking:

While my chances of that are low, one thing I can guarantee is that I’ll have to code it myself.
