Crypto mining involves brute-forcing cryptographic hash puzzles, and the more cores you have working in parallel, the faster you can search for a solution.
While a normal CPU might have 4 or 8 cores, graphics cards can have hundreds or even thousands. The better the card, the more cores it has and the faster they run. It’s also much easier to just slap in another graphics card than to add CPUs.
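To make the “math problem” concrete: mining is a hash preimage search, and it parallelizes trivially because each core can grind through its own disjoint range of nonces. Here’s a minimal Python sketch of that search loop. It uses single SHA-256 over a made-up header for simplicity; real Bitcoin hashes an 80-byte block header with double SHA-256, but the shape of the loop is the same.

```python
import hashlib


def mine(header: bytes, difficulty_bits: int, max_nonce: int = 2**32):
    """Brute-force a nonce so that SHA-256(header + nonce) falls below a
    target with `difficulty_bits` leading zero bits. Returns the nonce,
    or None if the search space is exhausted."""
    target = 1 << (256 - difficulty_bits)  # hash must be numerically below this
    for nonce in range(max_nonce):
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
    return None


# 16 bits of difficulty means roughly 65,536 attempts on average --
# easy for one core. Real networks crank the difficulty so high that
# only massive parallelism (GPUs, then ASICs) pays off.
nonce = mine(b"example block header", difficulty_bits=16)
```

Parallelizing is as simple as giving each worker a different starting nonce and stride, which is exactly why thousands of dumb-but-plentiful GPU cores beat a handful of clever CPU cores here.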
At first I thought you meant, as in, “It fairly gives pause to one’s faith that the market will not fund things that are objectively pointless and harmful”
They could put out a firmware update to brick all of their previous cards and then corner the whole bitcoin mining market. It would be like a 21st-Century update to the plot of Goldfinger! I love it! Get Bruckheimer on the phone! Let’s get this movie made!
At the same time, the Chinese province of Inner Mongolia has banned new crypto-mining outfits. That’s significant because it’s where much of the world’s Bitcoin is mined, thanks to cheap but heavily polluting coal power.
If that “mining card” can be used as a general-purpose OpenCL/CUDA processor, it would be a lot more interesting than if it’s mining-only. In any case, it’s a good way to use up chips that didn’t pass muster for actual graphics but passed for compute work. Such binning goes on all the time with CPUs.
Of course, then you might wind up with… supply problems, as the miners and the AI/big data researchers compete for compute cards. Perhaps they would be less acute than the current shortage of full-fledged GPUs, though, and it would relieve the pressure on GPUs.
It should still be in stock, and it ships pretty quickly. Installation is a bit of a pain. Resolution and frame rates are a bit lacking. But no one is going to buy one for mining bitcoin.
It depends on the performance characteristics. I remember years ago ASIC (application-specific integrated circuit) USB sticks got popular for Bitcoin mining. As far as I know, they didn’t have other uses. Even if such a card had great compute properties, I imagine bandwidth and memory constraints would make it impractical for most other use cases.