Even I, a human, know this is missing the point of what cats like about keyboards, which is mainly to do with the small rodent-like finger movements they attract.
That’s an interesting topic. I sometimes think about the profligate way we spend processor cycles - e.g. having a data center recognise the phrase “alexa 50% volume” rather than press a button - and I wonder if that’s something that will horrify future generations. But if you’re going to be heating your home anyway, you could have 50lbs of old phones maxing their CPUs encoding video (or whatever), and that computation would be 100% efficient since you’re using all the waste energy.
This also prevents the foodmonkey from figuring out how the cat manages to launch (or shut down) multiple programs with a few quick, seemingly random pawstrokes.
The main practical limitation is finding things that are worth doing but which you don’t need (much, if any) access to during the summer. As you note, computer heat is as efficient as any other electric heat, and gives you features that the more basic ‘big resistor’ school of electric heating does not.
I’ve got a few of my less practical obsolete servers out and prepping for fall; they’ll get a lot more use once their thermal output is a virtue, or at least not a vice. But because of the ‘not so much during summer’ problem, they’re only there because they cost me almost nothing to acquire and tinkering amuses me; for any actual computing power I want, I either have to suck it up and pay for a more efficient option that won’t roast me alive half the time, or just admit it’s excessive and not use it.
(My current conundrum is with excess in storage: 10GbE (or fibre channel, if that’s your fancy) has gotten delightfully cheap on the used market, so I’m tempted to replace my primary computers’ bulk storage with a nice ZFS array; but if I do that I’m pretty much committed to keeping that system up and running if I want to use one or more of the others, which is a problem; but locally attached storage just feels so crude…)
In the vein of tensions, datacenters are sort of an odd case: the hyperscale types are likely to be the most efficient guys around (barring things like ultra-long-duration sensor microcontrollers and such): when you run ten zillion servers the engineering costs of even slight improvements pay off rapidly; and if you are doing low-margin things (like commodity HTTP-shovelling) you can’t afford to be wasteful.
Of course, when you are so lean and efficient, you can keep the lights on by doing things that are just slightly above abjectly worthless; which makes for horrifying waste in the broader sense of ‘our society spends how much annually on automated banner ad auctions?’.
Yeah, the problem (as with all the other forms of heat recycling we could be doing) is the logistics rather than the technology. But it occurred to me after I posted that, with relatively high-performance electronic devices, there might actually be mileage in this. Phones and computers (and game consoles and routers and set-top boxes) can burn through quite a lot of power in a small space, and a 2kW heater made of old phones could do a lot of computing. And it wouldn’t be too hard for regulators to require devices to be made with this kind of reuse in mind.
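As a back-of-envelope sketch of that 2kW figure (the per-phone wattage here is an assumed number, not a measurement):

```python
# How many old phones at sustained full CPU load would it take
# to match a typical 2 kW space heater?
HEATER_WATTS = 2000
PHONE_LOAD_WATTS = 5  # assumed sustained draw of an old phone SoC under max load

phones_needed = HEATER_WATTS // PHONE_LOAD_WATTS
print(phones_needed)  # 400
```

So a 2kW phone-heater is hundreds of devices, which says something about the logistics problem too: the packaging and networking matter more than the silicon.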
For reasons not clear even to me, I made a fun diagram
Another problem is that for heating, 100% efficiency is actually quite low. With a heat pump it gets more like 300% (a coefficient of performance of 3), at the cost of increased complexity.
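The comparison in numbers, taking a COP of 3 as an assumed typical figure for an air-source heat pump:

```python
# Heat delivered per kWh of electricity consumed.
electric_in_kwh = 1.0

# Resistive heating (computers included): every joule of input becomes heat.
resistive_heat_kwh = electric_in_kwh * 1.0

# Heat pump with COP ~3: the same input also *moves* heat from outside.
heat_pump_heat_kwh = electric_in_kwh * 3.0

print(resistive_heat_kwh, heat_pump_heat_kwh)  # 1.0 3.0
```

So compute-as-heat only breaks even against the simplest electric heating; against a heat pump it delivers about a third of the warmth per kWh.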
Having said that, I like your idea of moving waste heat from computers into a water heater - most water heaters are still not equipped with heat pumps, so it makes sense to get any additional value out of electric energy that would be spent on heating anyway.