Just a nitpick: Moore's original observation was framed around minimum cost per component, so in effect it says that the cost of a given unit of computing power will halve. So at some point consumers will just buy (and already are buying) cheaper and cheaper machines until they become the supposed internet of things. The only plateau that will happen is if chip manufacturers slow down their efforts at reducing device cost in an attempt to adhere to the strict density formulation of Moore's rule-of-thumb.
What this means is that, to keep revenues up, the internet of things will become a marketing gimmick to keep people replacing their badly-designed, poorly-made, consumer-quality crap. So not a quantum computing singularity so much as the same old crap, but with a supercomputer inside. A supercomputer in year-2000 terms. Not an AI. AI research will have to focus on structures that mimic biological systems, and such systems will get past any "plateau" in Moore's Law by virtue of faster processing from electron/photon transmission rather than neuronal biochemistry. So AIs will be physically large. No Spike-Jonze-style Samantha in a phone, but maybe a Samantha in a big lab with a constellation of interface phones on a family plan.
I think Moore's Law may be more of a general industrial-technological norm: at the bleeding edge of any technology, when the entire economy balances on that technology and enough people think about and work on designing, building, and testing it, cost per unit of awesomeness will fall by half every 18 months. Maybe something other than cost is the real driver of this General Rule of Thumb of Economic-Industrial Awesomeness. Productivity and market size go in there somewhere too, in ways that aren't captured directly in "cost".
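Just to put numbers on that rule of thumb (these are my own illustrative figures, not anything from Moore's paper): halving every 18 months is plain exponential decay, and a couple of lines of Python show how fast it compounds.

# Toy projection: cost halves every 18 months (illustrative numbers only).
def projected_cost(initial_cost, months, halving_months=18.0):
    return initial_cost * 0.5 ** (months / halving_months)

start = 1000.0  # hypothetical cost of a "unit of awesomeness" today
for years in (1.5, 3, 6, 12):
    print(f"after {years} years: ${projected_cost(start, years * 12):,.2f}")

Run that and the hypothetical $1000 unit is down to $500 after 18 months and under $4 after 12 years, which is the whole reason the extrapolation gets people talking about singularities in the first place.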
But then, Moore's Law is just a rule of thumb anyway, and if it fails to predict the future, all it means is that an extrapolation that never meant anything in the past (in terms of AIs and quantum singularities) will mean even less in the future.