Moore's Law may be plateauing

Just a nitpick: Moore’s original idea was that the cost of a given unit of computing power will keep halving. So at some point consumers will just buy (and already have bought) cheaper and cheaper machines, until those machines become the supposed internet of things. The only plateau that will happen will be if chip manufacturers slow down their development efforts at reducing device cost in an attempt to adhere to the strict density formulation of Moore’s rule-of-thumb.

What this means for the internet of things is that, to keep revenues up, the internet of things will become a marketing gimmick to keep people replacing their badly-designed, poorly-made, consumer-quality crap. So not a quantum computing singularity so much as the same old crap, but with a supercomputer inside - a supercomputer in year-2000 terms. Not an AI. AI research will have to focus on structures that mimic biological systems, and those systems will be able to get past any “plateau” in Moore’s Law by virtue of faster processing from electron/photon transmission rather than neuronal biochemistry. So AIs will be physically large: no Spike-Jonze-style Samantha in a phone, but maybe a Samantha in a big lab with a constellation of interface phones on a family plan.

I think Moore’s Law may be more of an industrial technological norm: at the bleeding edge of any technology, when the entire economy balances on a particular technology and sufficient numbers of people think about and work on designing, building, and testing, cost per unit of awesomeness will fall by half every 18 months. Maybe there is some measure other than cost that works in this General Rule of Thumb of Economic-Industrial Awesomeness. Productivity and market size go in there somewhere, too, in ways that aren’t seen directly in “cost”.
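
Just to put numbers on the halving claim, here’s a minimal sketch of the compounding it implies, assuming the 18-month period above and normalizing the starting cost to $1.00 (both figures are purely illustrative):

```python
# A sketch of the rule of thumb above: cost falls by half every 18 months.
# The 18-month period and the $1.00 starting cost are illustrative, not measured.
def cost_per_unit(months, start_cost=1.00, halving_months=18):
    """Cost per 'unit of awesomeness' after a given number of months."""
    return start_cost * 0.5 ** (months / halving_months)

for years in range(0, 16, 3):
    print(f"year {years:2d}: ${cost_per_unit(years * 12):.4f}")
```

Ten halvings over fifteen years is roughly a thousandfold drop in cost - the kind of curve that is very hard to find in any other industry.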

But then, Moore’s Law is just a rule of thumb anyway, and if it fails to predict the future, all that means is that extrapolations which never meant anything in the past (in terms of AIs and quantum singularities) will mean even less in the future.

2 Likes

Actually, the article that’s linked to in Maggie’s post does specifically say that cost, not necessarily size or speed, is what’s plateauing:

A research note from the technology analysts at Credit Suisse (just returned from the 51st Annual Design Automation Conference, a show for electronics professionals) concludes that chips may be getting smaller and faster, but that the process no longer necessarily involves them getting cheaper.

I love how the comments in any BoingBoing post are as informative as the post itself. And now, to add my own 2 cents:

yo - what he said

People certainly have that expectation, and in some ways it has become a self-fulfilling prophecy for the reason you cite. But much of what’s happened in chip manufacturing has occurred because of the process by which integrated circuits are made - it’s been possible to continuously make incremental improvements to the photolithographic techniques, which resulted in ever-smaller transistors and thus dramatic increases in processing power and reductions in prices. I can’t think of any other industry where that’s happened, no matter how economically important it was. Now that those sorts of improvements are no longer possible, the likelihood is that even after switching over to a different basis for speed increases, or to a different technology entirely (i.e. not a silicon-based IC), we won’t see those kinds of improvements, regardless of the resources thrown at it.

Due in part to the expansion in spending on R&D specifically to restart the density component of advances.

I was thinking more in terms of industrial advances in shipbuilding or steelmaking or steam or internal combustion - the kinds of things that saw a Moore’s-“Law”-style progression over roughly 50-year blocks, much as we’ve seen with photolithographic circuit-building. And yes, Moore’s rule-of-thumb will run out in chipbuilding once our economies aim themselves at the next field of endeavor.

I’ve been seeing similar headlines since the early 80s. Maybe it’s true this time?

The early eighties were a time when the development of faster, smaller, cheaper chips was in full swing, and those people writing the headlines didn’t know what they were talking about. Today, we can rest assured that the pundits are 100% informed. {/sarcasm}

Actually, if you saw how chips are made these days, you’d be amazed that they work at all. Moore’s Law has caused some rather exotic tricks to become standard operating procedure. Did you know that the patterned features in a silicon chip are smaller than a tenth of a wavelength of the deep-UV light used to expose the resist?
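
To put rough numbers on that: the deep-UV light in question is typically a 193 nm ArF excimer laser, and the 16 nm feature size below is just an illustrative figure, not a claim about any particular process node.

```python
# Back-of-the-envelope check of the sub-wavelength claim above.
wavelength_nm = 193.0  # ArF excimer laser, the standard deep-UV exposure source
feature_nm = 16.0      # illustrative printed feature size (assumed for the example)

print(f"feature / wavelength = {feature_nm / wavelength_nm:.3f}")
# prints ~0.083, i.e. well under a tenth of the exposure wavelength
```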

1 Like

But none of those were anything remotely like a Moore’s Law-style progression - there wasn’t a doubling in power or a halving of cost every 50 (or however many) years. You had specific breakthroughs that allowed for new inventions that provided significant improvements (the dissemination of which was also limited by patents), plus modest improvements via better techniques, tools, materials, etc., which meant all improvements occurred unpredictably. You had brief periods of rapid change and long periods of little to no change. Moore’s Law is unique, and it has been possible for a unique reason: the specific technology used to fabricate chips. If the means by which chips are improved is no longer based on being able to shrink transistors via incremental improvements to photolithographic processes (or something analogous to that), then I don’t see how you can get Moore’s Law.

The difference is that in the previous instances, there were some impediments in the manufacturing process which were solved; now the problem is that the components of the chip simply can’t physically get any smaller.

I think we’re all wrong: the rate of technological advancement is still increasing - not only are we still advancing, the rate itself continues to increase - and hugely disruptive new technologies are still in their infancy. Yes, 3D printing, and it’s impossible to fully understand how those technologies will affect the cost of any given thing. Most of the predictions I see about useful applications for 3D printing seem to be completely silly and based on no understanding of manufacturing, just as early predictions about the utility and impact of the explosion of information technology were also wildly off the mark - or, in cases where they were on the mark about specific impacts, ignored other, much more important effects.

1 Like

I think you may be confusing the developments within technologies as they grow and mature over the decades with the shifts between mature tech and the results from new fields (steam piston to internal combustion/steam turbine, for example - two fields which had a long ~five-decade push of bleeding-edge relevance after the long ~five- or six-decade bleeding edge of reciprocating steam).

When density isn’t at the forefront of advance, the next field of development will be in how these supercomputer bits get put together into new computing devices, or in how those devices are in turn used to enable technology we don’t think about very much yet.

One problem is that there are (or were) many popular variants of Moore’s Law that seemed to say more or less the same thing at one time but didn’t hold up in the long run. For example, you don’t hear much about “Moore’s Law” expressed in terms of clock speed anymore.

2 Likes

Even if so, Moore’s Law is uniquely descriptive of computing power, and even then only for the period of computing based on ICs. Most industries are seeing some other form of progression: linear progress or even diminishing returns over time (regardless of what new area of research they’re jumping into).

This topic was automatically closed after 5 days. New replies are no longer allowed.