The trick, for these higher-power modes, is likely to be that some kind of active negotiation will be required: 100 watts at 5 volts is closer to a bus bar than a 'cable' by consumer standards, and anything that doesn't at least start at 5 volts and less than an amp is going to blow the magic smoke out of badly made cheapo devices (and a shameful number of expensive, but still somehow non-compliant, ones) from the period when 'eh, 5 V, maybe an amp at the outside, the polyfuse will go if I do anything really dumb' was a valid assumption.
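The "bus bar" crack is just Ohm's law arithmetic: P = V × I, so holding the legacy 5 V rail while chasing 100 W means pushing 20 A through a thin consumer cable, which is why the negotiation has to raise the voltage instead. A quick sketch (the voltage steps are illustrative, not quoted from any spec):

```python
# P = V * I, so current for a given power target at a given rail voltage:
def amps(watts, volts):
    return watts / volts

# 100 W at the legacy 5 V rail: 20 A, genuinely bus-bar territory.
assert amps(100, 5) == 20.0
# The same 100 W at a negotiated 20 V: 5 A, plausible for a real cable.
assert amps(100, 20) == 5.0
```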
We aren't talking about serious computation here; the charge-management controllers that keep Li-ion packs from indulging their tendencies toward martyrdom could probably handle the function for an extra nickel (you do trust your vendor's shitty firmware, don't you, consumer?), but some of the very low-tech hacks that work just fine for making USB power cables probably won't cut it.
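To make "not serious computation" concrete, here's a toy sketch of the sort of source/sink handshake in question: the port starts at a safe legacy default and only steps up after the attached device explicitly requests a profile the supply has advertised. This is my own illustration, not the actual USB-PD message flow, and the profile numbers are made up:

```python
# A dumb legacy device never negotiates, so it never sees more than this:
SAFE_DEFAULT = (5.0, 0.5)  # (volts, amps)

class PowerSource:
    def __init__(self, profiles):
        # profiles: list of (volts, amps) pairs this supply can actually deliver
        self.profiles = profiles
        self.granted = SAFE_DEFAULT

    def advertise(self):
        # The source tells the sink what it has on offer.
        return list(self.profiles)

    def request(self, volts, amps):
        # Grant only an advertised profile; anything else stays at the default,
        # so badly behaved devices can't talk their way into 20 A at 5 V.
        if (volts, amps) in self.profiles:
            self.granted = (volts, amps)
            return True
        return False

charger = PowerSource([(5.0, 2.0), (12.0, 3.0), (20.0, 5.0)])
assert charger.granted == SAFE_DEFAULT          # before any negotiation
assert charger.request(20.0, 5.0)               # sink asks for the big profile
assert charger.granted == (20.0, 5.0)           # and gets it, post-handshake
```

A handful of comparisons and a table lookup: well within the budget of the same class of microcontroller that already babysits battery packs.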
On a more serious note, the proposal to bring higher power to USB looks like trouble. The closest precedent is limited cases like IBM et al.'s "Powered USB", a nasty, patent-encumbered (for what brilliant innovations is somewhat unclear), and frankly rather clunky pseudo-standard; but that one is at least safe from consumer confusion because it's confined almost exclusively to point-of-sale systems and their peripherals, where nobody expects universal compatibility.
100 watts is, by the standards of little gadgets, a lot of power. Most laptops these days have bricks rated for less than that (a few desktop-replacement machines excepted), and that is supposed to be enough headroom to run and charge at the same time. Desktop PSUs tend to be a bit chunkier, but still only a few hundred watts, with varying levels of headroom. Smaller devices, routers, set-top boxes, all the assorted consumer-electronics-with-USB, might be powered from a little wall wart only good for a low-power USB port or two, under load.
So, there's the trouble: to have a marketplace of devices that actually exploit the new high-power capabilities, you need to have potential buyers with high-power ports. But adding a high-power port to your design could easily involve doubling the bulk of the power brick (and that's for only a single port) and seriously beefing up the traces. If you want the user to be able to plug in anywhere, not just the special-magic-high-power-port, things get more complicated and/or even bulkier.
Situations where a device is only partially capable of what it claims (and this isn't theoretical: USB device descriptors are rife with lies, bus-powered hubs that claim to be self-powered, all kinds of... creative... behavior), or where a peripheral's load changes based on activity (say, a monitor that needs more power when brighter, or a laser printer that needs to heat its fuser from time to time), could easily leave the hapless user facing periodic brownouts, or manually tallying up power draw to stay under the host's maximum across all ports.
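The tallying problem is trivial bookkeeping for a computer and miserable for a human, which is rather the point. A sketch of the failure mode, with a made-up hub budget and made-up per-port draws:

```python
# Hypothetical hub with a fixed power budget from its brick; numbers invented.
HUB_BUDGET_W = 60.0

def over_budget(port_draws_w):
    # True if the instantaneous total across all ports exceeds the brick.
    return sum(port_draws_w) > HUB_BUDGET_W

# Monitor dimmed, printer idle between pages: fine.
assert not over_budget([20.0, 15.0])
# Monitor at full brightness just as the fuser kicks on: brownout territory,
# even though each device alone is well within what its port "supports".
assert over_budget([35.0, 40.0])
```

Since the draws vary with activity, a configuration that works at plug-in time can fail an hour later, which is exactly the kind of intermittent fault end users can't be expected to diagnose.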
I can see the appeal for a few specific, niche applications, mostly where the device and the peripheral are purpose-built to work with one another and wiring is supposed to be kept to a minimum (though just gluing a 24 or 48 V polarized connector to a USB cable and calling it a day sounds pretty attractive under those circumstances); but for ubiquitous use, 'don't promise what you can't deliver' is always a good thing to keep in mind, and a 100-watt target is almost certain to be widely and confusingly undeliverable. It'll be a mess.