Sorry, but my old-fashioned Roku can handle 1080p just fine. 4K TVs are not quite affordable yet, and 4K content barely exists, so why do you need more processing power right now?
If we limit ourselves to talking about 1080p content, there are perceptible differences between what streaming services offer, what video on demand offers, what Blu-ray offers, and what is shown in theaters as "2K" video. Usually, the compromises show up in scenes with a lot of fast-moving action.
The latest video codec, H.265, promises better compression than the current state of the art-- meaning that fewer compromises need to be made to squeeze the video into 4 Mb/s, or 10 Mb/s, or whatever the current streaming standard for "HD" movies is. Unfortunately, decoding H.265 requires a lot of processing power.
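To see why codec efficiency matters so much, here's a back-of-envelope sketch. The numbers are illustrative assumptions (1080p at 30 fps, 24 bits per pixel, a 4 Mb/s stream), not figures from any spec:

```python
# Back-of-envelope: how hard a streaming codec has to squeeze 1080p video.
# Assumed inputs (illustrative): 1080p at 30 fps, 24 bits per pixel, 4 Mb/s stream.
width, height, fps, bits_per_pixel = 1920, 1080, 30, 24

raw_bps = width * height * fps * bits_per_pixel  # uncompressed bits per second
stream_bps = 4_000_000                           # a typical "HD" stream budget
ratio = raw_bps / stream_bps

print(f"raw video: {raw_bps / 1e9:.2f} Gb/s")
print(f"compression needed at 4 Mb/s: about {ratio:.0f}:1")
```

Uncompressed 1080p works out to roughly 1.5 Gb/s, so a 4 Mb/s stream needs around 370:1 compression. Any codec that hits a given quality at a lower bitrate (H.265's claim versus H.264) buys back some of those compromises-- at the cost of more decoding work.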
But ultimately, decompression of H.265 will become a trivial task, offloaded to a dedicated ASIC, eliminating the need for a "4 core" processor.
The real issue is games. Titanfall, for instance, renders at only 792p on the Xbox One-- not 1080p. It's upscaled to 1080p for output. And yet, the Xbox One has a pretty hefty processor.
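The gap between 792p and 1080p is bigger than it sounds, because pixel count scales with both dimensions. A quick comparison, assuming a 1408x792 render target (the figure widely reported for Titanfall on Xbox One):

```python
# Pixel-count comparison: a 792p render target vs. the native 1080p frame buffer.
# 1408x792 is an assumption based on widely reported figures for Titanfall.
sub_hd = 1408 * 792    # pixels the GPU actually shades each frame
full_hd = 1920 * 1080  # pixels in the 1080p output frame buffer

print(f"792p:  {sub_hd:,} pixels")
print(f"1080p: {full_hd:,} pixels")
print(f"the GPU shades only {sub_hd / full_hd:.0%} of the output pixels")
```

In other words, even a hefty console GPU chose to shade barely half the output pixels and upscale the rest-- the processing budget went into scene complexity instead.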
The Xbox 360 was also able to run some games at 1080p-- if 1080p were all that mattered, the Xbox One and the PS4 wouldn't be selling. Gamers are buying the new machines not for the sake of 1080p games, but because the new machines can render far more complex graphics within that same 1080p frame buffer.
The Fire TV won't be at that level. But the extra cores may help it run more complex games, with more involved graphics, than a two-core box could.