Intel accused of commissioning rigged CPU benchmarks

Originally published at: https://boingboing.net/2018/10/10/intel-accused-of-commissioning.html

6 Likes

Uh, maybe I’ve always had this wrong… but while the CPU itself is certainly not UNimportant in gaming, isn’t it the GPU and its dedicated RAM that are far more important to gaming?

I.e., my understanding has been that you’re much better off with a lower-grade Intel i5 or AMD Ryzen 5 chip and a top-of-the-line Nvidia 1080 Ti than with an Intel i7 or Ryzen 7 and a lower-end Nvidia 1050/1060.

2 Likes

AMD’s Ryzen Master software has a ‘Game Mode’ for their Threadripper chips which disables half of the cores. When it’s improperly enabled on an 8-core Ryzen chip it likewise disables half of the cores, and that may be the worst of the settings used here.

quori, you are correct that typically the GPU matters more than the CPU. The exception is running at lower resolutions and quality settings: the GPU has less to do and the bottleneck moves to the CPU, and that is one of the other things that was done here.
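A toy way to see it: the delivered frame rate is roughly capped by whichever of the CPU or GPU takes longer per frame, so shrinking the GPU’s share of the work (lower resolution and quality) exposes the CPU as the limit. All the frame times below are invented for illustration, not benchmark data.

```python
# Toy model of the CPU-vs-GPU bottleneck described above.
# All frame times are invented for illustration; they are not benchmark data.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Delivered FPS is roughly limited by the slower of the two stages."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 6.0  # hypothetical CPU time per frame (roughly fixed across resolutions)

# GPU time per frame scales roughly with pixel count (again, hypothetical numbers).
for label, gpu_ms in [("4K / ultra", 16.0), ("1440p / high", 9.0), ("720p / low", 3.0)]:
    limiter = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{label:12s}: ~{fps(cpu_ms, gpu_ms):5.1f} FPS, {limiter}-bound")
```

At 720p the hypothetical GPU finishes early and the CPU’s 6 ms becomes the ceiling, which is why low-resolution testing magnifies CPU differences.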

The thing is, in real usage the new Intel chip is likely going to be on par with, or up to 10% faster in games than, the AMD Ryzen chips, but the AMD chips are something like half the price, and they beat the Intel chips in higher-thread-count applications like media encoding.
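Back-of-the-envelope math on that tradeoff (the 10% figure is from above; the prices are made-up placeholders, not real quotes):

```python
# Hypothetical numbers: Intel ~10% faster in games, AMD ~half the price.
# Neither figure is a real benchmark result or current street price.
intel_perf, intel_price = 1.10, 500.0
amd_perf,   amd_price   = 1.00, 250.0

print(f"Intel perf per dollar: {intel_perf / intel_price:.4f}")
print(f"AMD   perf per dollar: {amd_perf / amd_price:.4f}")
print(f"AMD advantage: {(amd_perf / amd_price) / (intel_perf / intel_price):.2f}x")
```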

4 Likes

It is swinging back to the CPU being the bottleneck for some games, especially with DX12. However, this is only going to be an issue if you have a good GPU.

GPUs have just progressed much, much faster than CPUs with respect to gaming applications.

3 Likes

I’m not savvy with electronics, despite having built my gaming PC, but yeah, the GPU and good RAM are pretty key to good benchmarks. I can run anything at maxed-out settings with my i5. I would like to upgrade to an i7 when I have the money saved up, though higher on my to-buy list is a really good gaming monitor with a high refresh rate.

1 Like

“As for the press, when we honor embargos after finding another source for the news or finding out that it’s bullshit, it’s not really an embargo: it’s just an NDA, and we’re doing PR work for free.”

This is ironic shade from that noted purveyor of journalistic and scientific accuracy and seller of charcoal tooth whitener.

I’ve cut my gaming way back as my iMac is now 8 years old and has simply become outdated for most things. Even in cases where it is still technically supported, the settings have to be at their lowest for anything to be playable. I have no complaints about getting 8 years out of her, to be perfectly honest… except for the continued gripe that Apple doesn’t make configurable/upgradeable hardware, but that is another argument entirely.

So I have been watching prices and considering buying a pre-built small-form-factor rig like the Chronos from Origin, the Trident from MSI, or the Alienware Alpha vs. building a small-form-factor rig with an NZXT ITX tower and corresponding hardware. Everything I have researched says either build around a lower-end CPU with the best GPU you can buy, or build around a high-end CPU that will not need upgrading for quite a while and buy a much cheaper graphics card that you can swap out when you can swing the high-end one.

Have a look at https://www.logicalincrements.com/games for the practical recommendations.

ETA: They did express an opinion about how it’s a waste of money to buy i9s and Threadrippers:

Note about i9 and Threadripper CPUs:

While there are more expensive CPUs on the market than any of those listed below, such “higher-tier” chips sacrifice per-core speed in order to provide additional cores in each CPU. This strategy makes those high-core-count CPUs terrific choices for professional-grade video editing, music production, and 3D rendering—but slightly worse choices for gaming, which currently benefits more from high single-core speeds than from high core counts.
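A rough Amdahl’s-law-style sketch of why that note holds, with invented core counts, clocks, and parallel fractions (none of these are measured numbers):

```python
# Amdahl's-law-style toy comparison of a high-clock 8-core chip vs a
# lower-clock 16-core chip. All numbers are invented for illustration.

def speedup(cores, clock_ghz, parallel_fraction):
    """Throughput relative to one core at 1 GHz, for a workload where
    `parallel_fraction` of the work scales across cores."""
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

gaming    = 0.50  # hypothetical: games scale poorly across many cores
rendering = 0.95  # hypothetical: rendering/encoding scales very well

chips = {"8-core @ 4.7 GHz": (8, 4.7), "16-core @ 3.4 GHz": (16, 3.4)}
for name, (cores, clock) in chips.items():
    print(f"{name}: gaming {speedup(cores, clock, gaming):.1f}x, "
          f"rendering {speedup(cores, clock, rendering):.1f}x")
```

With these made-up inputs the fewer-but-faster cores win the gaming case while the many-slower-cores chip wins the rendering case, which is the tradeoff the note is describing.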

4 Likes

My willingness to spend money on new (and not substantially better) tech has decreased rapidly in the last 8 or so years.

My gut instinct has been that the newest Core i5, a really good motherboard, some decent RAM (16 GB), and a 1050 Ti GPU will serve my needs. I can upgrade the GPU down the road when I need to. This will keep the up-front cost relatively low and allow some splurges that will be long-lasting (like a Kraken liquid CPU cooler).

2 Likes

The cost/performance pendulum swing over the decades between Intel and AMD has been interesting. I’ve always opted for AMD when there’s been a marginal performance difference at vast cost savings. Intel chips are currently overpriced for what they are. I’d like to see a third chipmaker step up for lower-end machines, the way Cyrix did in the ’90s.

That being said, my 8 year old laptop is more than sufficient for purpose and will likely be so for the immediate future.

Oh, and my kid is building his gaming rig with a Ryzen 5, and plowing the rest of the money he’s saving into the GPU.

1 Like

Newspapers have derived most of their revenue from ads for many decades now. This is only a problem when the paper runs an ad disguised as a journalistic article.

For me, the biggest takeaway is knowing what Intel considered a good CPU cooler.

1 Like

The difference here is that boing boing is directly selling hokum, not running ads for hokum.

Just build whichever fits your budget, man.

https://www.reddit.com/r/pcmasterrace/wiki/builds?v=88997ece-cdaf-11e4-bd95-22000b339303

I have no regrets going with AMD for my most recent builds. I had a few Intel builds, but the Intel-branded motherboards I had (except for a Mini-ITX Atom board still chugging away) consistently fell victim to capacitor plague. No wonder they got out of the consumer motherboard business.

But even that Atom board has a big problem: a buggy-as-fuck ACPI implementation with no fix forthcoming, that rears its ugly head any time the system gets a heavy load (such as recompiling Asterisk). It was not a problem with Linux 2.6 kernels (e.g. CentOS 6), but newer kernels start pegging one thread when the CPU reaches 72°C. I guess I need to find one of those stick-on heat sinks.
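For what it’s worth, this is roughly how I keep an eye on that threshold; a minimal sketch assuming a Linux box that exposes thermal zones under /sys/class/thermal in millidegrees Celsius (the paths and the 72°C figure will vary by machine):

```python
# Minimal sketch: poll Linux thermal zones and flag readings near an assumed
# 72 degree C threshold. Assumes /sys/class/thermal/thermal_zone*/temp exists
# and reports millidegrees Celsius, which is common but not universal.
import glob
import time

THRESHOLD_C = 72.0  # the temperature where the ACPI bug bites on this board

def read_zones():
    temps = {}
    for path in glob.glob("/sys/class/thermal/thermal_zone*/temp"):
        try:
            with open(path) as f:
                temps[path] = int(f.read().strip()) / 1000.0
        except (OSError, ValueError):
            pass  # some zones can't be read; skip them
    return temps

while True:
    for zone, temp_c in read_zones().items():
        flag = "  <-- near threshold" if temp_c >= THRESHOLD_C - 2 else ""
        print(f"{zone}: {temp_c:.1f} C{flag}")
    time.sleep(5)
```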

Which is a standard approach for benchmarking CPUs: if you’re attempting to establish how a CPU handles gaming, you want to ensure, as much as you can, that it’s the CPU that will be bottlenecked. Otherwise you’re really checking the strength of the GPU, and with identical setups that will minimize the differences between the processors. I’ve even seen it argued that CPU benchmarks should be run at 720p.

I’m still on Bulldozer, the “it’s shit” AMD CPU from years past. I haven’t had trouble running things at 1080p, 60 fps (or near enough), at high to ultra. The GPU is an RX 480. If your concern is games that look good and run smoothly rather than “most FPS!” and bragging rights, it doesn’t take all that much to accomplish.

More cores tends to come with lower clocks. But the gaming use case for the Threadrippers (and to a certain extent the 8+ core non-HEDT chips) is multitasking: you get the full number of cores a game will scale to for the game, and there are still more left over to run other tasks in the background. It’s big with streamers.

1 Like

Way to screw up, Intel! Everyone already knew your CPUs were better for gaming at high frame rates than AMD’s. Tacking the release of this redundant report onto the end of an otherwise (and unexpectedly) smooth launch event was unnecessary, and the skew and incompetence of the methodology have turned a non-event into a massive steaming pile of elephant dung that now needs to be cleaned up.

Insert slow hand clap here.

As opposed to rigging it themselves like they’ve done in the past?

Sounds like you should split the difference and get a midrange setup.

1 Like
