GPU boosting is legal cheating

This actually hit me recently after the announcement of the GeForce GTX 1080, with its crazy boost clocks as well as the throttling issues the reference (Founders Edition) models are facing…

GPU boost (I'm talking about boosting in general, which is why I'm not using the capital B of NVIDIA's trademark) in essence takes the temperature and power headroom and applies it to the GPU clock. The cooler the card is, the higher it will clock itself. There is also a power limit, which is more hardcoded than the temperature limit: the temperature limit can be pushed back with better cooling or a colder ambient temperature, while the power limit caps the total power the GPU may draw and can only be dealt with by decreasing the clock or raising the limit itself. Once the card reaches either limit, it downclocks to stay within it.
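
To make that concrete, here is a minimal sketch of such a boost rule in Python. It is my own simplification for illustration only, not NVIDIA's actual algorithm, and every number (clocks, limits, the scaling factor) is made up:

```python
# Toy model of a temperature/power-limited boost rule.
# All thresholds, clocks and the scaling factor are made-up illustration
# values, not NVIDIA's actual algorithm or the GTX 1080's real limits.

BASE_CLOCK_MHZ = 1600
MAX_BOOST_MHZ = 1850
TEMP_LIMIT_C = 83       # soft limit: can be pushed back with better cooling
POWER_LIMIT_W = 180     # hard limit: only lower clocks (or a raised limit) help

def boost_clock(temp_c: float, power_draw_w: float) -> float:
    """Pick a clock based on whichever headroom (thermal or power) runs out first."""
    temp_headroom = max(0.0, (TEMP_LIMIT_C - temp_c) / TEMP_LIMIT_C)
    power_headroom = max(0.0, (POWER_LIMIT_W - power_draw_w) / POWER_LIMIT_W)
    headroom = min(temp_headroom, power_headroom)   # the tighter limit wins
    # Scale between base and max boost; a cool, low-power card gets the full boost.
    return BASE_CLOCK_MHZ + (MAX_BOOST_MHZ - BASE_CLOCK_MHZ) * min(1.0, headroom * 5)

print(boost_clock(55, 140))   # cool and under the power limit: full boost (1850)
print(boost_clock(82, 178))   # pressed against both limits: close to base clock
```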

As a consumer, you might look at GPU boost as something wonderful. You get a higher clock for free, without having to do any kind of manual overclocking. Great, right? Well, not so much…

The thing is, GPU boosting only really works in short bursts, at the beginning of the load. As the card and the air inside the computer case get hotter, graphics cards WILL downclock. In the case of the reference GTX 1080, so much so that it drops all the way down to the base clock, meaning the boost headroom disappears entirely during real-world gaming.
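
Purely to illustrate that dynamic, here is a toy heat-soak simulation using the same kind of rule. The thermal constants are invented and this is not a model of any real card; it only shows how a clock that starts boosted sinks toward the base clock as the card warms up:

```python
# Toy heat-soak simulation: the card starts cool and boosted, and as the card
# and case air warm up the clock sinks toward the base clock. Every constant
# is invented for illustration; this is not a real thermal model.

BASE_MHZ, BOOST_MHZ = 1600, 1850
TEMP_LIMIT_C = 83

temp_c = 40.0                               # cold start
for minute in range(0, 61, 10):
    headroom = max(0.0, (TEMP_LIMIT_C - temp_c) / TEMP_LIMIT_C)
    clock = BASE_MHZ + (BOOST_MHZ - BASE_MHZ) * min(1.0, headroom * 5)
    print(f"{minute:3d} min: {temp_c:5.1f} °C -> {clock:6.1f} MHz")
    # Crude heat soak: the card creeps toward its temperature limit over time.
    temp_c += (TEMP_LIMIT_C - temp_c) * 0.45
```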

What does this mean, you might wonder, and why the hell do I consider GPU boosting legal cheating?


When you're gaming, you're usually doing it for longer periods of time, hours even. And since it's an actual game, it stresses the entire system, raising temperatures across the board, especially since the air inside the case heats up quite a bit and impairs the cooling efficiency of the whole system. What happens then is that graphics cards start to downclock and you lose the boost clock. Losing the boost clock means losing framerate. And with that, you won't really be getting the advertised performance.


Benchmarks, while intensive, usually don't last for long periods of time. And since they aren't actual games with real, unpredictable physics and AI calculations that load components other than the GPU, you'll see smaller temperature spikes and lower overall temperatures, and thus higher boost clocks and higher framerates, resulting in a higher benchmark score.

Why is this bad?

It's bad because it doesn't show the real performance. In a benchmark, the card will clock high because it never heats up enough to really be affected by temperature. And even if it does at some point, it will already have gained an advantage in the score, which is usually calculated directly from the total number of frames rendered in a fixed time (usually the duration of the entire benchmark). Great for reviews. But when you actually buy the card and stress it with a real game for longer periods of time, you won't be getting the advertised performance, because the card will downclock. It just will. The situation gets even worse when you realize graphics cards cool down slightly between benchmark tests. Just load up 3DMark and listen to your graphics card's fan: it spins down between tests while the system loads benchmark data, meaning the card has had time to cool down and lower its fan speed. Higher boost again, even if only in a short surge! And when hardware reviewers switch between tests, the card again gets a chance to cool down, so it starts pretty much every benchmark with fresh headroom, resulting in skewed results.
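
A quick back-of-the-envelope calculation with invented numbers shows how much a short boost surge can inflate a score that simply counts rendered frames over a fixed time:

```python
# Back-of-the-envelope example with invented numbers: a score that just counts
# frames over a fixed time rewards a short boost surge far more than an
# hour-long, heat-soaked gaming session would suggest.

def avg_fps(boost_fps, base_fps, boost_seconds, total_seconds):
    """Average framerate when the card only sustains the boost for boost_seconds."""
    boosted = min(boost_seconds, total_seconds)
    frames = boost_fps * boosted + base_fps * (total_seconds - boosted)
    return frames / total_seconds

# Assume (made up): 100 fps while boosted, 85 fps once heat-soaked,
# and the card holds the boost for roughly the first 2 minutes.
print(avg_fps(100, 85, 120, 120))    # 2-minute benchmark: 100.0 fps average
print(avg_fps(100, 85, 120, 3600))   # 1-hour session:     ~85.5 fps average
```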


One solution would be to fire up a modern game, play it, and note the normal gaming temperature after an hour. Then, before you benchmark, preheat the GPU to that actual working temperature, either by running something intensive right up to the point where you start the benchmark, or by artificially heating it with an external heater.
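
If you go the preheating route, you can watch for the temperature to plateau before starting the benchmark. A rough sketch, assuming an NVIDIA card with nvidia-smi on the PATH and a preheat load (game or stress test) running in parallel; the "stable for three readings" rule is just my arbitrary choice:

```python
# Wait until the GPU temperature stops climbing before starting the benchmark.
# Assumes an NVIDIA card with nvidia-smi available on the PATH; run your
# preheat load (a game or stress test) in parallel while this polls.
import subprocess
import time

def gpu_temp_c() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

stable_readings = 0
last = gpu_temp_c()
while stable_readings < 3:              # "3 stable minutes" is an arbitrary choice
    time.sleep(60)
    temp = gpu_temp_c()
    print(f"GPU temperature: {temp} °C")
    stable_readings = stable_readings + 1 if temp <= last + 1 else 0
    last = temp
print("Temperature has plateaued - start the benchmark now.")
```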

Another, easier method would be to fire up a modern game, play it for an hour, and note where the GPU clock ends up at that point. Then run benchmarks at that fixed GPU frequency. That way you get benchmark results at realistic clocks, the ones actual consumers will also experience, rather than at boosted clocks that surge high in the beginning and then basically disappear.
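
As a sketch of how that could be automated, assuming a reasonably recent NVIDIA driver that supports clock locking through nvidia-smi (support varies by GPU generation, it needs administrator rights, and both the 1607 MHz value and the benchmark command are placeholders you would replace with your own):

```python
# Run a benchmark with the GPU clock pinned to the value observed after an hour
# of gaming. Requires a reasonably recent NVIDIA driver with clock-locking
# support (nvidia-smi --lock-gpu-clocks) and administrator rights; support
# varies by GPU generation. 1607 MHz is only an example value.
import subprocess

OBSERVED_CLOCK_MHZ = 1607          # replace with the clock you measured in-game
BENCHMARK_CMD = ["3DMark.exe"]     # placeholder for whatever benchmark you run

subprocess.run(
    ["nvidia-smi", f"--lock-gpu-clocks={OBSERVED_CLOCK_MHZ},{OBSERVED_CLOCK_MHZ}"],
    check=True,
)
try:
    subprocess.run(BENCHMARK_CMD, check=True)
finally:
    # Always release the lock so the card can boost and idle normally again.
    subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)
```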

Now, I'm not some big authority who can dictate to testers how to do their benchmarking, and I don't run hardware benchmarks for the public myself. But I am a consumer, and this certainly bothers me a bit, especially since it doesn't really benefit gamers/consumers as much as it benefits the makers of graphics cards…
