Code that heats the CPU more than other code that does the same thing

Everything else that doesn't fall into one of the other PB categories.
Blade
Enthusiast
Posts: 362
Joined: Wed Aug 06, 2003 2:49 pm
Location: Venice - Italy, Japan when possible.
Contact:

Post by Blade »

The_Pharao wrote: there is no reason to limit the frame rate in a fullscreen game, IMHO.
more frames -> more CPU usage -> more heat.... ok.
but it also means -> smooth movement

everybody who plays first person shooters knows there's a BIIIIIG difference between 30fps, 60fps or 120fps!
:wink:

Wrong :D
If the current screen mode is 75 Hz (75 updates every second) and the game runs at 150 fps, your eyes will see just 75 updates; the other 75 are totally useless. Calculated, but not shown. (And the CPU will burn.)
TheBeck
User
Posts: 39
Joined: Mon May 12, 2003 6:04 am
Location: the far west
Contact:

Post by TheBeck »

Blade wrote: Wrong :D
If the current screen mode is 75 Hz (75 updates every second) and the game runs at 150 fps, your eyes will see just 75 updates; the other 75 are totally useless. Calculated, but not shown. (And the CPU will burn.)
Not wrong! If the frame rate is 150 fps and the refresh rate is 75 Hz, the latency is effectively cut in half, because the information being displayed on the screen is half as old as it would be if the game were running at 75 fps. It's the latency that is important.
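To put numbers on that argument: when the monitor refreshes, the newest completed frame is at most one render period old, so doubling the render rate halves that worst-case age. A tiny Python sketch (the thread has no code; the function name is mine):

```python
def worst_case_frame_age_ms(render_fps):
    # When the display grabs a frame, the most recently completed
    # frame is at most one render period old.
    return 1000.0 / render_fps

print(worst_case_frame_age_ms(75))   # 13.33... ms at 75 fps
print(worst_case_frame_age_ms(150))  # 6.66... ms at 150 fps: half the latency
```

Either way, only 75 of those frames ever reach the screen; the dispute is purely about how stale the shown ones are.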
Nathan Beckstrand -- XPSP2, AMD Athlon XP 3000+, GF2 GTS, 512MB RAM
Blade
Enthusiast
Posts: 362
Joined: Wed Aug 06, 2003 2:49 pm
Location: Venice - Italy, Japan when possible.
Contact:

Post by Blade »

Sorry, you could be right about latency... But would that mean your hand is faster than your eyes? :wink:

I know how a 3D engine works (I wrote one in C four years ago), and I agree that the "actions" are taken after the "rendering", because the code has to account for the amount of time passed since the previous "rendering".
In other words, everything is delayed by "n" milliseconds.

I still think that if this delay is shorter than a vertical blank, latency is no longer an issue...
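The "account for the time passed" part is the usual delta-time loop: movement is scaled by the elapsed frame time, so the simulation runs at the same speed regardless of fps. A minimal Python sketch (invented names, fixed timesteps instead of a real clock, just to show the idea):

```python
def update(position, velocity, dt):
    # Movement scaled by the time since the previous frame, so the
    # simulated speed is independent of the frame rate.
    return position + velocity * dt

# Simulate one second of movement at two different frame rates:
for fps in (75, 150):
    pos, dt = 0.0, 1.0 / fps
    for _ in range(fps):
        pos = update(pos, velocity=10.0, dt=dt)
    print(fps, round(pos, 6))  # both end at 10.0 units
```

Whatever the frame rate, the object ends up in the same place; only the granularity (and hence the "n" milliseconds of delay) changes.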
dmoc
Enthusiast
Posts: 739
Joined: Sat Apr 26, 2003 12:40 am

Post by dmoc »

IIRC this and similar problems arise from the fact that a vsync interrupt, and/or access to a register to detect the vsync, was never included in any standard agreed on by gfx card manufacturers. In fact, unless things have changed since I researched this a few years back, you cannot/should not rely on vsync for anything, because of onboard double/triple buffering. The race for speed and functionality on gfx cards has kept manufacturers from providing this functionality and instead places the emphasis on APIs (DX/OGL).

What it comes down to is this: if you want your app to run across many different platforms, avoid hardware dependence. Early game makers soon realised this. That said, I still think there is a need for a high-level app to detect display timing so it can optimise itself on-the-fly, and avoid hogging the CPU while doing so.
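On the "avoid hogging the CPU" point: even without a reliable vsync signal, an app can cap its own frame rate by sleeping away the leftover time each frame instead of busy-spinning. A minimal Python sketch of the idea (the name run_capped and the structure are mine, not from any library):

```python
import time

def run_capped(frames, target_fps, render):
    """Render `frames` frames at roughly `target_fps`, yielding the
    CPU with sleep() for the unused part of each frame period
    instead of spinning in a hot loop."""
    period = 1.0 / target_fps
    next_deadline = time.monotonic() + period
    for _ in range(frames):
        render()
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # give the CPU (and the fans) a rest
        next_deadline += period
```

OS sleep granularity makes this approximate, which is exactly why a standardised display-timing query would still be useful.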