1024x768x32 @60 Hz
Can I get 1024x768x32 @60 Hz full-screen multi-layer scrolling with loads of large sprites moving around at the same time in PureBasic? On a top-notch PC? (I'm talking strictly 2D here.)
Is there any demo or game done in PB to show this off?
Thanks for any replies.
Re: 1024x768x32 @60 Hz
fmcpma wrote:
Can I get 1024x768x32 @60 Hz full-screen multi-layer scrolling with loads of large sprites moving around at the same time in PureBasic? On a top-notch PC? (I'm talking strictly 2D here.)
Is there any demo or game done in PB to show this off?
Thanks for any replies.

I am working on a game that does this, with a GeForce FX and a 3.08 GHz Intel.
More tips can be found @ viewtopic.php?t=13495
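For anyone wanting to try it, here is a minimal sketch of the kind of setup being asked about: a 1024x768x32 full-screen display with a row of sprites scrolled every frame. It assumes a reasonably recent PureBasic, and "ball.bmp" is just a placeholder sprite file:

Code:

; Minimal 1024x768x32 full-screen sprite loop ("ball.bmp" is a placeholder)
If InitSprite() = 0 Or InitKeyboard() = 0
  MessageRequester("Error", "Can't initialize the sprite/keyboard system.")
  End
EndIf

If OpenScreen(1024, 768, 32, "2D scrolling test") = 0
  MessageRequester("Error", "Can't open a 1024x768x32 screen.")
  End
EndIf

LoadSprite(0, "ball.bmp")          ; placeholder sprite file

x = 0
Repeat
  ClearScreen(RGB(0, 0, 0))
  For i = 0 To 9                   ; a row of sprites scrolling across the screen
    DisplaySprite(0, (x + i * 120) % 1024, 300)
  Next
  x + 2
  FlipBuffers()                    ; waits for the vertical retrace by default
  ExamineKeyboard()
Until KeyboardPushed(#PB_Key_Escape)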
Televisions and CRT monitors use exactly the same technology to display their images, with one big difference: all TV standards (NTSC, PAL and SECAM, including most HDTV formats) are interlaced.
I won't go into the whole definition of interlacing, but suffice it to say that half of the horizontal lines are drawn on one pass (lines 1, 3, 5, 7, etc.) and the other half on the next pass (lines 2, 4, 6, 8, etc.). Each pass takes 1/50th or 1/60th of a second, depending on which television standard you're using, but the entire screen is only updated every 1/25th or 1/30th of a second. If you look at a static image on a TV, especially one with thin horizontal lines, you'll see the flicker very easily. However, most of the time the flicker is hidden by the fact that the screen is usually in a state of constant change.
Now, that being said, it's important to note that different people have different tolerances. This includes color, sound and flicker perception. Personally, I see no difference whatsoever between 1280x1024 @ 60 Hz and 1280x1024 @ 85 Hz.
So, the moral of this story? Write your code so that people can decide what frequency to run the game at or, better yet, read that info from the current display driver settings (probably safer; see the sketch after this post).
Russell
*** Diapers and politicians need to be changed...for the same reason! ***
*** Make every vote equal: Abolish the Electoral College ***
*** www.au.org ***
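A rough sketch of what Russell suggests, assuming a PureBasic version that provides ExamineDesktops()/DesktopFrequency() and the optional refresh-rate argument of OpenScreen() (older releases may not have these):

Code:

; Read the desktop's current refresh rate and open the screen with it
If InitSprite() = 0
  End
EndIf

ExamineDesktops()
freq = DesktopFrequency(0)         ; refresh rate of the primary desktop
If freq = 0                        ; some drivers don't report it...
  freq = 60                        ; ...so fall back to a safe default
EndIf

If OpenScreen(1024, 768, 32, "Game", #PB_Screen_SmartSynchronization, freq) = 0
  MessageRequester("Error", "Can't open the screen at " + Str(freq) + " Hz.")
  End
EndIf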
Yeah, take it from the default settings (or put it under Advanced, and warn users).
Another thing: if they don't know what they're messing with and pick too high a refresh rate, not all LCD/TFT monitors can display it. So if you offer these kinds of options, it's better to let the user change the setting outside the game, so they don't have to reinstall it just to change it.
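One way to do that, sketched with PureBasic's preference functions ("settings.ini" and the key name are made up for the example): keep the refresh rate in a plain text file next to the game, so the user can fix it with any text editor even when the game refuses to start.

Code:

; Keep the refresh rate in an editable settings file outside the game
Procedure.i LoadRefreshRate()
  Protected freq.i = 60                       ; safe default
  If OpenPreferences("settings.ini")
    freq = ReadPreferenceLong("RefreshRate", 60)
    ClosePreferences()
  EndIf
  ProcedureReturn freq
EndProcedure

Procedure SaveRefreshRate(freq.i)
  If CreatePreferences("settings.ini")
    WritePreferenceLong("RefreshRate", freq)
    ClosePreferences()
  EndIf
EndProcedure

If the file is missing or holds a bad value, the game just falls back to 60 Hz, so a broken setting never locks the user out.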




