
cross platform game timing

Posted: Tue Jul 17, 2007 8:15 am
by god64
I am trying to limit my game to 30 fps, so I do:

SetFrameRate(30)

and then in the mainloop i do a

FlipBuffers(1)

It seems to work fine on Windows, but on Linux it runs about twice as fast. What am I doing wrong, or is there a better way to time the game speed?

Is it possible to call a drawing function every n milliseconds by some kind of timer event?

Posted: Tue Jul 17, 2007 8:54 am
by Kaeru Gaman
One possibility is to use threads.

In theory it works like this:

One thread does nothing but display the graphics on the screen and flip the buffers as fast as possible;
the positions of the objects and the world details are read from memory.
(I think there was some problem with accessing screen stuff from a sub-thread, so the display part should be done by the main process.)

The other thread is timer-controlled: it reads the input and moves the objects,
meaning it changes the values in the memory that the display thread reads from.

As a result, the action of the game is properly timed,
and the frame rate of the display is independent of it, as in the sketch below.
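
A rough sketch of that scheme, just to illustrate the idea (nothing here is from the post above: the 800x600 screen, the 25 Hz logic rate, the moving box and the variable names are made-up placeholders, input is handled in the main loop for simplicity, and the thread-safe compiler option is assumed to be enabled):

Code:

If InitSprite() = 0 Or InitKeyboard() = 0 Or OpenScreen(800, 600, 32, "timing test") = 0
  End
EndIf

Global Quit, Mutex, x
Mutex = CreateMutex()

; timer-controlled logic thread: moves the object at a fixed rate,
; writing into the shared memory that the displaying loop reads from
Procedure LogicThread(Dummy)
  While Quit = 0
    LockMutex(Mutex)
    x + 4
    If x > 800 : x = 0 : EndIf
    UnlockMutex(Mutex)
    Delay(1000 / 25) ; 25 logic updates per second, whatever the frame rate is
  Wend
EndProcedure

CreateThread(@LogicThread(), 0)

; main process: read the input, then draw and flip the buffers as fast as possible
Repeat
  ExamineKeyboard()
  If KeyboardPushed(#PB_Key_Escape) : Quit = 1 : EndIf
  ClearScreen(RGB(0, 0, 0))
  LockMutex(Mutex)
  StartDrawing(ScreenOutput())
  Box(x, 300, 20, 20, RGB(255, 255, 255))
  StopDrawing()
  UnlockMutex(Mutex)
  FlipBuffers(0) ; no sync, so the display runs as fast as possible
Until Quit = 1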


...btw: are you in the German forums, too?
http://www.purebasic.fr/german/

Re: cross platform game timing

Posted: Tue Jul 17, 2007 11:09 am
by Psychophanta
god64 wrote: Is it possible to call a drawing function every n milliseconds by some kind of timer event?
Well, I think the simplest and most practical way is just to add a Delay(number_of_milliseconds) right after the FlipBuffers(1) call. The problem is that while the program is sleeping inside Delay(), nothing else is done. Using threads, as Kaeru mentions, could be a solution if your program cannot afford to sleep at every frame.
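
For reference, a bare-bones sketch of that approach (the 10 ms value and the Quit flag are only placeholders, and it assumes a screen is already open):

Code:

Repeat
  ; ... game logic and drawing go here ...
  FlipBuffers(1) ; show the frame, synchronised as in the original question
  Delay(10)      ; while the program sleeps here, nothing else is done
Until Quit = 1   ; Quit is assumed to be set somewhere in the loop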

Posted: Wed Jul 18, 2007 6:44 am
by god64
I tried several ways to get the timing exact on Windows and Linux. The only one working so far is the following:

Do NOT use SetFrameRate(); it messes everything up.

Code:

delayValue = 1000 / 25 ; 25 fps
Repeat ; game main loop
  startTime = ElapsedMilliseconds()
  ; do drawing stuff
  FlipBuffers(0)
  ; do logic stuff
  tmp = ElapsedMilliseconds() - startTime
  If tmp < delayValue
    Delay(delayValue - tmp) ; yikes, edited this line, was wrong first time
  EndIf
Until exit = 1
If someone has a better method, let me know. I am not happy with software using Delay() for timing, but for now it works and runs at exactly the same speed on Linux and Windows.

Posted: Wed Jul 18, 2007 7:05 am
by PB
> for now it works and runs at exactly the same speed on Linux and Windows

If it works, then that's all that matters. :) I used to worry about "hack" code
myself, to the point where I'd spend forever trying to do some routine
"properly". But one day I realised that my app had been working fine all
along and doing what I wanted with no issues, so why was I wasting my
valuable time trying to fix something that wasn't actually broken? :)