I would like to implement a high-precision delay on Linux, but it doesn't work the way I want. Here are two examples: the first busy-waits on clock_gettime(), the second sleeps with nanosleep().
Code:
#CLOCK_MONOTONIC = 1

Structure timespec
  Second.i
  Nanosecond.i
EndStructure

ImportC "-lrt"
  clock_gettime(ClockID.l, *TimeSpecA.timespec)
EndImport

Structure Instance
  StartCount.timespec
  EndCount.timespec
EndStructure

Global Instance.Instance
Instance\StartCount\Second = 0
Instance\StartCount\Nanosecond = 0
Instance\EndCount\Second = 0
Instance\EndCount\Nanosecond = 0

Procedure DelayMicroSeconds(Time.q)
  Protected Elapsed.q
  clock_gettime(#CLOCK_MONOTONIC, @Instance\StartCount)
  Repeat
    Delay(0) ; so the busy wait does not monopolize the CPU
    clock_gettime(#CLOCK_MONOTONIC, @Instance\EndCount)
    ; elapsed microseconds from both fields: comparing only the nanosecond
    ; parts goes wrong whenever a second boundary is crossed in between
    Elapsed = (Instance\EndCount\Second - Instance\StartCount\Second) * 1000000 + (Instance\EndCount\Nanosecond - Instance\StartCount\Nanosecond) / 1000
  Until Elapsed > Time
EndProcedure
a = ElapsedMilliseconds()
For n = 1 To 1000
  DelayMicroSeconds(1)
Next
elapsedtime = ElapsedMilliseconds() - a
MessageRequester("debug", Str(elapsedtime) + " millisec") ; 1000 x 1 microsec, so the result should be 1 ms
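Before the second example, a quick sanity check I find useful: measure the average cost of one pass through the busy-wait loop (one Delay(0) plus one clock_gettime()). The procedure name and the 100000-iteration count are arbitrary choices of mine; the sketch reuses the constant, structure and import from above.

Code:
; Rough measurement: average nanoseconds per pass through the busy-wait
; loop, i.e. one Delay(0) call followed by one clock_gettime() call.
Procedure.q MeasureLoopOverhead()
  Protected T0.timespec, T1.timespec, i
  clock_gettime(#CLOCK_MONOTONIC, @T0)
  For i = 1 To 100000
    Delay(0)
    clock_gettime(#CLOCK_MONOTONIC, @T1)
  Next
  ; T1 holds the reading taken in the last iteration
  ProcedureReturn ((T1\Second - T0\Second) * 1000000000 + T1\Nanosecond - T0\Nanosecond) / 100000
EndProcedure

Debug "Average loop cost: " + Str(MeasureLoopOverhead()) + " ns"

If this average is already above 1000 ns, the loop can never deliver a 1 microsecond delay.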
Here is the second example, with nanosleep():

Code:
Structure timespec
  Second.i
  Nanosecond.i
EndStructure

; nanosleep() lives in the C library, which is linked by default
ImportC ""
  nanosleep(*Request.timespec, *Remaining.timespec)
EndImport

Procedure NanoDelay(Nanoseconds.i)
  Protected pause.timespec ; Second is implicitly zero
  If Nanoseconds > 0 And Nanoseconds <= 999999999
    pause\Nanosecond = Nanoseconds
    ProcedureReturn nanosleep(@pause, #Null)
  Else
    ProcedureReturn -1 ; out of range for a single nanosleep() call
  EndIf
EndProcedure
a = ElapsedMilliseconds()
For n = 1 To 1000
  NanoDelay(1000) ; 1 microsec each
Next
elapsedtime = ElapsedMilliseconds() - a
Debug Str(elapsedtime) + " millisec"
MessageRequester("debug", Str(elapsedtime) + " millisec") ; result should be 1 ms
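To see how long one call really sleeps, I can time a single NanoDelay(1000) with clock_gettime(). This sketch assumes the #CLOCK_MONOTONIC constant and the clock_gettime() import from the first example are in scope (the variable names are mine):

Code:
; Time one NanoDelay(1000) to see how close it gets to the requested 1 microsec.
Define T0.timespec, T1.timespec
clock_gettime(#CLOCK_MONOTONIC, @T0)
NanoDelay(1000) ; request a 1 microsec sleep
clock_gettime(#CLOCK_MONOTONIC, @T1)
Debug "One NanoDelay(1000) took " + Str((T1\Second - T0\Second) * 1000000000 + T1\Nanosecond - T0\Nanosecond) + " ns"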
What am I doing wrong?
Thanks for your time.