Please use v2 of the Readspeed tool instead.
http://www.purebasic.fr/english/viewtopic.php?t=37518
It is much simplified: there is no writing of results to disk, it's all in memory.
I've stripped out as much code as possible to make this really tight.
It's also way simpler and less confusing to use. (I hope).
Anyway, fool around with v2 instead, simple yes but way better.
This "tool" was born from the following thread: http://www.purebasic.fr/english/viewtopic.php?t=37438
Curious which chunksize is fastest,
and indirectly what the best file buffer size might be?
Or want to compare hard drives,
or see which OS or filesystem has the best speeds?
Then this tool might be of interest!
Make sure you compile the tool with threadsafe enabled and without the debugger, or just create an executable.
The source should be cross-platform; on Windows the multimedia timer API is used to improve measurement accuracy, and Mac and Linux "should" have 1 ms accuracy as far as I know.
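For the curious, the timing throughout the code below uses this pattern (timeGetTime_() together with timeBeginPeriod_(1) on Windows, ElapsedMilliseconds() elsewhere):

Code:
CompilerIf #PB_Compiler_OS=#PB_OS_Windows
 start=timeGetTime_()
CompilerElse
 start=ElapsedMilliseconds()
CompilerEndIf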
When you run the test tool it will pop up a save file requester;
choose where the results should go and what name the file should have. If you leave out the .csv extension it will be added automatically.
The CSV file is a "standard" CSV file, or as close as one can get; I chose the "standard" that Wikipedia describes as the most common CSV variant.
Next, simply select one or more files to test; a progress window will pop up showing, er... progress. When done, the file selection requester appears again; if you choose to test more files, the results will be added to the current test run. To quit, choose Cancel.
You can also abort the whole test while the progress window is up by hitting its close button.
Note! An existing CSV file chosen in the save file requester will be overwritten.
There are 4 columns of data stored in the save file;
the latter two are useful on the test system only.
The first two columns however, the chunksize and the normalized speed, are not tied to the test system and can be used to compare across hard drives, operating systems and other machines.
The normalized speed column is normalized with the 1MB chunksize as the baseline.
Any value below 100 means that chunksize was slower/took longer,
any value above 100 means it was faster/took less time.
For example, a speed result of 8 for the 512 byte chunksize test means it was 92% slower than the 1MB test (eek).
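To give an idea of the layout: the header row is written once at the top of the CSV, and every tested file then gets one row per chunksize followed by an empty line. Something like this (the numbers here are made up purely for illustration):

Code:
"Chunksize","Normalized speed (100%=1MB speed)","Native speed (ms)","Native size (Bytes)"
1048576,100,1000,524288000
524288,97,1030,524288000
...
1024,15,6600,524288000
512,8,12500,524288000

The normalized value is simply (time for the 1MB pass / time for this pass) * 100, as you can see near the bottom of the code.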
Please post your results "as is" in this thread, put them in a code block.
This way I (and anyone else interested) can simply copy and paste the results into CSV files, load them into a spreadsheet program (like OpenOffice.org Calc), merge the test result files, and apply some nice spreadsheet magic to get overall stats of the results, or maybe make a PHP script, unless you want to roll your own result analysis tool in PureBasic, that is.
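If rolling your own analysis tool in PureBasic appeals to you, a minimal sketch along these lines should work; the file name, map names and the per-chunksize averaging here are just an example, not part of the test tool:

Code:
;Hypothetical sketch: average the normalized speed per chunksize from a merged results CSV.
;Run it from the IDE with the debugger on, it only prints to the debug window.
EnableExplicit

NewMap total.d()
NewMap count.i()
Define file$,line$,chunk$,speed$

file$=OpenFileRequester("Choose merged results CSV:","","Comma Separated Values (CSV)|*.csv",0)
If file$ And ReadFile(0,file$)
 While Eof(0)=#False
  line$=ReadString(0)
  chunk$=StringField(line$,1,",")
  speed$=StringField(line$,2,",")
  If chunk$ And FindString(chunk$,#DQUOTE$,1)=0 ;Skip the quoted header row and the blank lines between files.
   total(chunk$)=total(chunk$)+ValD(speed$)
   count(chunk$)=count(chunk$)+1
  EndIf
 Wend
 CloseFile(0)
 ForEach total()
  Debug MapKey(total())+" bytes: average normalized speed "+StrD(total()/count(MapKey(total())),1)
 Next
EndIf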
The source is Public Domain. I do request that if you intend to post results in this thread, you don't change the way the speed measurement is done or the CSV format used; this is just to avoid the pain of mixed file formats and different ways to measure things, and to make it easy to merge test results into spreadsheets or otherwise analyze them.
As usual with "my" code I do some odd or interesting things,
like the independent progress meter window that runs in its own thread.
You could easily remove the Window_Main() procedure and any references to it, and the test tool would still work without any issues.
It's loosely commented, but hopefully still easy to see what is going on.
Hopefully no silly bugs/mistakes snuck in; it's not the prettiest code, but it should be fast and stable, "knock on wood", despite a lack of extensive testing.
The progress meter will jump around erratically (or seem to); this is because it runs in an independent "passive" thread that updates the progressbar at roughly 50fps. Don't be surprised if small files barely register on it, though.
My advice is to test with files around the 500MB size: files that are too small barely give enough time difference for conclusive results, and really large files, though interesting to test, just take too damn long. *laughs*
Also note that smaller chunksizes and smaller files use more CPU than large chunksizes and large files. (more on this in my next post below)
Have fun, and let's get lots of test results: just compile/run (without the debugger, with threadsafe on) and copy'n'paste the results into this thread. Stats/speed tests are fun.
Code:
;Test Readspeed v1.0, Public Domain.
DisableDebugger ;Even though this is here, I still advise compiling without the debugger.
EnableExplicit ;Why isn't this default yet Fred? :P
;And don't forget to compile this with threadsafe on folks!
#File1=1
#File2=2
#Window1=1
#Gadget1=1
#Gadget2=2
#Gadget3=3
Global filenum.i,filesize.q,filepos.q,chunksize.i,quitprogram.i
Procedure.i Window_Main(create.i=#False)
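 ;Progress window thread: call with create=#True to open the window and run its event loop,
 ;call with create=#False (from the main code) to tell that loop to quit.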
 Static quit.i
 Protected event.i,result=#False,filepercent.d,start.l,stop.l,pos.d,size.d,num.i
 If create
  quit=#False
  If OpenWindow(#Window1,0,0,200,60,"Test Readspeed",#PB_Window_TitleBar|#PB_Window_ScreenCentered|#PB_Window_SystemMenu)
   result=#True
   ProgressBarGadget(#Gadget1,0,0,200,20,0,100)
   TextGadget(#Gadget2,0,20,200,20,"File: ")
   TextGadget(#Gadget3,0,40,200,20,"Chunksize: ")
   If Not IsGadget(#Gadget1) Or Not IsGadget(#Gadget2) Or Not IsGadget(#Gadget3)
    result=#False
   EndIf
   If result
    CompilerIf #PB_Compiler_OS=#PB_OS_Windows
     start=timeGetTime_()
    CompilerElse
     start=ElapsedMilliseconds()
    CompilerEndIf
    stop=start
    Repeat
     event=WaitWindowEvent(1)
     CompilerIf #PB_Compiler_OS=#PB_OS_Windows
      stop=timeGetTime_()
     CompilerElse
      stop=ElapsedMilliseconds()
     CompilerEndIf
     If (stop-start)>=20
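       ;Refresh the progress display roughly every 20 ms (about 50 updates per second).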
      CompilerIf #PB_Compiler_OS=#PB_OS_Windows
       start=timeGetTime_()
      CompilerElse
       start=ElapsedMilliseconds()
      CompilerEndIf
      num=filenum
      pos=filepos
      size=filesize
      If num And pos And size
       filepercent=(pos/size)*100.0
       SetGadgetState(#Gadget1,Int(filepercent))
       SetGadgetText(#Gadget2,"File: "+Str(filenum))
       SetGadgetText(#Gadget3,"Chunksize: "+Str(chunksize))
      EndIf
     EndIf
     If event=#PB_Event_CloseWindow
      quit=#True
      quitprogram=#True
     EndIf
    Until quit
   EndIf
   CloseWindow(#Window1)
  Else
   result=#False
  EndIf
 Else
  quit=#True
  result=#True
 EndIf
 ProcedureReturn result
EndProcedure
CompilerIf #PB_Compiler_OS=#PB_OS_Windows
 timeBeginPeriod_(1)
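 ;Ask the Windows multimedia timer for 1 ms resolution (released again with timeEndPeriod_(1) at the end).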
CompilerEndIf
Define *mem,file$,text$,savefile$,i.i,start.l,stop.l,readlen.i,thread1.i,base.d,time.d,percentage.d
*mem=AllocateMemory(1048576) ;1MB = 1024*1024 bytes = 1048576 bytes, also our largest/starting chunksize.
If *mem
  savefile$=SaveFileRequester("Choose savefile for results:","","Comma Separated Values (CSV)|*.csv",0)
 If savefile$
  If ".csv"<>LCase(Right(savefile$,4))
   savefile$+".csv"
  EndIf
  If CreateFile(#File2,savefile$)
   text$=#DQUOTE$+"Chunksize"+#DQUOTE$+","+#DQUOTE$+"Normalized speed (100%=1MB speed)"+#DQUOTE$+","+#DQUOTE$+"Native speed (ms)"+#DQUOTE$+","+#DQUOTE$+"Native size (Bytes)"+#DQUOTE$+#CRLF$
   WriteString(#File2,text$)
   Repeat
    filenum=0 ;reset filecounter for main loop.
    If IsThread(thread1)
     WaitThread(thread1)
    EndIf
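     ;Start (or restart) the progress window in its own thread so it can update while the read loop is busy.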
    thread1=CreateThread(@Window_Main(),#True)
    chunksize=0
    filesize=0
    filepos=0
  
    file$=OpenFileRequester("Choose file(s) for readspeed test:",file$,"All (*.*)|*.*",0,#PB_Requester_MultiSelection)
    While file$
     If ReadFile(#File1,file$)
      filenum+1 ;File counter for main/outermost loop check.
      filesize=Lof(#File1)
      FileBuffersSize(#File1,0) ;We're gonna test readspeed and optimal buffer size with ReadData,
      ;so it would only complicate the results if we used filebuffers in this case as well.
      text$=""
      chunksize=MemorySize(*mem)
      Repeat
       CompilerIf #PB_Compiler_OS=#PB_OS_Windows
        start=timeGetTime_()
       CompilerElse
        start=ElapsedMilliseconds()
       CompilerEndIf
       FileSeek(#File1,0)
       filepos=0
       While Eof(#File1)=#False
        readlen=ReadData(#File1,*mem,chunksize)
        filepos+readlen
        If quitprogram
         Break
        EndIf
       Wend
       CompilerIf #PB_Compiler_OS=#PB_OS_Windows
        stop=timeGetTime_()
       CompilerElse
        stop=ElapsedMilliseconds()
       CompilerEndIf
       If Not quitprogram
        time=stop-start
        If chunksize=MemorySize(*mem)
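         ;The first pass uses the full 1MB chunk, so its time becomes the 100% baseline.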
         base=time
        EndIf
        If base=0 Or time=0
         percentage=0.0
        Else
         percentage=(base/time)*100.0
        EndIf
        text$+Str(chunksize)+","+Str(Int(percentage))+","+Str(stop-start)+","+Str(filesize)+#CRLF$
        chunksize>>1 ;chunksize/2, our next chunksize to test.
        If chunksize<512 ;The smallest chunksize we'll test. (512 bytes is the traditional disk sector size.)
         ;the smallest value you can use is 0, which means test all chunksizes including 1 byte.
         chunksize=0 ;You could also use a Break instead here, I prefer to run the full loop though.
        EndIf
       Else
        chunksize=0
       EndIf
      Until chunksize=0 ;Make sure the loop ends, we could also use <1 but not sure if that's a faster check.
      text$+#CRLF$
      WriteString(#File2,text$)
      CloseFile(#File1)
      file$=NextSelectedFileName()
     Else
      MessageRequester("Test Readspeed","Unable to open file(s) for reading(!)")
      file$=""
     EndIf
     If quitprogram
      filenum=0
     EndIf
    Wend
  
    Window_Main(#False)
    If IsThread(thread1)
     WaitThread(thread1)
    EndIf
   Until filenum=0 ;If cancel was chosen in OpenFileRequester or no files, we'll quit the program.
   CloseFile(#File2)
  Else
   MessageRequester("Test Readspeed","Unable to create results file(!)")
  EndIf
 Else
  ;Cancel chosen in savefile requester.
 EndIf
 FreeMemory(*mem) : *mem=#Null
Else
 MessageRequester("Test Readspeed","Unable to allocate 1MB memory(!)")
EndIf
CompilerIf #PB_Compiler_OS=#PB_OS_Windows
 timeEndPeriod_(1)
CompilerEndIf

