Challenge: Fastest Way To Parse And Count Words?


Post by BackupUser »

Restored from previous forum. Originally posted by fweil.

Thanks Pupil ... I am now checking your code, but the first difference in counts is probably caused by the 'valid chars' we accept.

Using the chr(39) and A-Z a-z range as you do, I find 792079 words, 14480 unique words on 100117 lines.

Don't know exactly where the differences are then ...

Concerning performance, I will look at how to load the file into memory; it is certainly faster to process than reading with ReadString().

Here is my code :

Code: Select all

NLines.l
NWords.l
CurrentDirectory.s
EOL.s

Dim AsciiConv.s(255)
Dim AllWords.s(10000000)
Dim UniqueWords.s(1000000)
Dim WordCount.l(1000000)

Global NLines, NWords, EOL, CurrentDirectory, AsciiConv, Allwords, UniqueWords, WordCount

Procedure.l IMod(a.l, b.l)
  ProcedureReturn a - (b * (a / b))
EndProcedure

Procedure ParseFile(FileName.s)
  Debug "Parsing : " + FileName
  SetGadgetText(100, "Processing file " + FileName)
  CurrentDirectory = GetPathPart(FileName)
  If ReadFile(0, FileName)
      Repeat
        NLines + 1
        a$ = LTrim(RTrim(ReadString()))
        b$ = ""
        For i = 1 To Len(a$)
          b$ = b$ + AsciiConv(Asc(Mid(a$, i, 1)))
        Next
        
        While FindString(b$, "  ", 1) <> 0
          b$ = ReplaceString(b$, "  ", " ")
        Wend
        
        b$ = LTrim(RTrim(b$))
        If Len(b$) <> 0
            While FindString(b$, " ", 1) <> 0
              AllWords(NWords) = Mid(b$, 1, FindString(b$, " ", 1) - 1)
              NWords + 1
              b$ = Mid(b$, FindString(b$, " ", 1) + 1, Len(b$) - FindString(b$, " ", 1) - 1 + 1)
            Wend
            AllWords(NWords) = b$
            NWords + 1
        EndIf
        
        If IMod(NLines, 2500) = 0
            StatusBarText(0, 0, "Parsing line #" + Str(NLines) + " ... found " + Str(NWords) + " words.", 0)
        EndIf
      Until Eof(0)
      CloseFile(0)
      StatusBarText(0, 0, "Parsing line #" + Str(NLines) + " ... found " + Str(NWords) + " words.", 0)
  EndIf
EndProcedure

;
;
;

  Quit.l = #FALSE
  WindowXSize.l = 320
  WindowYSize.l = 240

  CurrentDirectory = Space(255)
  GetCurrentDirectory_(255, @CurrentDirectory)
  
  EOL.s = Chr(13) + Chr(10)
  
  For i = 0 To 255
    If (i >= 'A' And i <= 'Z') Or (i >= 'a' And i <= 'z') Or i = 39
        AsciiConv(i) = Chr(i)
      Else
        AsciiConv(i) = " "
    EndIf
  Next
  
  hWnd.l = OpenWindow(0, 200, 200, WindowXSize, WindowYSize, #PB_Window_SystemMenu | #PB_Window_MinimizeGadget | #PB_Window_MaximizeGadget | #PB_Window_SizeGadget | #PB_Window_TitleBar, "MyWindow")
  If hWnd
      AddKeyboardShortcut(0, #PB_Shortcut_Escape, 99)
      LoadFont(0, "Verdana", 12)
      FontID.l = FontID()
      If CreateMenu(0, WindowID())
        OpenSubMenu("General")
          MenuItem(11, "Open file")
          MenuItem(99, "Quit")
        CloseSubMenu()
      EndIf
      If CreateStatusBar(0, WindowID())
          StatusBarText(0, 0, "Idle ...", 0)
      EndIf
      If CreateGadgetList(WindowID())
          SetGadgetFont(FontID)
          TextGadget(100, 10, 10, WindowXSize - 20, WindowYSize - 40, "")
      EndIf
      SetGadgetText(100, "Select a file to process ...")
      Repeat
        Select WaitWindowEvent()
          Case #PB_EventCloseWindow
            Quit = #TRUE
          Case #PB_EventMenu
            Select EventMenuID()
              Case 11
                FileName.s = OpenFileRequester("Select a file", CurrentDirectory + "\" + "*.txt", "Text files|*.txt|All files|*.*", 0, #PB_Requester_MultiSelection)
                NLines.l = 0
                NWords.l = 0
                tz.l = GetTickCount_()
                ParseFile(FileName)
                NWords - 1
                SetGadgetText(100, "File : " + FileName + EOL + "Lines : " + Str(NLines) + EOL + "Words : " + Str(NWords + 1))
                
                SortArray(AllWords(), 2, 0, NWords)
                
                j = 0
                UniqueWords(j) = AllWords(j)
                WordCount(j) = 1
                For i = 1 To NWords
                  If AllWords(i) <> AllWords(i - 1)
                      j + 1
                      UniqueWords(j) = AllWords(i)
                      WordCount(j) = 1
                    Else
                      WordCount(j) + 1
                  EndIf
                Next
                
                NUniqueWords.l = j

                SetGadgetText(100, "File : " + FileName + EOL + "Lines : " + Str(NLines) + EOL + "Words : " + Str(NWords + 1) + EOL + "Unique words : " + Str(NUniqueWords + 1) + EOL + "Done in " + Str(GetTickCount_() - tz) + "ms")
                
                If CreateFile(0, "result.txt")
                    For z = 0 To NUniqueWords
                      WriteStringN(Str(z) + " " + UniqueWords(z) + Chr(9) + Chr(9) + Str(WordCount(z)))
                    Next
                    CloseFile(0)
                EndIf
                ShellExecute_(hWnd,"open","result.txt","","",#SW_SHOWNORMAL)
                
              Case 99
                Quit = #TRUE
            EndSelect
        EndSelect
      Until Quit
      
  EndIf

End
Nice case study ...

Rgrds

Francois Weil
14, rue Douer
F64100 Bayonne

Post by BackupUser »

Restored from previous forum. Originally posted by Pupil.
Originally posted by fweil

Thanks Pupil ... I am now checking your code, but the first difference in counts is probably caused by the 'valid chars' we accept.

Using the chr(39) and A-Z a-z range as you do, I find 792079 words, 14480 unique words on 100117 lines.

Don't know exactly where the differences are then ...

Concerning performance, I will look at how to load the file into memory; it is certainly faster to process than reading with ReadString().
Ok, I found that you store the following as two unique words instead of one, for example:

hello Hello

If I use the LCase() command on the string that is read, I get the same result with both sources.
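
To make the point concrete, here is a minimal sketch (mine, not part of either program): folding case before comparing makes "hello" and "Hello" land on the same entry instead of counting as two unique words.

Code: Select all

; Minimal sketch: normalize case before counting so that "hello" and
; "Hello" are treated as one word instead of two unique words.
a$ = "hello"
b$ = "Hello"
If LCase(a$) = LCase(b$)
  Debug "same word once both are lower-cased"
EndIf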

BTW, it took 17 seconds to parse the King James text file on my computer...

Post by BackupUser »

Restored from previous forum. Originally posted by Franco.

Francois, I just tested your code and looked through the result.txt file.
The discrepancy in the count of different words may come from your code not working properly.
In the result.txt file you will see that you have:
...
5312 GOD 2
5313 Go 12
5314 GOD 52
5315 Go 223
5316 Goal 1
5317 Goath 1
5318 Gob 2
5319 God 74
5320 GOD 246
...

So lines 5312, 5314 and 5320 seem to have the same value, as do 5313 and 5315...

Have a nice day...

Franco

Post by BackupUser »

Restored from previous forum. Originally posted by fweil.

Alright, some bugs still remain ...

Working on this made me discover one tricky thing. Depending on where files come from, line separators may be CR, LF, CRLF or LFCR.

I checked Horst's source code for loading a file into memory, and had bugs loading the King James file with it. I finally found out they were caused by incompatible line separators.

As I am not a good ASM coder, I rewrote Horst's code entirely in PureBasic.

Counting lines with it now gives 75231 lines.

I'll be right back with something faster and accurate.
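
As a side note, the different separators can also be folded down to a single LF with ReplaceString() before splitting a buffer into lines. A minimal sketch (mine, not part of Horst's or fweil's code):

Code: Select all

; Sketch: normalize CRLF, LFCR and lone CR to a single LF so the same
; splitting code works whatever the file's origin.
text$ = "one" + Chr(13) + Chr(10) + "two" + Chr(10) + "three" + Chr(13)
text$ = ReplaceString(text$, Chr(13) + Chr(10), Chr(10)) ; CRLF -> LF
text$ = ReplaceString(text$, Chr(10) + Chr(13), Chr(10)) ; LFCR -> LF
text$ = ReplaceString(text$, Chr(13), Chr(10))           ; lone CR -> LF
Debug text$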

Here is Horst's code that I updated.

Code: Select all

#FileBufferMem = 0
 
Global MemFileOffset.l, MemFileSize.l, *FileBuffer.l

Procedure.l LoadFileToMem(fileID,fname.s) 
  Protected fileID,fname 
  If ReadFile(fileID,fname)
      MemFileSize = Lof()
      Debug "MemFileSize = " + Str(MemFileSize)
      *FileBuffer = AllocateMemory(#FileBufferMem,MemFileSize,0)
      If *FileBuffer
          ReadData(*FileBuffer,MemFileSize)
      EndIf 
      CloseFile(fileID)
      MemFileOffset = 0 ; reset 
  EndIf 
  ProcedureReturn *FileBuffer
EndProcedure 

Procedure MoreInMem()
  If MemFileOffset < MemFileSize
      ProcedureReturn #TRUE
  EndIf
  ProcedureReturn #FALSE
EndProcedure

Procedure.s ReadLineFromMem()
  ; scan from the current offset until a line separator (LF or CR) or
  ; the end of the buffer is reached
  Start = *FileBuffer + MemFileOffset - 1
  Length = 0
  Repeat
    Length + 1
    Byte = PeekB(Start + Length)
  Until Byte = 10 Or Byte = 13 Or MemFileOffset + Length >= MemFileSize
  Skip = 1
  If Byte <> 10 And Byte <> 13
      ; end of buffer reached without a separator : keep the whole line
      Skip = 0
  ElseIf MemFileOffset + Length < MemFileSize
      ; check for a two byte separator (CRLF or LFCR)
      Byte = PeekB(Start + Length + 1)
      If Byte = 10 Or Byte = 13
          Length + 1
          Skip + 1
      EndIf
  EndIf
  MemFileOffset + Length
  ProcedureReturn PeekS(Start + 1, Length - Skip)
EndProcedure

Procedure CloseFileMem()
  FreeMemory(#FileBufferMem)
EndProcedure

Procedure example(sourcefile.s)
  If LoadFileToMem(0,sourcefile)
      While MoreInMem()
        Line.s = ReadLineFromMem()
        NLines + 1
        ; do anything with line
      Wend 
      CloseFileMem()
    Else 
      ; error ..
  EndIf 
  Debug NLines
EndProcedure 

;
;
;

  example("king-james.txt")
  
End
Francois Weil
14, rue Douer
F64100 Bayonne

Post by BackupUser »

Restored from previous forum. Originally posted by fweil.

To Franco firstly ... thank you for pointing out this issue.

It gave me some trouble to understand what was wrong in my code when working on such large files and counting words. But I finally added some instructions to copy the AllWords array before and after SortArray(), and noticed that using option parameter 2 introduces bugs when sorting the array. This value is for sorting in ascending, no-case order.

I did not test further to understand what is wrong there, or whether it is the same using numbers instead of strings, but the SortArray() function is bugged somewhere.

Months ago I ran a test which made me think there was such a bug, but I could not find a good benchmark to point it out to Fred.

Now the benchmark exists.

Well, this challenge to parse and count words is finished for me, as I went with the file-to-memory concept. It improves performance well! And, with my update of Horst's code, it solves the issue caused by the order of CR and LF in line delimiters.

To close my code, I finally removed option 2 for SortArray(), and the program now counts words accurately.
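
A small illustration of the symptom (my sketch, not taken from the program): with the no-case option the same spelling can end up in several places, so the AllWords(i) <> AllWords(i - 1) test starts a new 'unique' word more than once. With a plain case-sensitive sort, equal spellings stay adjacent and the test works:

Code: Select all

; Sketch: a case-sensitive ascending sort keeps identical spellings
; adjacent, which is what the duplicate check below relies on.
Dim Words.s(5)
Words(0) = "GOD" : Words(1) = "Go"  : Words(2) = "God"
Words(3) = "GOD" : Words(4) = "go"  : Words(5) = "God"
SortArray(Words(), 0, 0, 5)       ; option 0 : ascending, case sensitive
Debug Words(0)                    ; the first word is always 'new'
For i = 1 To 5
  If Words(i) <> Words(i - 1)     ; equal spellings are now adjacent
    Debug Words(i)
  EndIf
Next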

My results for the King James file are now :

Lines 75231
Words 856121
Unique words 14185
Done in about 10 seconds !!!!!

You know what, I am happy!

And for interested people, here is my latest code. You will notice I do not use just Chr(39) and the A-Z a-z range as word characters. For personal reasons, a few more characters are left to be considered part of words.

Right now I will continue coding from there, but I will not share the following versions I am thinking of. I will share other stuff on the forum of course ... :)

Code: Select all

NLines.l
NWords.l
CurrentDirectory.s
EOL.s

Dim AsciiConv.s(255)
Dim AllWords.s(10000000)
Dim UniqueWords.s(1000000)
Dim WordCount.l(1000000)

Global NLines, NWords, EOL, CurrentDirectory, AsciiConv, Allwords, UniqueWords, WordCount

#FileBufferMem = 0
 
Global MemFileOffset.l, MemFileSize.l, *FileBuffer.l

Procedure.l LoadFileToMem(fileID,fname.s) 
  Protected fileID,fname 
  If ReadFile(fileID,fname)
      MemFileSize = Lof()
      Debug "MemFileSize = " + Str(MemFileSize)
      *FileBuffer = AllocateMemory(#FileBufferMem,MemFileSize,0)
      If *FileBuffer
          ReadData(*FileBuffer,MemFileSize)
      EndIf 
      CloseFile(fileID)
      MemFileOffset = 0 ; reset 
  EndIf 
  ProcedureReturn *FileBuffer
EndProcedure 

Procedure MoreInMem()
  If MemFileOffset < MemFileSize
      ProcedureReturn #TRUE
  EndIf
  ProcedureReturn #FALSE
EndProcedure

Procedure.s ReadLineFromMem()
  ; scan from the current offset until a line separator (LF or CR) or
  ; the end of the buffer is reached
  Start = *FileBuffer + MemFileOffset - 1
  Length = 0
  Repeat
    Length + 1
    Byte = PeekB(Start + Length)
  Until Byte = 10 Or Byte = 13 Or MemFileOffset + Length >= MemFileSize
  Skip = 1
  If Byte <> 10 And Byte <> 13
      ; end of buffer reached without a separator : keep the whole line
      Skip = 0
  ElseIf MemFileOffset + Length < MemFileSize
      ; check for a two byte separator (CRLF or LFCR)
      Byte = PeekB(Start + Length + 1)
      If Byte = 10 Or Byte = 13
          Length + 1
          Skip + 1
      EndIf
  EndIf
  MemFileOffset + Length
  ProcedureReturn PeekS(Start + 1, Length - Skip)
EndProcedure

Procedure CloseFileMem()
  FreeMemory(#FileBufferMem)
EndProcedure

Procedure.l IMod(a.l, b.l)
  ProcedureReturn a - (b * (a / b))
EndProcedure

Procedure ParseFile(FileName.s)
  Debug "Parsing : " + FileName
  SetGadgetText(100, "Processing file " + FileName)
  CurrentDirectory = GetPathPart(FileName)
  If LoadFileToMem(0, FileName)
      While MoreInMem()
        NLines + 1
        a$ = LTrim(RTrim(ReadLineFromMem()))
        b$ = ""
        For i = 1 To Len(a$)
          b$ = b$ + AsciiConv(Asc(Mid(a$, i, 1)))
        Next
        While FindString(b$, "  ", 1) <> 0
          b$ = ReplaceString(b$, "  ", " ")
        Wend
        b$ = LTrim(RTrim(b$))
        If Len(b$) <> 0
            While FindString(b$, " ", 1) <> 0
              AllWords(NWords) = Mid(b$, 1, FindString(b$, " ", 1) - 1)
              NWords + 1
              b$ = Mid(b$, FindString(b$, " ", 1) + 1, Len(b$) - FindString(b$, " ", 1) - 1 + 1)
            Wend
            AllWords(NWords) = b$
            NWords + 1
        EndIf
        If IMod(NLines, 2500) = 0
            StatusBarText(0, 0, "Parsing line #" + Str(NLines) + " ... found " + Str(NWords) + " words.", 0)
        EndIf
      Wend
      StatusBarText(0, 0, "Parsing line #" + Str(NLines) + " ... found " + Str(NWords) + " words.", 0)
  EndIf
EndProcedure

;
;
;

  Quit.l = #FALSE
  WindowXSize.l = 320
  WindowYSize.l = 240

  CurrentDirectory = Space(255)
  GetCurrentDirectory_(255, @CurrentDirectory)
  
  EOL.s = Chr(13) + Chr(10)
  
  For i = 0 To 255
    AsciiConv(i) = Chr(i)
  Next
  
  AsciiConv(Asc(".")) = " "
  AsciiConv(Asc(",")) = " "
  AsciiConv(Asc(":")) = " "
  AsciiConv(Asc(";")) = " "
  AsciiConv(Asc("+")) = " "
  AsciiConv(Asc("-")) = " "
  AsciiConv(Asc("*")) = " "
  AsciiConv(Asc("/")) = " "
  AsciiConv(Asc("(")) = " "
  AsciiConv(Asc(")")) = " "
  AsciiConv(Asc("[")) = " "
  AsciiConv(Asc("]")) = " "
  AsciiConv(Asc("'")) = " "
  AsciiConv(Asc("!")) = " "
  AsciiConv(Asc("?")) = " "
  AsciiConv(Asc("{")) = " "
  AsciiConv(Asc("}")) = " "
  AsciiConv(Asc("=")) = " "
  AsciiConv(Asc("<")) = " "
  AsciiConv(Asc(">")) = " "
  AsciiConv(Asc(Chr(34))) = " "
  AsciiConv(Asc(Chr(9))) = " "
  
  hWnd.l = OpenWindow(0, 200, 200, WindowXSize, WindowYSize, #PB_Window_SystemMenu | #PB_Window_MinimizeGadget | #PB_Window_MaximizeGadget | #PB_Window_SizeGadget | #PB_Window_TitleBar, "MyWindow")
  If hWnd
      AddKeyboardShortcut(0, #PB_Shortcut_Escape, 99)
      LoadFont(0, "Verdana", 12)
      FontID.l = FontID()
      If CreateMenu(0, WindowID())
        OpenSubMenu("General")
          MenuItem(11, "Open file")
          MenuItem(99, "Quit")
        CloseSubMenu()
      EndIf
      If CreateStatusBar(0, WindowID())
          StatusBarText(0, 0, "Idle ...", 0)
      EndIf
      If CreateGadgetList(WindowID())
          SetGadgetFont(FontID)
          TextGadget(100, 10, 10, WindowXSize - 20, WindowYSize - 40, "")
      EndIf
      SetGadgetText(100, "Select a file to process ...")
      Repeat
        Select WaitWindowEvent()
          Case #PB_EventCloseWindow
            Quit = #TRUE
          Case #PB_EventMenu
            Select EventMenuID()
              Case 11
                FileName.s = OpenFileRequester("Select a file", CurrentDirectory + "\" + "*.txt", "Text files|*.txt|All files|*.*", 0, #PB_Requester_MultiSelection)
                NLines.l = 0
                NWords.l = 0
                tz.l = GetTickCount_()
                ParseFile(FileName)
                NWords - 1
                SetGadgetText(100, "File : " + FileName + EOL + "Lines : " + Str(NLines) + EOL + "Words : " + Str(NWords + 1))

                SortArray(AllWords(), 0, 0, NWords)

                j = 0
                UniqueWords(j) = AllWords(j)
                WordCount(j) = 1
                For i = 1 To NWords
                  If AllWords(i) <> AllWords(i - 1)
                      j + 1
                      UniqueWords(j) = AllWords(i)
                      WordCount(j) = 1
                    Else
                      WordCount(j) + 1
                  EndIf
                Next
                NUniqueWords.l = j
                SetGadgetText(100, "File : " + FileName + EOL + "Lines : " + Str(NLines) + EOL + "Words : " + Str(NWords + 1) + EOL + "Unique words : " + Str(NUniqueWords + 1) + EOL + "Done in " + Str(GetTickCount_() - tz) + "ms")
                If CreateFile(0, "result.txt")
                    For z = 0 To NUniqueWords
                      WriteStringN(Str(z) + " " + UniqueWords(z) + Chr(9) + Chr(9) + Str(WordCount(z)))
                    Next
                    CloseFile(0)
                EndIf
                ShellExecute_(hWnd,"open","result.txt","","",#SW_SHOWNORMAL)
              Case 99
                Quit = #TRUE
            EndSelect
        EndSelect
      Until Quit
  EndIf
End

Francois Weil
14, rue Douer
F64100 Bayonne

Post by BackupUser »

Restored from previous forum. Originally posted by Pupil.

Well, as fweil has posted his final open version, I'll do the same.

@fweil I borrowed some (all) of your GUI code (because I'm lazy), hope you don't mind.

Here are some performance numbers for fweil's version and my version, running on an AMD Duron 800MHz with 384MB RAM and a 7200 rpm HD:

Code: Select all

Test run on the King James's text file..
Version:       Lines:        Words:       Unique words:   Time to complete:
fweil's        75231         856121       14185           10765ms (approx 10.8 s)
My             100117        792079       13163           5208ms (approx 5.2 s)
I use lower case on all words found and fweil doesn't - that explains the big difference in the number of unique words found by the two versions. I also discard numbers as words, i.e. '12' is not considered a word. But my version has other weaknesses that make it think, under certain conditions, that non-words are in fact words...
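
For clarity, here is the acceptance rule described above as a stand-alone check (a sketch of mine; the real filter is the charhash() table in the code below): letters and the apostrophe count as word characters, digits and punctuation do not.

Code: Select all

; Sketch: the per-character test - '7' is rejected, so a token such as
; "12" never reaches the word list.
c = Asc("7")
If (c >= 'A' And c <= 'Z') Or (c >= 'a' And c <= 'z') Or c = 39
  Debug "word character"
Else
  Debug "separator"
EndIf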
The line differences... fweil, I think there are only two kinds of line separators, the UNIX kind (LF) and the DOS kind (CRLF). I'm pretty certain that my line count is correct; UltraEdit and other programs tell me that it is, anyway.

Enough analyzing the results of the two versions; here is, finally, my last contribution (code-wise anyway) to this forum thread:

Code: Select all

; Count words in a text file or whatever.
; By Pupil 2/11 -2002
;

Structure Chartype
  StructureUnion
    c.b
    a.b[2]
    w.w
  EndStructureUnion
EndStructure

Structure UniqueWordsType
  Word.s
  Count.l
EndStructure

Global UniqueWords.l, TotalWords, TotalLines.l
Dim hash.l(65536)
Dim charhash.b(255)
NewList WordCount.UniqueWordsType()

Declare ParseString(*ptr.CharType)
Declare SortWords()

; Sort the linked list and store result in an array.
Procedure SortWords()
DefType.b i, j
DefType.w count, comp
DefType.l arraycount
DefType.CharType *pcount, *pcomp
  Dim UniqueWordArray.UniqueWordsType(UniqueWords)
  arraycount = 0
  *pcount = @count : *pcomp = @comp
  For j = 0 To 'z'
      *pcount\a[0] = j
    For i = 0 To 'z'
      *pcount\a[1] = i
      If hash(count)
        ChangeCurrentElement(WordCount(), hash(count))
        hash(count) = 0 ; clear hash for next file
        Repeat
          UniqueWordArray(arraycount)\Word = WordCount()\Word
          UniqueWordArray(arraycount)\Count = WordCount()\Count
          arraycount + 1
          If NextElement(WordCount()) = 0
            *pcomp = @comp
          Else
            *pcomp = @WordCount()\Word
          EndIf
        Until *pcomp\w <> count
      EndIf
    Next
  Next
  ClearList(WordCount()) ; clear linked list to prepare for next file
EndProcedure

; Count lines, words and unique words and
; store those unique ones in a list.
Procedure ParseString(*ptr.CharType)
DefType.l start, quit
DefType.s word
DefType.CharType *p, *d
  start = *ptr
  While *ptr\c <> 0
    If charhash(*ptr\c)
      start = *ptr
      Repeat
        *ptr+1
      Until charhash(*ptr\c) = #FALSE Or *ptr\c = 0
      word = LCase(PeekS(start, *ptr-start))
      If word <> ""
        *p = @word
        TotalWords+1
        If hash(*p\w)
          ChangeCurrentElement(WordCount(), hash(*p\w))
          quit = #FALSE
          Repeat
            *d = @WordCount()\Word
            If *d\w <> *p\w
              ; walked past this prefix group without a match : the word is
              ; new, insert it right after the group's first element so the
              ; group stays contiguous for SortWords()
              ChangeCurrentElement(WordCount(), hash(*p\w))
              AddElement(WordCount())
              WordCount()\Word = word
              WordCount()\Count = 1
              UniqueWords + 1
              quit = #TRUE
            ElseIf WordCount()\Word = word
              WordCount()\Count + 1
              quit = #TRUE
            ElseIf NextElement(WordCount()) = 0
              ; end of the list reached : append the new word to this group
              AddElement(WordCount())
              WordCount()\Word = word
              WordCount()\Count = 1
              UniqueWords + 1
              quit = #TRUE
            EndIf
          Until quit
        Else
          ; first word seen with this two character prefix : append it to
          ; the list and remember its element in the hash table
          LastElement(WordCount())
          AddElement(WordCount())
          WordCount()\Word = word
          WordCount()\Count = 1
          UniqueWords + 1
          hash(*p\w) = @WordCount()
        EndIf
      EndIf
    Else
      If *ptr\c = 10 ; LF marks the end of a line
        TotalLines + 1
      EndIf
      *ptr + 1
    EndIf
  Wend
EndProcedure

; characters accepted as part of a word : letters and the apostrophe
For i = 0 To 255
  If (i >= 'A' And i <= 'Z') Or (i >= 'a' And i <= 'z') Or i = 39
    charhash(i) = #TRUE
  Else
    charhash(i) = #FALSE
  EndIf
Next


; GUI borrowed from fweil, hope you don't mind..
Quit.l = #FALSE
EOL.s = Chr(13)+Chr(10)

If OpenWindow(0, 200, 200, 320, 240, #PB_Window_SystemMenu | #PB_Window_TitleBar, "Word Count")

  AddKeyboardShortcut(0, #PB_Shortcut_Escape, 99)

  If CreateMenu(0, WindowID())
    OpenSubMenu("General")
      MenuItem(11, "Open file")
      MenuItem(99, "Quit")
    CloseSubMenu()
  EndIf

  If CreateStatusBar(0, WindowID())
    StatusBarText(0, 0, "Idle ...", 0)
  EndIf

  If CreateGadgetList(WindowID())
    TextGadget(100, 10, 10, 300, 160, "")
  EndIf

  SetGadgetText(100, "Select a file to process ...")

  Repeat
    Select WaitWindowEvent()
      Case #PB_EventCloseWindow
        Quit = #TRUE
      Case #PB_EventMenu
        Select EventMenuID()
          Case 11
            fileName.s = OpenFileRequester("Select a file", "*.txt", "Text files|*.txt|All files|*.*", 0)
            tz.l = GetTickCount_()
            If ReadFile(0, filename)
              length.l = Lof()
              *buffer = AllocateMemory(0, length+2)
              If *buffer
                UniqueWords = 0 : TotalWords = 0 : TotalLines = 0
                ReadData(*buffer, length)
                ParseString(*buffer)
                FreeMemory(0)
              EndIf
              CloseFile(0)
              SortWords()
            EndIf
            SetGadgetText(100, "File: "+fileName+EOL+"Lines: "+Str(TotalLines)+EOL+"Words: "+Str(TotalWords)+EOL+"Unique words: "+Str(UniqueWords)+EOL+"Done in "+Str(GetTickCount_() - tz)+"ms")
            If CreateFile(1, "UniqueWords.txt")
              WriteStringN("Lines: "+Str(TotalLines))
              WriteStringN("Words: "+Str(TotalWords))
              WriteStringN("Unique words: "+Str(UniqueWords))

              For i = 0 To UniqueWords-1
                WriteStringN(Str(i)+" "+UniqueWordArray(i)\Word+Chr(9)+Chr(9)+Str(UniqueWordArray(i)\Count))
              Next
              CloseFile(1)
              ShellExecute_(hWnd,"open","UniqueWords.txt","","",#SW_SHOWNORMAL)
            EndIf
          Case 99
            Quit = #TRUE
        EndSelect
    EndSelect
  Until Quit
EndIf
End

Post by BackupUser »

Restored from previous forum. Originally posted by fweil.

Pupil,

You made a very nice example of how to use linked lists and pointers.

This would be a good code example for tricks'n tips and the resource site. Thank you for sharing your knowledge.
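
For readers studying Pupil's code above, the trick worth noting is the CharType union, which lets the first two characters of a word be read back as one 16-bit value and used directly as an index into the hash() array. A minimal sketch (mine, and it assumes an ASCII build of PureBasic, as strings were in those days):

Code: Select all

; Sketch: reading the first two characters of a string as a single
; 16-bit value through a structure union.
Structure CharType
  StructureUnion
    c.b
    a.b[2]
    w.w
  EndStructureUnion
EndStructure

word$ = "go"
*p.CharType = @word$
Debug *p\w   ; 'g' + 256 * 'o' = 103 + 256 * 111 = 28519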

KRgrds


Francois Weil
14, rue Douer
F64100 Bayonne

Post by BackupUser »

Restored from previous forum. Originally posted by Midebor.

Hi,

As I have an older machine, a Pentium running at 120MHz with 64MB RAM,
here are the benchmarks for both programs:

Test run on the King James's text file..
Version:       Lines:        Words:       Unique words:   Time to complete:
fweil's        31348         853725       14912           261214ms
Pupil          31347         820846       12879           63424ms

The differences in line count and unique words are explained above,
as the selections differ, but the time to complete shows a
SIGNIFICANT difference.

The purpose of this post is not to enter into a polemic, but
just to learn about different approaches to the same problem,
and to find out that small timing differences on today's fast
machines can reveal significant differences on older computers.

Michel

Post by BackupUser »

Restored from previous forum. Originally posted by fweil.

Nice of you to give such feedback, Michel.

I suppose that the main difference between the two code samples concerns the use of arrays (in mine) versus linked lists (in Pupil's).

Pupil knows linked lists better than I do and could code them well. I preferred to stay with my old arrays.

That makes a big difference when running the two programs, mainly because of the amount of memory to be reserved at program start, if I am not mistaken (look at the memory consumed by both programs in the Task Manager).
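
A side-by-side of the two approaches mentioned here (a sketch of mine, just to make the memory point concrete):

Code: Select all

; Sketch: the array version reserves room for ten million strings before
; the first word is read; a linked list only grows as elements are added.
Dim AllWords.s(10000000)   ; reserved up front, used or not

NewList Words.s()          ; empty until AddElement() is called
AddElement(Words())
Words() = "example"
Debug Words()              ; -> example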

As you have 'small' CPU and memory resources, that probably explains the differences you found.

I suppose it is not only the CPU speed, but a mix of CPU speed, memory size, and maybe cache memory and bus speed too, that makes up the 'global performance' of a PC when benchmarking, plus the disk and interface speed for saving and re-opening files.

Using Pupil's source on my machine, I get a time a bit longer than he gives in his post, but maybe my disk is slower, or whatever else ... something is different.

One thing is true for sure: your PC has a CPU roughly 10 times slower than mine and 3 times less memory, which explains why you need roughly 30 times more time to run the same application (it is not exact, in fact). Maybe the rule is more complicated because of the memory actually remaining available once the OS is running.

Anyway, you are right to say we should always keep in mind that we have such powerful machines that we sometimes do not optimize our code as much as we could. I think Pupil's version is the best from an optimization point of view; obviously I would not have been able to write the same.

KRgrds

Francois Weil
14, rue Douer
F64100 Bayonne

Post by BackupUser »

Restored from previous forum. Originally posted by naw.
Originally posted by cor

I'm also looking for a fast way to parse and replace the following with some code.
[%BACKGROUNDCOLOR%] must be replaced by some predefined or generated code.


[%BACKGROUNDCOLOR%]
Hi,
Well, Netscape used to support JavaScript Entities, which were a very powerful means of adding in-line JS to HTML TAG values:
example:

<HTML>
<HEAD>
<SCRIPT LANGUAGE="JavaScript">
pen1="#FF0000"
pen2="#FFFF00"
pen3="#FFFFFF"
</SCRIPT>
</HEAD>
<BODY BGCOLOR="&{pen1};" TEXT="&{pen2};" LINK="&{pen3};">
Hello World
</BODY>
</HTML>

But I'm not sure if MS ever adopted this technique (probably not, because this is far too obvious and neat, and MS only do clever things the hardest way possible). It certainly worked for Navigator 3 & 4. Don't know about other browsers.

You could also use in-line JS - but this is kind of a painful way to do it.

Style Sheets would really be the *proper* way to do it of course.
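
For the [%BACKGROUNDCOLOR%] question quoted at the top of this post, a plain ReplaceString() call is usually enough on the PureBasic side. A minimal sketch (mine, with a made-up template string):

Code: Select all

; Sketch: substitute the placeholder with a generated value.
template$ = "<BODY BGCOLOR=" + Chr(34) + "[%BACKGROUNDCOLOR%]" + Chr(34) + ">"
color$    = "#FF0000"
Debug ReplaceString(template$, "[%BACKGROUNDCOLOR%]", color$)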