a set of procedures for KI´s (genetic algorithms)

Applications, Games, Tools, User libs and useful stuff coded in PureBasic
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

a set of procedures for KI´s (genetic algorithms)

Post by Hurga »

Hi everyone

I want to use a kind of KI (AI) in a game project, so I needed a KI... (sounds logical, doesn't it?) 8)

So I put my procedures together to make them easier to use in different projects.

The main idea was to implement a kind of genetic algorithm.
The number of inputs and outputs can be set freely.
Up to 3 hidden neuronal layers are possible (most of the time 1 is plenty).
The next generation can be created with several parameters, which include...
- Crossover modes: single point, double point, multipoint and scramble
- Crossover rate
- Mutation rate
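To make the crossover modes and the mutation rate concrete, here is a minimal sketch in Python (the library itself is PureBasic, so the function names and cut-point parameters below are mine, not the include's API; the include also expresses the mutation rate on a 0-100 scale, while this sketch uses a 0-1 probability):

```python
import random

def single_point_crossover(a, b, cut):
    # Child takes genes [0..cut) from parent a and the rest from parent b.
    return a[:cut] + b[cut:]

def double_point_crossover(a, b, cut1, cut2):
    # The middle segment comes from parent b, the outer parts from parent a.
    return a[:cut1] + b[cut1:cut2] + a[cut2:]

def mutate(genome, rate):
    # Each gene is replaced by a fresh random weight with probability `rate`.
    return [random.uniform(-1.0, 1.0) if random.random() < rate else g
            for g in genome]
```

With a small mutation rate almost all genes survive a generation unchanged, which keeps the search local; crossover does the large-scale mixing.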

There is also a proc to calculate the similarity of two genomes. I didn't use it in the example, but in other, more complex situations it may be useful to kill some genomes that score high but are too similar to each other, because that can prevent better results.
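The idea of culling high-scoring but near-identical genomes can be sketched like this (Python again, with a made-up similarity metric; the include's actual formula may differ):

```python
def genome_similarity(a, b):
    # Fraction of weight pairs that are (nearly) equal; 1.0 = identical.
    matches = sum(1 for x, y in zip(a, b) if abs(x - y) < 0.05)
    return matches / len(a)

def cull_near_duplicates(population, scores, threshold=0.9):
    # Drop a genome if an already-kept genome is too similar and scores
    # at least as well; this preserves diversity among the survivors.
    keep = []
    for i, g in enumerate(population):
        dominated = any(genome_similarity(g, population[j]) > threshold
                        and scores[j] >= scores[i] for j in keep)
        if not dominated:
            keep.append(i)
    return [population[i] for i in keep]
```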

So here you can find it...
http://purebasic.stdojedmahr.de/schnips ... orithm.zip
(zipped, 6.3 KB)

It includes one file containing a (very simple) example of how to use it.
There are objects that should move towards an attractor....
The KI´s are stored with their generation and score (make sure you create this dir or change the path... otherwise it may not work properly).

Feedback and bug reports (I hope I found all the bugs, but... who knows) are welcome.

Maybe someone can make use of it.
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

Hm, how should I interpret this "no replies"... :?

Is the code so bad that nobody finds it worth a comment?

Or is nobody interested in this?
Num3
PureBasic Expert
Posts: 2812
Joined: Fri Apr 25, 2003 4:51 pm
Location: Portugal, Lisbon
Contact:

Post by Num3 »

I've tested your code!

It's quite interesting indeed, but it takes some time to read and understand ;)

Maybe you can make a more usable example for other coders to follow...
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

OK, I tried to comment the example code a bit better; maybe that helps...

Code: Select all

; (better) commented example
; just a simple task -> move to the attractor and stay there
; in fact this example is easier to code directly... but anyway - it's just to show how it works

; To train (and later use) an AI, it's important to score the fitness of the AI for the given task.
; For this, \genAlgStrenght_I has to be set for each AI.
; The better the aim is reached, the higher the score should be.
; In this example the score is calculated from how far the position is from the attractor

; for training, play around with these settings:
; Line 36 - the type of the AI (hidden layers)
; Lines 22,23 - how many KI´s are used
; Line 202 - the settings for creating a new AI



; create the basic structure of an AI
; create genomes
#MaxKI = 25 ; number of KI´s
#TrainingGenerations = 100
#MaxRounds = 800
#WND_Main = 0
#ScreenWidth = 1024
#ScreenHeight = 768


SavePath_S.s = "C:\temp\KITraining\" ; it's important that the path exists (can be any path...)

Dim TestAI.BasisgenAlg(#MaxKI)

For x = 0 To #MaxKI
  ; Debug @TestAI(x)
  gA_New_genAlg(@TestAI(x), 0, 2, 2, 1, 10) ; play around with these settings for different KI´s
  ; gA_New_genAlg(Pointer to the new AI, genAlgArt_I.i, Inputs_I.i, Outputs_I.i, HiddenLayer_I.i, HiddenLayerNeurons_I.i)
  ; genAlgArt_I = can be used to store a value
  ; Inputs_I = number of inputs for the AI
  ; Outputs_I = how many output values the AI gives
  ; HiddenLayer_I = how many layers are between input and output
  ; HiddenLayerNeurons_I = the number of "neurons" in each hidden layer - for simple tasks, no hidden layer is needed. Most of the time, more than 1 hidden layer
  ; makes it a lot more difficult to train an AI
  
  gA_RandomizeWeights(@TestAI(x))
Next


; for the task
Structure bng
  PosX_I.i
  PosY_I.i
  KI_nr_I.i
EndStructure
Global Dim Being.bng(#MaxKI)
Global Attraktor.POINT

If Not InitSprite() Or Not InitMouse() Or Not InitKeyboard()
  MessageRequester("Error", "Can't init DirectX")
  End
Else
  If OpenWindow(#WND_Main, 0, 0, #ScreenWidth, #ScreenHeight, ""); | #PB_Window_ScreenCentered)
    Result = OpenWindowedScreen(WindowID(#WND_Main), 0, 0, #ScreenWidth, #ScreenHeight, 0, 0, 0)
  EndIf
  
EndIf


Procedure DO_Test_Task(*Being.bng) ; let the AI do their job
  *genAlg.BasisgenAlg
  
  ; prepare the input - here you give your AI the inputs
  inputs(0) = GetAngleFromPointsGrad(*Being\PosX_I, *Being\PosY_I, Attraktor\x, Attraktor\y) / 360
  a = *Being\PosX_I - Attraktor\x
  b = *Being\PosY_I - Attraktor\y
  inputs(1) = Sqr(a*a + b*b) / 1000
  
  gA_MakeDecision(*Being\KI_nr_I, 1) ; let the AI make a decision -> found in Outputs(...)
  ; if the sigmoid parameter = 1, the output values are between 0 and 1, otherwise they are unbounded float values
  ; make an interpretation of the output values - this can be whatever you want. In this case it defines the direction and the speed
  n = Outputs(1) * 3
  t = Outputs(0) * 8
  Select t
    Case 0 ; up
      *Being\PosY_I + n
    Case 1
      *Being\PosY_I + n
      *Being\PosX_I + n
    Case 2 ; right
      *Being\PosX_I + n
    Case 3
      *Being\PosY_I - n
      *Being\PosX_I + n
    Case 4 ; down
      *Being\PosY_I - n
    Case 5
      *Being\PosY_I - n
      *Being\PosX_I - n
    Case 6; left
      *Being\PosX_I - n
    Case 7
      *Being\PosY_I + n
      *Being\PosX_I - n
  EndSelect
  
EndProcedure
Procedure Task_End(Counter_I.i) ; is the job done?
  Back_I.i
  *AI.BasisgenAlg
  
  ; arrival at attraktor or max rounds
  If Counter_I > #MaxRounds
    Back_I = 1
    ; Else ; score also how fast it was there
    ; For x = 0 To #MaxKI
    ; If Being(x)\PosX_I = Attraktor\x And Being(x)\PosY_I = Attraktor\y
    ; If *AI\genAlgStrenght_I = 0
    ; *AI\genAlgStrenght_I = #MaxRounds - counter
    ; EndIf
    ; EndIf
    ; Next
  EndIf
  
  ; Scoring
  If Back_I = 1
    For x = 0 To #MaxKI
      *AI = Being(x)\KI_nr_I
      a = Being(x)\PosX_I - Attraktor\x
      b = Being(x)\PosY_I - Attraktor\y
      
      *AI\genAlgStrenght_I = #MaxRounds - Sqr(a * a + b * b) - Counter_I  ; this setting is important
      If *AI\genAlgStrenght_I < 0
        *AI\genAlgStrenght_I = 0
      EndIf
    Next
  EndIf
  
  ProcedureReturn Back_I
EndProcedure


gA_Prepare_genAlg(TestAI(0)) ; only needed once here, because the KI´s are all of the same type

For cycle = 0 To #TrainingGenerations ; the more complex the task, the more training cycles are needed (normally)
  ; each AI get its task to do
  Ende_I = 0
  
  Attraktor\x = Random(#ScreenWidth- 20) + 10
  Attraktor\y = Random(#ScreenHeight-20) + 10
  For x = 0 To #MaxKI
    Being(x)\PosX_I = Random(#ScreenWidth- 20) + 10
    Being(x)\PosY_I = Random(#ScreenHeight-20) + 10
    Being(x)\KI_nr_I = @TestAI(x) ; give each individual an AI
  Next
  Counter_I = 0
  
  Repeat ; how many rounds... after that, the KI´s are scored
    For x = 0 To #MaxKI
      
      DO_Test_Task(@Being(x)) ; the training is more or less simple.... here it does its job
      ;
      If Task_End(Counter_I) = 1
        Ende_I = 1
      EndIf
      
    Next
    
    If Counter_I % 10 = 0 ; draw only every 10th round to speed up the process
      ;{ display
      ClearScreen($000000)
      If StartDrawing(ScreenOutput())
          For x = 0 To #MaxKI
            Circle(Being(x)\PosX_I, Being(x)\PosY_I, 5, $C0C0C0)
          Next
          Circle(Attraktor\x, Attraktor\y, 5, $0000FF)
          DrawingMode(#PB_2DDrawing_Transparent)
          DrawText(5, 5, "Cyc: " + Str(cycle) + " - round" + Str(Counter_I), $00FFFF)
          
        StopDrawing()
      EndIf
      
      FlipBuffers()
      Delay(1)
      ;}
    EndIf
    
    Counter_I + 1
  Until Ende_I
  
  
  ; now the old KI´s are saved to disk (with their score)
  ; later on you can pick one (or more) of these KI´s to use in your program
  ; create the next generation of KI´s
  For x = 0 To #MaxKI ; save current ones
    gA_Save_genAlg(SavePath_S + Str(cycle) + "-" + Str(TestAI(x)\genAlgStrenght_I) + #genAlgExtension, @TestAI(x))
    TestAI(x)\genAlgStrenght_I = 0
  Next
  
  gA_Init_Reproduction(SavePath_S, cycle) ; prepare the parent data - it reads all the KI´s with a score of at least 1
  
  For x = 0 To #MaxKI ; and here it creates a new set of KI´s
    gA_Create_Child_genAlg(@TestAI(x), 0, 40, 2, 10, 0) ; play around with these settings for different evolution
    ; with this proc, there are some parameters that may be interesting to vary...
    ; gA_Create_Child_genAlg(pointer to the new AI, SelectionMode_I.i, Elitism_I.i, CrossoverMode_I.i, Crossoverrate_F.f, MutationRate_F.f)
    ; SelectionMode_I: 0=random; 1=roulette wheel (the higher the score, the greater the chance to be a parent of the new AI)
    ; Elitism_I (0-100) - not all KI´s are used for the roulette wheel, only the Elitism% best scored
    ; CrossoverMode_I (the way of mixing the genomes of both parents): 1=one cut  2=two cuts  3=multi cuts  4=scramble (a better chance of really different values)
    ; Crossoverrate_F = percentage chance of doing a crossover (20-40% is a good value most of the time)
    ; MutationRate_F = the chance of random mutation in the genes (0-100). A good value is around 0.05
  Next
  
  gA_End_Reproduction() ; release it
  
Next

End 
Last edited by Hurga on Tue Mar 17, 2009 12:53 pm, edited 1 time in total.
LCD
Enthusiast
Posts: 206
Joined: Sun Jun 01, 2003 10:55 pm
Location: Austria, Vienna
Contact:

Post by LCD »

OT: KI is German (Künstliche Intelligenz); in English it should be AI (Artificial Intelligence).
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

Yep, you're right.

Changed it in the post above. Thx
Kelebrindae
Enthusiast
Posts: 151
Joined: Tue Apr 01, 2008 3:23 pm

Post by Kelebrindae »

@Hurga:
That's brilliant! Very interesting piece of code you've got here, Hurga. :D

I've encountered a little bug while fiddling with the "gA_Create_Child_genAlg" parameters in your sample code: I get a "null pointer" error at line 482 of the ".pbi" when I set the first parameter (selection mode) to 1 (= "roulette wheel"); this happens right at the end of the first cycle.

I don't know much about AI, but this first parameter makes me wonder: what's the point of a genetic algorithm without a "roulette wheel" mechanism? I thought that giving the "fittest" instances a better chance of producing a child was precisely the core of a genetic algorithm. :?
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

@Kelebrindae
Thx for the flowers ;-)

Roulette wheel selection is one method of finding two parents, but it's not always the best way.
As an example... suppose you start with 100 AIs and 99 of them fail -> 0 points.
If you have one AI which scores really well, let's say 40 (out of 100 possible), and this AI is now the "father" of all the others, there is no (or only a very small) chance to reach a much better score. That's a so-called local maximum, but not the overall maximum. (English is not my native language, so the concept is a bit hard to explain.) Hope you understand.

I'll go and check the bug.
Thx so far
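The point can be illustrated with a minimal roulette-wheel sketch (Python; this is not the include's implementation). Zero-score genomes occupy no space on the wheel, so every spin lands on the single survivor and the whole next generation descends from one parent:

```python
def roulette_select(scores, r):
    # Fitness-proportional selection: r is a uniform draw in [0, sum(scores)).
    cumulative = 0
    for i, s in enumerate(scores):
        cumulative += s
        if r < cumulative:
            return i
    return len(scores) - 1

scores = [0] * 99 + [40]  # 99 failed AIs, one with score 40
# Any draw r in [0, 40) returns index 99 - the lone scorer always wins.
```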

//EDIT:
I changed the code to prevent such "pointer 0" problems. Uploaded it again (see first post).
It should work (better) now.
Kelebrindae
Enthusiast
Posts: 151
Joined: Tue Apr 01, 2008 3:23 pm

Post by Kelebrindae »

Yes, I think I understand. (And don't worry about your English: it's not my native language either, so I'm nowhere near perfect in this area and I'm not overly disturbed by distorted grammar - most of the time, I can't even tell the difference... :wink: )

Thanks for the bug fix. I'm going to try to whip up a little sample or two, just to see if I understand how your code works and what I can make of it.


Speaking of genetic algorithms: do you think your code could manage this kind of task (or a simplified version of it)?
http://rogeralsing.com/2008/12/07/genet ... mona-lisa/
Hurga
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

Yes, I think I understand. (and don't worry about your english: it's not my native language either, so I'm nowhere near perfect in this area and I'm not overly disturbed by distorted grammar - most of the time, I can't even tell the difference... :wink: ).
Hu, that makes my heart lighter 8)

Did you see his
http://rogeralsing.com/2008/12/09/genet ... -lisa-faq/
and
http://rogeralsing.com/2008/12/11/genet ... -binaries/ (here you can find his source - maybe it's wise to study it)?

@ your question:
Hm, theoretically it may work...
practically... maybe this way.

A bit more practically...
Set the input to 0.5 or 1
No hidden layer
350 outputs without sigmoid smoothing
Every 2 outputs are treated as a coordinate; 3 coordinates are used for a polygon
and the 7th output is used as the color


Scoring:
He used a pixel-wise comparison to calculate the fitness score. Should be no problem in PB.

Maybe you shouldn't use the child creation; instead take the best fitting AI, copy it twice and mutate it (like he did).
But it should also work with, let's say, 10 AIs and the child creation.
It took him about 1,000,000 cycles, so the time for reading in the old AIs may increase dramatically. So it's better not to save the AIs: take the best one, multiply it x times, mutate the copies a little and let them score...
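The "take the best one, copy it and mutate the copies" strategy is essentially a (1+lambda) hill climber. A minimal sketch in Python (the fitness function, mutation parameters and names here are placeholders, not part of the include):

```python
import random

def hill_climb(fitness, genome, rate=0.05, offspring=4, steps=100, seed=1):
    # Clone the current best, mutate each clone slightly, keep the best
    # of parent + children; no crossover and nothing written to disk.
    rng = random.Random(seed)
    best, best_fit = genome, fitness(genome)
    for _ in range(steps):
        for _ in range(offspring):
            child = [g + rng.gauss(0, 0.1) if rng.random() < rate else g
                     for g in best]
            f = fitness(child)
            if f > best_fit:
                best, best_fit = child, f
    return best, best_fit
```

Because only improvements are kept, the fitness never decreases, which is why this kind of approach can grind on for a million cycles and still converge.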

I'm eager to see your attempt...
Kelebrindae
Enthusiast
Posts: 151
Joined: Tue Apr 01, 2008 3:23 pm

Post by Kelebrindae »

@Hurga
Hi,

Here's a first test (and maybe it's a bit too complicated for a first test, especially considering my abilities... :roll: ).

It's a simplified version of the "Mona Lisa with transparent polys" (cf. above).
The task is to reproduce a 150x150 black and white image using a set of 50 boxes drawn with the "#PB_2DDrawing_XOr" setting; it allows fast drawing and fast image comparison.

The results, I must say, are not very impressive: I've probably used your procedures incorrectly somewhere... :?

On the other hand, I use a relatively small number of generations (100-200), so maybe it's normal after all... The reason I use so few generations is that the program becomes increasingly slow because of the writing/reading of the AIs to disk.

Two questions, then:
- Could you please take a look at the code and tell me if I'm using your procedures correctly?
- Do you think I could modify the "gA_Load_genAlg" and "gA_Save_genAlg" procs to read/write AIs to memory (instead of using the disk) to speed up the process?
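Regarding the second question: conceptually, the disk round-trip only exists to hand scored genomes from one generation to the next, so an in-memory pool can replace it. A hypothetical sketch in Python (the real gA_Save_genAlg / gA_Init_Reproduction signatures differ; names here are mine):

```python
parent_pool = []  # (score, weights) pairs kept in memory between generations

def save_genome(score, weights):
    # Stands in for gA_Save_genAlg: copy the weights so later changes to
    # the live population cannot alter the stored parent.
    if score >= 1:  # the include also skips zero scorers
        parent_pool.append((score, list(weights)))

def load_parents():
    # Stands in for gA_Init_Reproduction: parents sorted best-first.
    return sorted(parent_pool, key=lambda p: p[0], reverse=True)
```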

Here's my code:

Code: Select all

; GUI constants
#Principale = 1
Enumeration
  #IM_ref
  #IM_preview
  #FL_similarity
  #BT_close
EndEnumeration

; Reference and generated images
#REFIMAGENUM = 0
#GENIMAGENUM = 1

; Image size constants
#SIZE = 150
#SIZE2 = #SIZE*#SIZE


;- Fonts
Global FontID1,FontID2
FontID1 = LoadFont(1, "Arial", 11)

; Pointers for image comparison
Global *refImage,*newImage

Structure myBITMAPINFO
  bmiHeader.BITMAPINFOHEADER
  bmiColors.RGBQUAD[1]
EndStructure


; create the basic structure of a AI
; create Genoms
#MaxAI = 10 ; number of AI´s
#TrainingGenerations = 100
#NBPOLY = 50
#NBOUTPUTS = #NBPOLY*4

IncludeFile "GeneticAlgorithm.pbi"
SavePath_S.s = "" ; it's important that the path exists (can be any path...)

Global Dim TestAI.BasisgenAlg(#MaxAI)

; to display an AI outputs
Structure drawing_struct
  poly.i[#NBPOLY*4]
  *AI_nr_I.BasisgenAlg
  score.i
EndStructure
Global Dim being.drawing_struct(#MaxAI)


;- ----- Procedures ------
Procedure Open_Window()
  
  LoadImage(#REFIMAGENUM,"refimage.bmp")
  If ImageWidth(#REFIMAGENUM) <> #SIZE Or ImageHeight(#REFIMAGENUM) <> #SIZE
    MessageRequester("Error", "The reference image must be a black and white " + Str(#SIZE) + "x" + Str(#SIZE) + " bmp!")
    End
  EndIf
  
  CreateImage(#GENIMAGENUM,#SIZE,#SIZE,16)
   
  If OpenWindow(#Principale, 220, 0, #SIZE*2+100, #SIZE+80, "Test XorDraw",  #PB_Window_SystemMenu | #PB_Window_MinimizeGadget | #PB_Window_TitleBar)
    
      ButtonGadget(#BT_close, WindowWidth(#Principale)/2-50, WindowHeight(#Principale)-40, 100, 30, "Close", #PB_Button_Default)
      SetGadgetFont(#BT_close, FontID1)
      
      ImageGadget(#IM_ref,20,10,#SIZE,#SIZE,ImageID(#REFIMAGENUM))
      ImageGadget(#IM_preview,70 + #SIZE,10,#SIZE,#SIZE,ImageID(#GENIMAGENUM))
      
      StringGadget(#FL_similarity, GadgetX(#IM_preview), GadgetY(#IM_preview) + GadgetHeight(#IM_preview) + 2, #SIZE, 25, "", #PB_String_ReadOnly | #PB_String_BorderLess)
    
  EndIf

EndProcedure

Procedure DrawImg(*being.drawing_struct,numimage)
  Protected offset.i,i.i

  StartDrawing(ImageOutput(numimage))    
    DrawingMode(#PB_2DDrawing_Default)
    Box(0,0,#SIZE,#SIZE,0)
    DrawingMode(#PB_2DDrawing_XOr)
    For i=1 To #NBPOLY
      Box(*being\poly[offset],*being\poly[offset+1],*being\poly[offset+2],*being\poly[offset+3],$FFFFFF)
      offset + 4
    Next i     
    DrawingMode(#PB_2DDrawing_Default)
  StopDrawing()
  
EndProcedure

; Returns the number of identical pixels between the reference image and the generated image => the higher, the better
; NB: with B&W images, each pixel can take only 2 values => a comparison between the reference image and a purely random
;     one tends to return a 50% similarity (which is #SIZE2/2 pixels). 100% means the images are identical, 0% means image2 is a
;     negative version of the reference image.
Procedure.f CompareImg(refimage.i,imageNum.i)
  Protected i.i,j.i,dif.i
  Protected hDC.i,hBmp.i,picl_X.i,picl_Y.i,picl_D.i
  Protected bmi.myBITMAPINFO
  Protected *newpixel.LONG,*refpixel.LONG

  ; When called for the first time, the procedure stores the reference image in memory
  If *refIMage=0
    hDC=StartDrawing(ImageOutput(refimage))
    hBmp = ImageID(refimage)
    picl_X = ImageWidth(refimage)
    picl_Y = ImageHeight(refimage)
    picl_D = ImageDepth(refimage)  

    ; allocate memory for reference image and new image    
    *refImage = AllocateMemory(picl_X*picl_Y*4)
    *newImage = AllocateMemory(picl_X*picl_Y*4)
  
    bmi.myBITMAPINFO
    bmi\bmiHeader\biSize   = SizeOf(BITMAPINFOHEADER)
    bmi\bmiHeader\biWidth  = picl_X
    bmi\bmiHeader\biHeight = picl_Y
    bmi\bmiHeader\biPlanes = 1
    bmi\bmiHeader\biBitCount = 32
    bmi\bmiHeader\biCompression = #BI_RGB
    GetDIBits_(hDC,hBmp,1,picl_Y,*refImage,bmi,#DIB_RGB_COLORS)
    StopDrawing()
  EndIf

  ; Converts new image to a memory block, for faster comparison
  hDC=StartDrawing(ImageOutput(imageNum))
  hBmp = ImageID(imageNum)
  picl_X = ImageWidth(imageNum)
  picl_Y = ImageHeight(imageNum)
  picl_D = ImageDepth(imageNum)  
  
  bmi.myBITMAPINFO
  bmi\bmiHeader\biSize   = SizeOf(BITMAPINFOHEADER)
  bmi\bmiHeader\biWidth  = picl_X
  bmi\bmiHeader\biHeight = picl_Y
  bmi\bmiHeader\biPlanes = 1
  bmi\bmiHeader\biBitCount = 32
  bmi\bmiHeader\biCompression = #BI_RGB
  GetDIBits_(hDC,hBmp,1,picl_Y,*newImage,bmi,#DIB_RGB_COLORS)
  StopDrawing()
  
  ; Count every pixel which is different from reference image
  *newpixel = *newImage
  *refpixel = *refImage
  picl_X-1:picl_Y-1
  For i=0 To picl_X
    For j=0 To picl_Y
      If *newpixel\l <> *refpixel\l
        dif+1
      EndIf

      *newpixel+4
      *refpixel+4
    Next j
  Next i
  
  ; Return this number
  ProcedureReturn #SIZE2 - dif
 
EndProcedure


Procedure DO_Test_Task(*Being.drawing_struct) ; let the AI do their job
  Protected i.i
  Protected *genAlg.BasisgenAlg
 
  ; prepare the input - here you give your AI the inputs
  ; In this case, I don't know what to give as input
 
  gA_MakeDecision(*Being\AI_nr_I, 0) ; let the AI make a decision -> found in Outputs(...)
  ; if the sigmoid parameter = 1, the output values are between 0 and 1, otherwise they are unbounded float values
  ; make an interpretation of the output values - this can be whatever you want.
  ; In this case, each set of 4 outputs defines a box => 200 outputs = 50 boxes

  For i=0 To #NBOUTPUTS-1
    *Being\poly[i]=outputs(i)
  Next i
 
EndProcedure

Procedure Task_End() ; is the job done?
  Protected bestScore.i,bestAi.i
  Protected *AI.BasisgenAlg
 
  ; Scoring: a higher score means a more similar image
  For x = 0 To #MaxAI
    *AI = Being(x)\AI_nr_I
    drawImg(@Being(x),#GENIMAGENUM)
   
    ; a random image tends to return a 50% similarity (cf. comments in CompareImg)
    ; => any score below 50% is set to 0
    Being(x)\score = compareImg(#REFIMAGENUM,#GENIMAGENUM) - #SIZE2/2    
    If Being(x)\score < 0
      Being(x)\score = 0
    EndIf
    *AI\genAlgStrenght_I =  Being(x)\score 
    
    If Being(x)\score > bestScore
      bestScore = Being(x)\score
      bestAi = x
    EndIf
  Next
  
  ProcedureReturn bestAi
EndProcedure

;- ----- Main Block ------
;- initialization
Open_Window()

;- AI initialization
For x = 0 To #MaxAI
  ; Debug @TestAI(x)
  gA_New_genAlg(@TestAI(x), 0, 1, #NBOUTPUTS, 0, 0) ; play around with these settings for different AI´s
  ; gA_New_genAlg(Pointer to the new AI, genAlgArt_I.i, Inputs_I.i, Outputs_I.i, HiddenLayer_I.i, HiddenLayerNeurons_I.i)
  ; genAlgArt_I = can be used to store a value
  ; Inputs_I = number of inputs for the AI
  ; Outputs_I = how many output values the AI gives
  ; HiddenLayer_I = how many layers are between input and output
  ; HiddenLayerNeurons_I = the number of "neurons" in each hidden layer - for simple tasks, no hidden layer is needed. Most of the time, more than 1 hidden layer
  ; makes it a lot more difficult to train an AI
 
  gA_RandomizeWeights(@TestAI(x))
Next

;- Prepare AIs
gA_Prepare_genAlg(TestAI(0)) ; only needed once here, because the AI´s are all of the same type

;- main loop
Repeat
  EventID = WindowEvent() 
  
  Select EventID
    Case 0
      If cycle < #TrainingGenerations
        ;  give each individual an AI
        For x = 0 To #MaxAI
          Being(x)\AI_nr_I = @TestAI(x)
        Next x
        Counter_I = 0
       
        ; do the job
        For x = 0 To #MaxAI
          DO_Test_Task(@Being(x))
        Next x 
       
        ; Evaluate the AIs and show the best one's results
        theBestOne = Task_End()
        drawImg(@Being(theBestOne),#GENIMAGENUM)
        SetGadgetState(#IM_preview,ImageID(#GENIMAGENUM))
        SetGadgetText(#FL_similarity, "Cycle "+Str(cycle)+", Similarity = " + StrF(50 + (Being(theBestOne)\score / #SIZE2)*100,2) + "%")
        EventWindow()
        Delay(1)
       
        ; now the old AI´s are saved to disk (with their score)
        ; later on you can pick one (or more) of these AI´s to use in your program
        ; create the next generation of AI´s
        For x = 0 To #MaxAI ; save current ones
          gA_Save_genAlg(SavePath_S + Str(cycle) + "-" + Str(TestAI(x)\genAlgStrenght_I) + #genAlgExtension, @TestAI(x))
          TestAI(x)\genAlgStrenght_I = 0
        Next x
       
        gA_Init_Reproduction(SavePath_S, cycle) ; prepare the parent-data - it reads all the AI´s with a score of at least 1
       
        For x = 0 To #MaxAI ; and here it creates a new set of AI´s
          gA_Create_Child_genAlg(@TestAI(x), 0, 20, 2, 20, 0.05) ; play around with these settings for different evolution
          ; with this proc, there are some parameters that may be interesting to vary...
          ; gA_Create_Child_genAlg(pointer to the new AI, SelectionMode_I.i, Elitism_I.i, CrossoverMode_I.i, Crossoverrate_F.f, MutationRate_F.f)
          ; SelectionMode_I: 0=random; 1=roulette wheel (the higher the score, the greater the chance to be a parent of the new AI)
          ; Elitism_I (0-100) - not all AI´s are used for the roulette wheel, only the Elitism% best scored
          ; CrossoverMode_I (the way of mixing the genomes of both parents): 1=one cut  2=two cuts  3=multi cuts  4=scramble (a better chance of really different values)
          ; Crossoverrate_F = percentage chance of doing a crossover (20-40% is a good value most of the time)
          ; MutationRate_F = the chance of random mutation in the genes (0-100). A good value is around 0.05
        Next x
       
        gA_End_Reproduction() ; release it
        cycle+1
        Delay(1)
      EndIf
  
    Case #PB_Event_CloseWindow
      quit = #True
      
    Case #PB_Event_Gadget
      numClickedGadget = EventGadget()
      Select numClickedGadget 
        Case #BT_close
          quit = #True
      EndSelect

  EndSelect

Until quit = #True
and the version of your include I've used:

Code: Select all

;Info: AI decision and training
; Include for handling of genetic algorithms
; (c) 2009 by Stefan Mahr (aka Dostej)

; These procs allow the creation and training of AIs as genes
; needs the Math.pbi include

; these constants may be set as needed
#genAlgExtension = ".genAlg"
#FileKI = 1

; structures
Structure genAlgSelect
  Path_S.s
  Max_I.i
  Average_I.i
  MaxKumm_I.i
  AverageKumm_I.i
EndStructure
Structure BasisgenAlg
  genAlgArt_I.i ; can be used to mark different types of gA´s
  genAlgStrenght_I.i ; the fitness score of the genAlg
  Inputs_I.i ; how many inputs this genAlg has
  Outputs_I.i ; how many outputs this genAlg has
  HiddenLayer_I.i ; how many hidden layers; max 3
  HiddenLayerNeurons_I.i ; how many neurons each hidden layer should have
  ; the pointers to the memory where the weights are stored
  HiddenLayerWeight_P.i[3]
  OutputWeight_P.i ; if 0, the procs assume that no mem is reserved yet
  
EndStructure
Global NewList genAlg.BasisgenAlg() ; all genAlg genes are stored here

; these are the basic arrays for the genetic algorithms (genAlg´s); they are redimmed while running
Global Dim Inputs.f(1) ; it´s easier if they are between -1 and 1
Global Dim Outputs.f(1)
Global Dim HiddenLayer.f(2, 1)
Global NewList genAlgListAll.genAlgSelect() ; stores the genoms from this family
Global Kummulated_GI.i ; value for the roulette wheel selection


;/ genAlg CORE
Procedure.f Sigmoid(f.f) ; returns a value between 0 and 1
  ProcedureReturn 1.0/(1.0 + Pow(2.71828, -f)) ; normally -f/p ; if p > 1 , the curve will be flatter
EndProcedure
Procedure.i AbsL(Value_L.i) ; returns the absolute value
  If Value_L < 0
    Value_L = -Value_L
  EndIf
  ProcedureReturn Value_L
EndProcedure
Procedure.i GetAngleFromPointsGrad(x1.i,y1.i,x2.i,y2.i) ; returns the angle (0-360)
  If x2 = x1 ; avoid a division by zero for vertical lines
    If y2 > y1
      ProcedureReturn 90
    Else
      ProcedureReturn 270
    EndIf
  EndIf
  w.i = ATan((y2 - y1) * 1.0 / (x2 - x1)) * 57.295776 ; force a float division - an integer division would quantize the angle
  If x2 < x1
    w = 180 + w
  EndIf
  If w < 0 : w + 360 : EndIf
  If w > 360 : w - 360 : EndIf
  
  ProcedureReturn w 
EndProcedure



; Create, load, save, delete genAlg
Procedure gA_AllocMem(*genAlg.BasisgenAlg)
  ; alloc the mem
  If *genAlg\HiddenLayer_I = 0 ; without hidden layer
    *genAlg\OutputWeight_P = AllocateMemory(*genAlg\Outputs_I * (*genAlg\Inputs_I+1) << 2)
    
  Else ;with Hidden layer
    *genAlg\OutputWeight_P = AllocateMemory(*genAlg\Outputs_I * (*genAlg\HiddenLayerNeurons_I+1) << 2)
    ; first hidden layer
    *genAlg\HiddenLayerWeight_P[0] = AllocateMemory((*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\Inputs_I+1) << 2)
    For z = 1 To *genAlg\HiddenLayer_I-1 ; other hidden layers
      *genAlg\HiddenLayerWeight_P[z] = AllocateMemory((*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\HiddenLayerNeurons_I+1) << 2)
    Next 
  EndIf
  
EndProcedure
Procedure gA_New_genAlg(*genAlg.BasisgenAlg, genAlgArt_I.i, Inputs_I.i, Outputs_I.i, HiddenLayer_I.i, HiddenLayerNeurons_I.i) ; reserves the mem for a new genAlg gens and randomize the genes
  
  *genAlg\genAlgArt_I = genAlgArt_I
  *genAlg\HiddenLayer_I = HiddenLayer_I
  *genAlg\HiddenLayerNeurons_I = HiddenLayerNeurons_I
  *genAlg\genAlgStrenght_I = 0
  *genAlg\Inputs_I = Inputs_I
  *genAlg\Outputs_I = Outputs_I
  
  gA_AllocMem(*genAlg)
  
EndProcedure

Procedure gA_Save_genAlg(File_S.s, *genAlg.BasisgenAlg) ; save the current genAlg
  If CreateFile(#FileKI, File_S + #genAlgExtension)
    WriteData(#FileKI, *genAlg, SizeOf(BasisgenAlg)) ; write the structure
    
    ; write the Weights
    If *genAlg\HiddenLayer_I > 0
      WriteData(#FileKI, *genAlg\OutputWeight_P, *genAlg\Outputs_I * (*genAlg\HiddenLayerNeurons_I+1) << 2)
      WriteData(#FileKI, *genAlg\HiddenLayerWeight_P[0], (*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\Inputs_I+1) << 2)
      ; Debug \HiddenLayer_I
      For x = 1 To *genAlg\HiddenLayer_I-1
        WriteData(#FileKI, *genAlg\HiddenLayerWeight_P[x], (*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\HiddenLayerNeurons_I+1) << 2) ; for each neuron a weight 
      Next
      
    Else
      WriteData(#FileKI, *genAlg\OutputWeight_P, *genAlg\Outputs_I * (*genAlg\Inputs_I+1) << 2)
      
    EndIf 
    
    CloseFile(#FileKI)
  EndIf
EndProcedure
Procedure gA_Load_genAlg(File_S.s, *genAlg.BasisgenAlg) ; loads a genAlg from File and returns 1 if ok
  Back_I.i = 0
  OK_I.i = 0
  check.BasisgenAlg
  
  If ReadFile(#FileKI, File_S)
    ReadData(#FileKI, @check, SizeOf(BasisgenAlg)) ; read the structure data
    
    ; check whether the memory of *genAlg is already reserved
    If *genAlg\OutputWeight_P = 0 ; not initialized -> do it
      gA_New_genAlg(*genAlg, check\genAlgArt_I, check\Inputs_I, check\Outputs_I, check\HiddenLayer_I, check\HiddenLayerNeurons_I)
      OK_I = 1
    Else ; check if compatible -> not 100% secure; a memory-size test would be better - but this is simpler  :-)
      If *genAlg\Inputs_I = check\Inputs_I And *genAlg\Outputs_I = check\Outputs_I
        If *genAlg\HiddenLayer_I = check\HiddenLayer_I And *genAlg\HiddenLayerNeurons_I = check\HiddenLayerNeurons_I And *genAlg\genAlgArt_I = check\genAlgArt_I
          OK_I = 1
        EndIf 
      EndIf 
    EndIf
    
    If OK_I 
      If *genAlg\HiddenLayer_I > 0
        ReadData(#FileKI, *genAlg\OutputWeight_P, *genAlg\Outputs_I * (*genAlg\HiddenLayerNeurons_I+1) << 2)
        ReadData(#FileKI, *genAlg\HiddenLayerWeight_P[0], (*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\Inputs_I+1) << 2)
        For x = 1 To *genAlg\HiddenLayer_I-1
          ReadData(#FileKI, *genAlg\HiddenLayerWeight_P[x], (*genAlg\HiddenLayerNeurons_I+1) * (*genAlg\HiddenLayerNeurons_I+1) << 2) ; for each neuron a weight 
        Next 
      Else
        ReadData(#FileKI, *genAlg\OutputWeight_P, *genAlg\Outputs_I * (*genAlg\Inputs_I+1) << 2)
      EndIf 
      Back_I = 1
    EndIf
    CloseFile(#FileKI) ; close the file even if the stored genAlg turned out to be incompatible
  Else
    CallDebugger
  EndIf
  
  ProcedureReturn Back_I
EndProcedure
Procedure gA_Free_genAlgMem(*genAlg.BasisgenAlg) ; release the mem of the current genAlg
  
  For x = 0 To 2
    If *genAlg\HiddenLayerWeight_P[x]
      FreeMemory(*genAlg\HiddenLayerWeight_P[x])
    EndIf
  Next
  If *genAlg\OutputWeight_P
    FreeMemory(*genAlg\OutputWeight_P)
  EndIf
  
EndProcedure


; Decision
Procedure gA_Prepare_genAlg(*genAlg.BasisgenAlg) ; set the dimension for the NN-Arrays
  Dim Inputs.f(*genAlg\Inputs_I) ; the last one is the bias
  Inputs(*genAlg\Inputs_I) = -1 ; set the bias
  
  If *genAlg\HiddenLayer_I > 0
    Dim HiddenLayer.f(*genAlg\HiddenLayer_I-1, *genAlg\HiddenLayerNeurons_I) ; the last one is the bias
    For x = 0 To *genAlg\HiddenLayer_I-1
      HiddenLayer(x, *genAlg\HiddenLayerNeurons_I) = -1 ; set the bias
    Next
  EndIf
  Dim Outputs.f(*genAlg\Outputs_I)
  
EndProcedure
Procedure gA_MakeDecision(*genAlg.BasisgenAlg, Sigmoid_I.i) ; lets the current KI make its decision; if Sigmoid_I = 1, the outputs are between 0 and 1
  *f.Float
  f.f
  
  If *genAlg\HiddenLayer_I > 0
    *f = *genAlg\HiddenLayerWeight_P[0]
    For x = 0 To *genAlg\HiddenLayerNeurons_I  ; calc Hidden Layer 0 from the input values
      f = 0
      For y = 0 To *genAlg\Inputs_I
        f = f + Inputs(y) * *f\f
        *f + 4
      Next
      HiddenLayer(0, x) = f
    Next
    
    ; calc the additional Hidden Layers
    For x = 1 To *genAlg\HiddenLayer_I-1
      *f = *genAlg\HiddenLayerWeight_P[x]
      For y = 0 To *genAlg\HiddenLayerNeurons_I ; calc the neuron 
        f = 0
        For z = 0 To *genAlg\HiddenLayerNeurons_I ; get the Neurons of the HL under it
          f = f + HiddenLayer(x-1, z) * *f\f
          *f + 4
        Next
        HiddenLayer(x, y) = f
      Next
    Next
    
    ; calc the Output
    *f = *genAlg\OutputWeight_P
    For x = 0 To *genAlg\Outputs_I-1
      f = 0
      For y = 0 To *genAlg\HiddenLayerNeurons_I ; calc the neuron from the last hidden layer
        f = f + HiddenLayer(*genAlg\HiddenLayer_I-1, y) * *f\f
        *f + 4
      Next
      If Sigmoid_I
        Outputs(x) = Sigmoid(f)
      Else
        Outputs(x) = f * 100
      EndIf
    Next
    
  Else ; no hidden layer -> calc the output from the inputs
    ; calc the Output
    *f = *genAlg\OutputWeight_P
    For x = 0 To *genAlg\Outputs_I-1
      f = 0
      For y = 0 To *genAlg\Inputs_I ; calc the neuron directly from the inputs
        f = f + Inputs(y) * *f\f
        *f + 4
      Next
      If Sigmoid_I
        Outputs(x) = Sigmoid(f)
      Else
        Outputs(x) = f * 100
      EndIf
    Next 
  EndIf
  
EndProcedure
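; Usage sketch (hedged - myKI and the values below are assumptions, not part
; of the include; the arrays Inputs(), HiddenLayer() and Outputs() must be
; visible to gA_MakeDecision, e.g. declared Global):
;   myKI.BasisgenAlg
;   myKI\Inputs_I = 2 : myKI\Outputs_I = 1
;   myKI\HiddenLayer_I = 1 : myKI\HiddenLayerNeurons_I = 4
;   gA_AllocMem(@myKI)           ; reserve the weight memory
;   gA_RandomizeWeights(@myKI)   ; start with random weights
;   gA_Prepare_genAlg(@myKI)     ; dim the arrays and set the bias inputs
;   Inputs(0) = 0.5 : Inputs(1) = -0.2
;   gA_MakeDecision(@myKI, 1)    ; with sigmoid the Outputs() are between -1 and 1
;   Debug Outputs(0)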

; Generativity
Procedure.i gA_Init_Reproduction(Path_S.s, Epoche_I.i) ; read all the previous genoms into LIST genAlgListAll(), returns <> 0 if ok
  Back_I.i
  Kumm_average_I.i
  s.s
  
  ; get the list of all genAlgs with their score and accumulate the scores
  s = Right(#genAlgExtension, Len(#genAlgExtension)-1)
  
  For x = 0 To Epoche_I
    If ExamineDirectory(0, Path_S, "*.*")
      While NextDirectoryEntry(0)
        If DirectoryEntryType(0) = #PB_DirectoryEntry_File
          If GetExtensionPart(DirectoryEntryName(0)) = s
            ; If StringField(DirectoryEntryName(0), 1, "-") = *Mutation\Name_S
            If Val(StringField(DirectoryEntryName(0), 1, "-")) > 0
              AddElement(genAlgListAll())
              genAlgListAll()\Path_S = Path_S + DirectoryEntryName(0)
              genAlgListAll()\Average_I = Val(StringField(DirectoryEntryName(0), 1, "-")) ; the score of the genAlg
              Kumm_average_I + genAlgListAll()\Average_I
              genAlgListAll()\AverageKumm_I = Kumm_average_I
            EndIf
            ; EndIf
          EndIf
        EndIf
      Wend
      FinishDirectory(0)
    EndIf
  Next
  
  If ListSize(genAlgListAll()) > 0
    Back_I = 1
    SortStructuredList(genAlgListAll(), #PB_Sort_Descending, OffsetOf(genAlgSelect\Average_I), #PB_Sort_Integer)
    Kummulated_GI = Kumm_average_I
  EndIf
  
  
  ProcedureReturn Back_I
EndProcedure
Procedure gA_End_Reproduction() ; releases the data used
  ClearList(genAlgListAll())
  
EndProcedure

Procedure.i gA_CalcSimilarity(*genAlg.BasisgenAlg, *genAlg2.BasisgenAlg) ; returns the degree of similarity between these two genoms in %
  Back_I.i
  *f.Float
  *f2.Float
  
  ; calc the difference
  For x = 0 To 2
    If *genAlg\HiddenLayerWeight_P[x]
      c = MemorySize(*genAlg\HiddenLayerWeight_P[x])
      d + c
      t = c>>2 - 1
      For y = 0 To t
        *f = *genAlg\HiddenLayerWeight_P[x] + y << 2
        *f2 = *genAlg2\HiddenLayerWeight_P[x] + y <<2
        If Abs(*f\f - *f2\f) < 0.15
          Back_I + 1
        EndIf
      Next
    EndIf
  Next
  
  c = MemorySize(*genAlg\OutputWeight_P)
  d + c
  t = c>>2 - 1
  For y = 0 To t
    *f = *genAlg\OutputWeight_P + y <<2
    *f2 = *genAlg2\OutputWeight_P + y <<2
    If Abs(*f\f - *f2\f) < 0.15
      Back_I + 1
    EndIf
  Next
  
  ; calc the percentage of similar weights
  d >> 2 ; how many floats are in this genom
  Back_I = (Back_I * 100) / d
  
  ProcedureReturn Back_I
EndProcedure
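; Sketch of the measure (hedged example values): two genoms with 200 float
; weights, of which 150 pairs differ by less than 0.15, give a similarity of
; (150 * 100) / 200 = 75 %. A possible use when breeding:
;   If gA_CalcSimilarity(@myKI1, @myKI2) > 90 ; myKI1/myKI2 assumed loaded
;     ; discard one of the two near-identical genoms to keep diversity
;   EndIf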

Procedure.i gA_RandomizeWeights(*genAlg.BasisgenAlg) ; set all weights of the given KI to randomized values (-1 to 1) ; returns 1 if ok
  Back_I.i = 1
  Iteration_I.i
  *f.Float
  f.f
  
  ; CallDebugger
  ; Create the HiddenLayerWeights - randomized
  For z = 1 To *genAlg\HiddenLayer_I
    x = z-1
    If *genAlg\HiddenLayerWeight_P[x] ; if mem is reserved
      Iteration_I = MemorySize(*genAlg\HiddenLayerWeight_P[x])>>2 - 1
      *f = *genAlg\HiddenLayerWeight_P[x]
      For y = 0 To Iteration_I
        f = (Random(32767) - 16383) / 16383.0 ; create values between -1 and 1 (float division)
        *f\f = f ; assigning the expression directly didnt work - maybe a bug
        *f + 4
      Next
      
    Else
      Back_I = 0
      Break
      
    EndIf
  Next
  
  ; create the OutputWeights
  If *genAlg\OutputWeight_P
    Iteration_I = MemorySize(*genAlg\OutputWeight_P)>>2 - 1
    *f = *genAlg\OutputWeight_P
    For x = 0 To Iteration_I
      *f\f = (Random(32767) - 16383) / 16383.0 ; create values between -1 and 1
      *f + 4
    Next
  Else
    Back_I = 0
  EndIf
  
  ProcedureReturn Back_I.i
EndProcedure
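; The weight mapping in numbers: Random(32767) returns 0..32767, so
; (value - 16383) spans -16383..16384, and dividing by 16383 maps that to
; roughly -1.0 .. 1.0 (0 -> -1.0, 16383 -> 0.0, 32767 -> ~1.0).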
Procedure.i gA_RouletteWheelElement(List genAlgListAll.genAlgSelect(), Elitism_I.i) ; returns the pointer to the element corresponding to the cumulated value
  Back_I.i
  *i.Integer
  Kumm_average_I.i
  NewList TempGenAlg.i()
  
  If Elitism_I = 0 Or Elitism_I > 100; all have the chance of their score
    Value_I = Random(Kummulated_GI)
    
    ForEach genAlgListAll()
      If Value_I <= genAlgListAll()\AverageKumm_I And Value_I >= genAlgListAll()\AverageKumm_I - genAlgListAll()\Average_I
        Back_I = @genAlgListAll()
        Break
      EndIf
    Next
    
  Else ; only the best Elitism_I % has a chance
    n = (ListSize(genAlgListAll()) * Elitism_I) / 100 - 1
    
    FirstElement(genAlgListAll()) ; create the elitism list
    For x = 0 To n
      AddElement(TempGenAlg())
      TempGenAlg() = genAlgListAll()\Average_I
      Kumm_average_I + genAlgListAll()\Average_I
      NextElement(genAlgListAll())
    Next
    
    Value_I = Random(Kumm_average_I) ; what element
    
    x = 0
    ForEach TempGenAlg() ; search element
      If Value_I >= x And Value_I <= x + TempGenAlg()
        SelectElement(genAlgListAll(), ListIndex(TempGenAlg()))
        Back_I = @genAlgListAll()
        Break
      EndIf
      x + TempGenAlg()
    Next
  EndIf
  
  ProcedureReturn Back_I
EndProcedure
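; Roulette-wheel sketch (hedged example values): three genoms with the
; scores 10, 30 and 60 cumulate to 10, 40 and 100. A Random(100) of e.g. 25
; falls between 10 and 40, so the second genom is picked - each genom is
; selected with a probability proportional to its score.
;   *Parent.genAlgSelect = gA_RouletteWheelElement(genAlgListAll(), 20)
;   ; Elitism_I = 20 -> only the best 20 % of the sorted list take part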

Procedure gA_Crossover(*gA_Child.BasisgenAlg, *gA_Parent.BasisgenAlg, Offset_I.i, Size_I.i) ; copies Size_I bytes of genes from the parent into the child at Offset_I; -1 = random offset / copy to the end
  n.i = Offset_I
  b.i = Size_I
  
  For x = 1 To *gA_Parent\HiddenLayer_I ; hidden layers
    m = MemorySize(*gA_Parent\HiddenLayerWeight_P[x-1])
    If Offset_I = -1 ; random offset
      n = ((m - Size_I) * Random(100)) / 100
    EndIf
    If Size_I = -1
      b = m - n
    EndIf
    CopyMemory(*gA_Parent\HiddenLayerWeight_P[x-1] + n, *gA_Child\HiddenLayerWeight_P[x-1] + n, b)
  Next
  
  m = MemorySize(*gA_Parent\OutputWeight_P)
  If Offset_I = -1 ; random offset
    n = ((m - Size_I) * Random(100)) / 100
  EndIf
  If Size_I = -1
    b = m - n
  EndIf
  CopyMemory(*gA_Parent\OutputWeight_P + n, *gA_Child\OutputWeight_P + n, b)
  
EndProcedure
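; Crossover sketch (hedged - Child and Parent2 are assumed, allocated genoms;
; offsets and sizes are in bytes, one float weight = 4 bytes):
;   gA_Crossover(@Child, @Parent2, -1, -1) ; single point: random cut, rest from parent 2
;   gA_Crossover(@Child, @Parent2, 16, 8)  ; copy weights 4 and 5 from parent 2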
Procedure gA_Mutation(*Mem.i, MutationRate_F.f) ; mutates the genes: each 32-bit weight gets one random bit flipped with the probability MutationRate_F
  *l.Long
  
  If MutationRate_F > 0 ; if there is a Mutation at all
    n = MemorySize(*Mem)>>2 - 1
    For x = 0 To n ; check for each long
      If MutationRate_F * 10000 >= Random(10000)
        *l = *Mem + x<<2
        *l\l ! (1 << Random(31)) ; XOR one of the 32 bits (Random(31) gives 0..31)
      EndIf
    Next
  EndIf
EndProcedure
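; Mutation sketch: with MutationRate_F = 0.01 each 32-bit weight has a 1 %
; chance that one random bit of its float pattern is flipped - a mantissa
; flip nudges the weight slightly, a sign or exponent flip changes it
; drastically. Example (myKI assumed allocated):
;   gA_Mutation(myKI\OutputWeight_P, 0.01)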
Procedure gA_Create_Child_genAlg(*gA_Child.BasisgenAlg, SelectionMode_I.i, Elitism_I.i, CrossoverMode_I.i, Crossoverrate_F.f, MutationRate_F.f) ; creates a new child-genAlg 
  ; Crossover mode: 0=random  1=single point  2=double point  3=multi point  4=scramble
  ; Selection Mode: 0= random  1=roulettewheel
  ; Elitism: 0=all genoms  100% = only best scored genom used for selection
  ; MutationRate_F  between 0 and 1
  ; Crossoverrate_F  between 0 and 1
  
  Parent1.BasisgenAlg ; var of the parent genom
  Parent2.BasisgenAlg ; var of the parent genom
  *genAlgList.genAlgSelect
  OK_I.i = 1
  *l.Long
  
  ; alloc the mem for the genoms
  CopyMemory(*gA_Child, @Parent1, SizeOf(Integer) * 6) ; transfer the basic settings
  CopyMemory(*gA_Child, @Parent2, SizeOf(Integer) * 6) 
  gA_AllocMem(@Parent1) 
  gA_AllocMem(@Parent2)
  
  n = ListSize(genAlgListAll())
  If n > 0 ; only if there are some saved genoms
    ;{ select the parents
    Select SelectionMode_I
      Case 0 ; random
        idx = Random(n-1)
        SelectElement(genAlgListAll(), idx)
        gA_Load_genAlg(genAlgListAll()\Path_S, @Parent1)
        
      Case 1 ; roulettewheel
        *genAlgList = gA_RouletteWheelElement(genAlgListAll(), Elitism_I)
        If *genAlgList
          gA_Load_genAlg(*genAlgList\Path_S, @Parent1)
        Else
          gA_RandomizeWeights(*gA_Child)
          OK_I = 0
        EndIf
        
    EndSelect
    
    ; parent 2
    Repeat
      Select SelectionMode_I
        Case 0 ; random
          idx = Random(n-1)
          SelectElement(genAlgListAll(), idx)
          gA_Load_genAlg(genAlgListAll()\Path_S, @Parent2)
          
        Case 1 ; roulettewheel
          *genAlgList = gA_RouletteWheelElement(genAlgListAll(), Elitism_I)
          If *genAlgList
            gA_Load_genAlg(*genAlgList\Path_S, @Parent2)
          Else
            gA_RandomizeWeights(*gA_Child)
            OK_I = 0
          EndIf
          
      EndSelect
      
    Until @Parent1 <> @Parent2
    ;}
    
    If OK_I
      ;{ Crossover - Genom
      ; the child get the gens of parent0
      For x = 1 To Parent1\HiddenLayer_I ; hidden layers
        CopyMemory(Parent1\HiddenLayerWeight_P[x-1], *gA_Child\HiddenLayerWeight_P[x-1], MemorySize(Parent1\HiddenLayerWeight_P[x-1]))
      Next
      CopyMemory(Parent1\OutputWeight_P, *gA_Child\OutputWeight_P, MemorySize(Parent1\OutputWeight_P))
      *gA_Child\genAlgStrenght_I = 0 ; reset the strength
      
      If Crossoverrate_F <> 0.0
        ; select the Crossover-mode
        If CrossoverMode_I = 0 ; random
          a = Random(3) + 1 ; choose one of the modes 1..4 at random
        Else
          a = CrossoverMode_I
        EndIf 
        
        ; doing the crossover
        Select a
          Case 1 ; single point
            If Random(100) <= Crossoverrate_F * 100
              gA_Crossover(*gA_Child, @Parent2, -1, -1) ; random point, paste till end
            EndIf
            
          Case 2 ; double point
            If Random(100) <= Crossoverrate_F * 100
              gA_Crossover(*gA_Child, @Parent2, Random(GenomSize_I>>1), 8)
              gA_Crossover(*gA_Child, @Parent2, GenomSize_I>>1 + Random(GenomSize_I>>1), -1)
            EndIf
            
          Case 3 ; multi point
            If Random(100) <= Crossoverrate_F * 100
              For z = 0 To Crossoverrate_F * 10
                a = Random(GenomSize_I>>2 - 1)
                gA_Crossover(*gA_Child, @Parent2, a<<2, 4)
              Next
            EndIf
            
          Case 4 ; scramble
            If Random(100) <= Crossoverrate_F * 100
              For z = 0 To Crossoverrate_F * 10
                a = Random(GenomSize_I - 1)
                gA_Crossover(*gA_Child, @Parent2, a, 4)
              Next
            EndIf
            
        EndSelect
      EndIf
      ;}
      
      ; Mutation
      If MutationRate_F > 0 ; if there is a Mutation at all
        For x = 1 To *gA_Child\HiddenLayer_I ; hidden layers
          gA_Mutation(*gA_Child\HiddenLayerWeight_P[x-1], MutationRate_F)
        Next
        gA_Mutation(*gA_Child\OutputWeight_P, MutationRate_F)
      EndIf 
    EndIf
    
  Else ; else -> random
    gA_RandomizeWeights(*gA_Child)
    
  EndIf
  
EndProcedure
And finally, this is the reference image I've used (you've got to save it as "refimage.bmp" in the work directory of the program):
Image
Hurga
Enthusiast
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

First of all, the image is nice :)


Second, currently I´m a bit short on time, maybe I can have a look at your code later...

Third, the author of the Mona Lisa took a slightly different approach: he creates only 2 AI´s, duplicates the better one, and then mutates the copy.


I changed the include a bit to make it easier to use another way to evolve the genes. I put another example of KI use here: extrapolation of stock prices. (Not really well commented, sometimes in German, but maybe you can make some use of it)

http://purebasic.stdojedmahr.de/schnips ... rithm2.zip

here is the different way used in the example code...

Code: Select all

    Max.q = -9223372036854775808  
    *best_gA.BasisgenAlg = 0
    ForEach KI_List()
      ; Debug KI_List()\genAlgStrenght_I
      If KI_List()\genAlgStrenght_I > Max And KI_List()\genAlgStrenght_I <= 0
        Max = KI_List()\genAlgStrenght_I
        *best_gA = @KI_List()
      EndIf
    Next
    ; Debug 
    BestScore = *best_gA\genAlgStrenght_I
    If BestScore > Overallscore_I And BestScore <= 0
      Overallscore_I = BestScore
      gA_Save_genAlg("C:\temp\KITraining\" + StrFx(BestScore / (MaxRounds * Prognosetest), 2) + "  Kursprog " + Str(*best_gA\HiddenLayer_I) + "." + Str(*best_gA\HiddenLayerNeurons_I) + "  Gen " + Str(x) + ".ki", *best_gA)
    EndIf
    
    ForEach KI_List()
      If KI_List() <> *best_gA
        ; Debug KI_List()\genAlgStrenght_I
        ; Debug @KI_List()
        ; gA_Clone_genAlg(*best_gA, @KI_List(), 0) ; clone the Best KI
        gA_Mutation(@KI_List(), 0.01) ; mutate the new KI´s
      EndIf
      KI_List()\genAlgStrenght_I = 0 ; reset the score
    Next
Mainly, the mechanism works this way:

- find the best scored AI
- save it
- clone it and replace the other AI´s
- mutate them all

maybe this way it will work better.
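The four steps can be sketched as one compact loop (hedged - this assumes KI_List() holds BasisgenAlg structures and that gA_Clone_genAlg and gA_Mutation are used as in the snippet above; it is an illustration, not tested code):

```purebasic
; one generation of "find the best, save it, clone it, mutate the clones"
*best_gA.BasisgenAlg = 0
Max.q = -9223372036854775808
ForEach KI_List()                            ; step 1: find the best scored AI
  If KI_List()\genAlgStrenght_I > Max
    Max = KI_List()\genAlgStrenght_I
    *best_gA = @KI_List()
  EndIf
Next
gA_Save_genAlg("best.ki", *best_gA)          ; step 2: save it (path assumed)
ForEach KI_List()
  If @KI_List() <> *best_gA
    gA_Clone_genAlg(*best_gA, @KI_List(), 0) ; step 3: clone the best over the rest
    gA_Mutation(KI_List()\OutputWeight_P, 0.01) ; step 4: mutate the clone a bit
  EndIf
  KI_List()\genAlgStrenght_I = 0             ; reset the score for the next round
Next
```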

And at last... it took about 1 000 000 generations to create this Mona Lisa, so 300... well, maybe a few more would improve the result :wink:
Kelebrindae
Enthusiast
Enthusiast
Posts: 151
Joined: Tue Apr 01, 2008 3:23 pm

Post by Kelebrindae »

Hi, Hurga.
I didn't test your latest code yet, but I think there may be a small bug in the "gA_Init_Reproduction" procedure of your first include:
In this proc, when you reload all the previous AIs, you deduce the score of a given AI from its file name, like this:

Code: Select all

genAlgListAll()\Average_I = Val(StringField(DirectoryEntryName(0), 1, "-")) ; the score of the genAlg
But the score isn't the first token of the file name, it's the second one (file names are constructed like this: "[generation number]-[score].[extension]").
So I think the code should be:

Code: Select all

genAlgListAll()\Average_I = Val(StringField(DirectoryEntryName(0), 2, "-")) ; the score of the genAlg
Am I right?
Hurga
Enthusiast
Enthusiast
Posts: 148
Joined: Thu Jul 17, 2003 2:53 pm
Contact:

Post by Hurga »

Jepp, you are... mea culpa :oops:
Kelebrindae
Enthusiast
Enthusiast
Posts: 151
Joined: Tue Apr 01, 2008 3:23 pm

Post by Kelebrindae »

Another question (yeah, I can be very troublesome... That's because I'm really eager to use it :wink: ) :

In the "gA_Init_Reproduction" procedure, there's the following loop:

Code: Select all

  For x = 0 To Epoche_I
  If ExamineDirectory(0, Path_S, "*.*")
    [...look for every AI files with score > 0...]
            If Val(StringField(DirectoryEntryName(0), 2, "-")) > 0
              AddElement(genAlgListAll())
              genAlgListAll()\Path_S = Path_S + DirectoryEntryName(0)
              genAlgListAll()\Average_I = Val(StringField(DirectoryEntryName(0), 2, "-")) ; the score of the genAlg
              Kumm_average_I + genAlgListAll()\Average_I
              genAlgListAll()\AverageKumm_I = Kumm_average_I
            EndIf
    [...]
  Next 
"Epoche_I" is the the current generation number => At the first generation each AI is added to the list "genAlgListAll()" once, at the second generation they are added twice, at the third, they're added thrice, etc..
I'm a bit confused: what's the point of having the same AIs several times in the list (once per generation) ? :?

Thanks!
Post Reply