
Memory allocation - problems when RAM usage exceeds 1.2 GB

Posted: Sun Jul 22, 2007 11:18 am
by JoRo
I am the developer of GeoControl (terrain generator, written in PB).
More and more users (mostly game developers) are asking for terrains larger than 4096*4096 points. After building the option for 8192*8192 terrains into GC, memory allocation errors occurred.

To keep the RAM usage below 1.5 GB I used many "Global Dims", resizing arrays to zero and back to full size, permanently during execution. It is certain that no more than 1.2 GB of RAM was needed at any point. Nevertheless, Windows ran into problems and failed to manage the memory, with random allocation errors.

Does someone here know the reason? I know of many applications not written in PB, like Vue or Carrara, that have the same problem.

Would the memory behaviour change if someone ran GeoControl on a 64-bit system, or does the 32-bit emulation also limit Windows' internal memory management to 32 bits?

Requests for large terrains will grow fast, as the new game engines can handle 32768*32768 maps for shading.
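
For reference, the raw arithmetic (GC stores 4 bytes per point) - a quick sketch to print the sizes:

Code:

; Memory needed for a square terrain at 4 bytes per point
size.l = 4096
While size <= 32768
	Debug Str(size) + " * " + Str(size) + " points = " + Str(size * size / 1024 / 1024 * 4) + " MB"
	size = size * 2
Wend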

Posted: Sun Jul 22, 2007 12:11 pm
by Techie42
Hi JoRo,

My basic understanding of the Windows memory model (32-bit) is as follows:

Maximum 32-bit address space = 2 ^ 32 = 4GB

But MS Windows reserves the upper 2GB for the kernel, reducing the address space available to a process down to 2GB.

This remaining 2GB needs to accommodate the system routines/services, libraries (DLLs), heap and stack.

Therefore, the larger the requirements of the heap/stack/etc., the less address space is available for general use. So, if 500MB is used for heap/stack/DLLs, etc., that leaves around 1.5GB for dynamic allocation.

This might explain why you are now hitting limits.

I believe that in a 64-bit environment, as the address range is greater, the limits increase accordingly. There are still limits, but I'm not sure what they are.
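
If you want to see the actual figures for your own machine, something like this should do it (a rough, untested sketch - it assumes the kernel32 function GlobalMemoryStatusEx() can be called from PB as GlobalMemoryStatusEx_(), and declares the structure by hand in case your PB version doesn't ship it):

Code:

; Query physical RAM and virtual address space via the Windows API
Structure MEMSTATEX           ; hand-rolled MEMORYSTATUSEX (64 bytes)
	dwLength.l
	dwMemoryLoad.l
	ullTotalPhys.q
	ullAvailPhys.q
	ullTotalPageFile.q
	ullAvailPageFile.q
	ullTotalVirtual.q
	ullAvailVirtual.q
	ullAvailExtendedVirtual.q
EndStructure

Define mem.MEMSTATEX
mem\dwLength = SizeOf(MEMSTATEX)
If GlobalMemoryStatusEx_(@mem)
	; note: older PB versions may need StrQ() instead of Str() for quads
	Debug "Total physical RAM : " + Str(mem\ullTotalPhys / 1048576) + " MB"
	Debug "Free physical RAM  : " + Str(mem\ullAvailPhys / 1048576) + " MB"
	Debug "Total address space: " + Str(mem\ullTotalVirtual / 1048576) + " MB"   ; ~2048 MB for a normal 32-bit process
	Debug "Free address space : " + Str(mem\ullAvailVirtual / 1048576) + " MB"
EndIf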

There are other people far more clever than I who can probably give you a lot more clarification... it won't be long before they post :)

Posted: Sun Jul 22, 2007 12:26 pm
by Techie42
Hi again,

I just thought this article might be useful:

http://msdn.microsoft.com/msdnmag/issues/0700/hood/

It is a little dated (2000), but it gives a lot of info on the Windows memory management model.

Sorry if it is a bit heavy.

Posted: Sun Jul 22, 2007 12:29 pm
by Kaeru Gaman
Hi JoRo

you didn't mention the physical memory...
I guess the machine should have 2GB of RAM to avoid a lot of swapping when the program needs 1.2GB.
if you allocate and free memory very quickly and a lot of swapping is needed in the meantime,
I can frankly imagine Windows not always getting it right.

I don't know whether, in a 64-bit environment, a 32-bit prog could run in a full 4GB segment...

if you are heading toward the really big maps
> the new game engines can handle 32768*32768 maps for shading.
you should convert the whole project to 64-bit to be on the safe side.

Posted: Sun Jul 22, 2007 3:15 pm
by PureLust
Hi JoRo,

I've written some quick test code which reproduces this problem.

I'm not sure, but afaik Windows only swaps whole allocated blocks - it doesn't split these blocks up to swap them to the HDD.
So it works quite reliably with smaller memory blocks, but not with large ones.

As you can see in the test code, PB cannot allocate the memory for the Dim array if the required memory block is larger than the physically free memory in your system (you can check the physically free memory in the Task Manager, for example).

Code:

Structure BigStruct
	BigSpace.l[25600]		; reserves 100 KB for each structured element
EndStructure

For n = 5000 To 16000 Step 1000		; this loop reserves between 500 MB and 1.6 GB of RAM in 100 MB steps
	Debug "Trying to allocate "+Str(n * 102400)+" Bytes."
	Global Dim Test.BigStruct(0)	; release the old array first...
	Global Dim Test.BigStruct(n)	; ...then allocate it at the new size
	Debug "Memory address of the array: "+Str(@Test()) : Debug ""
	Test(n-1)\BigSpace[25599] = n	; touch the last element to prove the block is usable
	MessageRequester("Memory allocation by Dim() test",Str(n * 102400)+" Bytes allocated by 'Global Dim'."+#CRLF$+"Test value: "+Str(Test(n-1)\BigSpace[25599]))
Next n
With about 780 MB of physically free memory, the resulting debugger output looks like this:
Debugger wrote:
Trying to allocate 512000000 Bytes.
Memory address of the array: 23199796

Trying to allocate 614400000 Bytes.
Memory address of the array: 537460788

Trying to allocate 716800000 Bytes.
Memory address of the array: 537460788

Trying to allocate 819200000 Bytes.
Memory address of the array: 0
Greets, PL.

Posted: Sun Jul 22, 2007 3:56 pm
by Fred
Maybe it's due to memory fragmentation, as you request a single block of memory which needs to be contiguous. If you can allocate it in several chunks instead, that might solve the problem.
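
For illustration, chunked allocation could look roughly like this (a minimal sketch, not GC's real data layout - 32-bit addresses assumed; it requests 800 MB as eight 100 MB pieces via AllocateMemory(), so each piece only needs to be contiguous with itself):

Code:

#ChunkSize  = 104857600        ; 100 MB per chunk
#ChunkCount = 8                ; 8 * 100 MB = 800 MB in total

Dim Chunk.l(#ChunkCount - 1)   ; base address of each chunk (.l = 32-bit addresses)
ok = #True
For i = 0 To #ChunkCount - 1
	Chunk(i) = AllocateMemory(#ChunkSize)
	If Chunk(i) = 0
		Debug "Allocation of chunk " + Str(i) + " failed."
		ok = #False
		Break
	EndIf
Next i

If ok
	; address a "virtual" byte offset by picking the right chunk first
	offset = 123456789
	PokeL(Chunk(offset / #ChunkSize) + offset % #ChunkSize, 42)
	Debug PeekL(Chunk(offset / #ChunkSize) + offset % #ChunkSize)   ; -> 42
EndIf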

Posted: Sun Jul 22, 2007 4:38 pm
by PureLust
Hmmm... I just changed the above code to increase the array size step by step using ReDim().
At first it looked like the memory swapping to HDD would work.
But increasing the array size that way seems to fragment the memory badly, and it ends up with an "invalid memory access" (IMA) from ReDim().

Code:

Structure BigStruct
	BigSpace.l[25600]		; reserves 100 KB for each structured element
EndStructure

For n = 5000 To 16000 Step 1000		; this loop reserves between 500 MB and 1.6 GB of RAM in 100 MB steps
	Debug "Trying to allocate "+Str(n / 10)+" MByte." : Debug ""
	Global Dim Test.BigStruct(0)	; release the old array first
	For m = 1000 To n Step 1000		; grow the array in 100 MB steps instead of one big Dim
		ReDim Test.BigStruct(m)
		Debug "Allocating block "+Str(m/1000)+" - array size is now "+Str(m / 10)+" MByte"
		Debug "Memory address of the array: "+Str(@Test())
	Next m
	Debug ""
	Test(n-1)\BigSpace[25599] = n	; touch the last element to prove the block is usable
	MessageRequester("Memory allocation by ReDim() test",Str(n / 10)+" MByte allocated by 'ReDim'."+#CRLF$+"Test value: "+Str(Test(n-1)\BigSpace[25599]))
Next n
[Edit] @Fred:
On my system, ReDim() generates an IMA while trying to allocate 600 MB.
Just before ReDim() generates the IMA, the memory usage of the process grows to over 800 MB (observed with the Task Manager) - even though ReDim() should only increase the size to 6000 fields (600 MB).
Could it be that there is a bug in ReDim()?
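
That peak would at least be consistent with a grow-by-copy: if ReDim() has to allocate the new, larger block before it can copy the contents over and release the old one, both blocks exist at the same time. Whether PB's ReDim() really works that way internally is just my guess, but the pattern is this sketch:

Code:

; Grow-by-copy pattern: the old and the new buffer coexist during the copy,
; so growing a 500 MB array to 600 MB briefly needs well over 600 MB.
oldSize = 524288000                ; 500 MB
newSize = 629145600                ; 600 MB

*old = AllocateMemory(oldSize)
If *old
	; ... *old gets filled and used ...
	*new = AllocateMemory(newSize)     ; <- peak: oldSize + newSize allocated at once
	If *new
		CopyMemory(*old, *new, oldSize)  ; preserve the existing contents
		FreeMemory(*old)                 ; only now is the old block released
	EndIf
EndIf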

Posted: Sun Jul 22, 2007 4:59 pm
by JoRo
In some cases splitting the memory would be possible, but not in all. The single block/array size is about 260 MB (8192*8192*4).
I am thinking about tiling, so that the maximum block size would be 4096*4096*4, which runs rock stable - but that is only possible with swapping to HDD and back, which costs a lot more time. As the time increases with increasing size, this is a problem. A terrain calculation for an erosion might then take half a day.

I had hoped that on a 64-bit system the internal memory addressing would work with 64 bits, even if the application uses 32 bits.

Posted: Sun Jul 22, 2007 11:47 pm
by pdwyer
JoRo wrote: I had hoped that on a 64-bit system the internal memory addressing would work with 64 bits, even if the application uses 32 bits.
I strongly doubt it; I would guess this is very much OS related (or maybe PB related, I suppose). I wonder if the same thing happens on 2000/XP and Vista? Or, a better test: does it happen on Windows Server 2003, where memory management for larger apps is better?

Page fault counters in perfmon may help you determine which methods cause more swapping than others - for what you are doing, that sounds like an important area for performance optimisation.

Posted: Mon Jul 23, 2007 8:49 pm
by Rescator
pdwyer wrote:
JoRo wrote: I had hoped that on a 64-bit system the internal memory addressing would work with 64 bits, even if the application uses 32 bits.
I strongly doubt it; I would guess this is very much OS related (or maybe PB related, I suppose).
You need a 64bit program on a 64bit OS to take advantage of 2GB+ memory.
JoRo wrote:I am the developer of GeoControl (terrain generator, written in PB).
More and more users (mostly game developers) are asking for terrains larger than 4096*4096 points. After building the option for 8192*8192 terrains into GC, memory allocation errors occurred.
You need to rethink your memory model/use.
Also: (8192*8192)*4 (4 bytes per point, i.e. 32 bits)
sums up to 268,435,456 bytes. Not exactly 1.2GB, is it?


My advice is to store the whole thing on disk instead, as a kind of virtual bitmap, and only load into memory the parts that are displayed/used.
Load them in the form of tile sections. Sure, it's a bit slower, but you no longer have any size limits this way - besides available disk space, that is :)
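
A minimal sketch of that idea, under some assumptions (a raw file of 4-byte points stored row by row, fixed 1024*1024 tiles, and a hypothetical file name "terrain.raw" - not GC's actual format):

Code:

#TileSize = 1024
#TerrainW = 32768                   ; points per terrain row in the file

Procedure LoadTile(*buffer, tileX, tileY)
	If ReadFile(0, "terrain.raw")     ; hypothetical raw terrain file
		For row = 0 To #TileSize - 1
			offset.q = tileY * #TileSize + row                      ; row index in the whole terrain
			offset = (offset * #TerrainW + tileX * #TileSize) * 4   ; byte offset (quad, files can be > 2 GB)
			FileSeek(0, offset)
			ReadData(0, *buffer + row * #TileSize * 4, #TileSize * 4)
		Next row
		CloseFile(0)
		ProcedureReturn #True
	EndIf
	ProcedureReturn #False
EndProcedure

*tile = AllocateMemory(#TileSize * #TileSize * 4)    ; 4 MB per tile
If LoadTile(*tile, 5, 7)                             ; fetch tile column 5, row 7
	Debug "Tile loaded, first value: " + Str(PeekL(*tile))
EndIf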

Posted: Mon Jul 23, 2007 8:55 pm
by Kaeru Gaman
> My advice is to store the whole thing on disk
but this will be slow

> A terrain calculation for an erosion might then take half a day.
seems to be no alternative....

after all, the whole project seems destined for 64-bit anyway.

Posted: Mon Jul 23, 2007 9:04 pm
by Rescator
Most Geo projects like that tend to run on specially set-up hardware and software.
Trying to do it on a "desktop" computer will always, sadly, be a pain and slow.

A 64-bit app on a 64-bit OS will obviously work better, but that is still a desktop system - you will still get performance and memory issues.
Even 16GB of memory may not be enough if things grow really big.
You need (if you are to be "future proof") to deal with several TB of data.

Try googling for some source code out there; you might get a nice tip or two on disk caching for stuff like this.

Sorry that there is no easy or magic solution :)

Posted: Mon Jul 23, 2007 11:30 pm
by blueznl
Filemapping could perhaps be of help in this case? Though you would still run into OS filesystem limits.
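
Something along those lines, perhaps (an untested sketch calling the WinAPI directly from PB, assuming PB's usual Windows constants like #GENERIC_READ are available; "terrain.raw" is again a hypothetical file - it maps a 64 MB window of the file into the address space instead of reading it):

Code:

#ViewSize = 67108864               ; map a 64 MB window at a time

hFile = CreateFile_("terrain.raw", #GENERIC_READ, #FILE_SHARE_READ, 0, #OPEN_EXISTING, 0, 0)
If hFile <> #INVALID_HANDLE_VALUE
	hMap = CreateFileMapping_(hFile, 0, #PAGE_READONLY, 0, 0, 0)    ; size 0 = whole file
	If hMap
		; the file offset (here 0/0) must be a multiple of the 64 KB allocation granularity
		*view = MapViewOfFile_(hMap, #FILE_MAP_READ, 0, 0, #ViewSize)
		If *view
			Debug "First value in the view: " + Str(PeekL(*view))
			UnmapViewOfFile_(*view)
		EndIf
		CloseHandle_(hMap)
	EndIf
	CloseHandle_(hFile)
EndIf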

Posted: Mon Jul 23, 2007 11:58 pm
by Rescator
That depends on how filemapping works. If it's Windows 5.x or 6.x, then 64-bit file sizes should work OK, and I assume filemapping behaves in a similar way, with the bonus of the OS's cache/load implementation? I like it - good idea, blueznl :)