I'm working on an application that's intended to run concurrently by all users on a Windows Terminal Server, so reducing memory consumption is very important. That's why I want to destroy and recreate the child window instead of just hiding it.
If this is the final answer, then it makes things complicated. It means I will have to stress-test memory to check that memory is freed up as expected, to make sure my code has no memory leaks. As a Terminal Server easily contains 10 GB of memory, the memory consumption of the application can grow enormously. I don't think the system engineers will be happy with this behaviour.
I just ran your program with both delays changed to 200 and Task Manager open, and the memory used went up and down at a fairly constant rate. How fast do you intend to open and close windows? It looks like Windows keeps releasing the memory quite steadily, so it shouldn't be a problem.
That's strange; in my case memory usage doesn't go down when the window is closed. When I changed the delays to 200 like you did, memory usage only increases by 4 kB every 10 seconds.
It looks like I have a similar problem with 'OpenLibrary' and 'CloseLibrary'. After calling 'CloseLibrary', only a part of the memory is released. When calling 'OpenLibrary'/'CloseLibrary' repeatedly, average memory usage increases. I don't understand why only a part of the memory is released.
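One possible explanation (an assumption, not confirmed from this thread): the OS loader reference-counts loaded libraries, so a library's memory is only released when the *last* reference to it is closed. If something else in the process still holds a reference, one CloseLibrary call releases little or nothing. A minimal C sketch of that bookkeeping — the `lib_ref` type and `lib_open`/`lib_close` functions are invented for illustration, not a real loader API:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical model of the loader's per-library bookkeeping:
   the library image is released only when its reference count
   drops to zero, which is one reason a single CloseLibrary may
   appear to free only part of what OpenLibrary consumed. */
typedef struct {
    char  *image;    /* simulated mapped library image */
    size_t size;
    int    refcount;
} lib_ref;

/* Opening an already-loaded library just bumps the count. */
static void lib_open(lib_ref *lib, size_t size)
{
    if (lib->refcount == 0) {
        lib->image = malloc(size);   /* load on first open only */
        lib->size  = size;
    }
    lib->refcount++;
}

/* A close releases the image only on the last reference. */
static void lib_close(lib_ref *lib)
{
    if (lib->refcount > 0 && --lib->refcount == 0) {
        free(lib->image);
        lib->image = NULL;
        lib->size  = 0;
    }
}
```

So if the library (or one of its dependencies) is opened twice but closed once, its memory stays resident, which would look exactly like "only part of the memory is released".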
The delays used in the example are set very short to demonstrate the memory issue. In practice there will be at least 10 minutes between the popups. However, the combination of the CloseWindow issue and the CloseLibrary issue (and the fact that 30 instances of the application will run simultaneously) leads to a significant increase in memory usage.
It looks like all my attempts to reduce the memory footprint of my application have had the opposite effect. I just changed the popup window from being recreated to being hidden, and moved the code from the library into a procedure in my main program. Now the initial memory footprint of the application is slightly higher, but at least it remains constant.
I don't think you understand: the reported memory usage isn't what the application actually needs. It is just tagged as used because it's faster to leave it marked as used than to actually free it. So it's not freed, for performance reasons. It is released now and then, and whenever more memory is needed.
But it's a good thing that it isn't freed at once, because of performance.
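The behaviour described above is typical allocator caching: the runtime keeps freed blocks on a free list for fast reuse instead of returning pages to the OS, so the reported usage stays flat even after the program frees memory. A minimal sketch, assuming a fixed block size — the `pool_alloc`/`pool_free` names are made up for illustration:

```c
#include <assert.h>
#include <stdlib.h>

/* Minimal free-list cache: freed blocks are kept for reuse rather
   than returned to the OS, so the process's reported memory usage
   stays the same even though the program has "freed" the memory. */
typedef struct node { struct node *next; } node;

static node *free_list = NULL;
enum { BLOCK_SIZE = 256 };

static void *pool_alloc(void)
{
    if (free_list) {                 /* fast path: reuse a cached block */
        node *n = free_list;
        free_list = n->next;
        return n;
    }
    return malloc(BLOCK_SIZE);       /* slow path: grow from the OS */
}

static void pool_free(void *p)       /* never calls free(): just caches */
{
    node *n = p;
    n->next = free_list;
    free_list = n;
}
```

Reusing a cached block is just a pointer swap, which is why leaving memory "tagged as used" is faster than handing it back to the OS on every free.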
PB's functions have been nice so far regarding cleanup (the basics, at least).
But to clarify (I think this has been discussed somewhere in the past) how Windows allocates and deallocates memory:
If you compile an exe that uses parts of external libs (for example user32.dll or other system libs), Windows will reserve memory for the lib instance your exe is linked to. If you reserve memory in your program and then deallocate it, Windows won't free it immediately; it will free it when it needs to (e.g. when RAM runs low). You will also notice that if you minimize your app and leave it minimized for a while, memory usage goes down. This is because Windows cleans up its memory only under certain conditions: the exe is closed and all instances of its linked libs are removed, or the exe is minimized, or memory is full, in which case parts that still need memory are moved to swap space and unregistered parts are flushed (deallocated).
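The minimize effect described above is Windows trimming the process's working set. A program can request the same trim explicitly with the Win32 call SetProcessWorkingSetSize; note this only shrinks the *reported* resident memory, not what the process has allocated. A hedged sketch, guarded so it compiles as a no-op off Windows (the `trim_working_set` wrapper is invented for the example):

```c
#include <stddef.h>
#ifdef _WIN32
#include <windows.h>
#endif

/* Ask the OS to trim this process's working set, mimicking what
   happens when a window is minimized. Off Windows this is a no-op
   so the sketch stays compilable. Returns 1 on success. */
static int trim_working_set(void)
{
#ifdef _WIN32
    /* -1/-1 tells Windows to trim as many resident pages as it can. */
    return SetProcessWorkingSetSize(GetCurrentProcess(),
                                    (SIZE_T)-1, (SIZE_T)-1) != 0;
#else
    return 1;   /* nothing to trim off Windows */
#endif
}
```

This can make Task Manager numbers look better, but pages the app still touches will simply be faulted back in, so it doesn't reduce real memory demand.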
If you plan your code right, you can also clean it up easily. So, assuming cleanup works correctly, there are a few things you can do to keep memory usage pretty much under control. For one, use as few global or plain variables as possible; use structures and pointers instead (it's faster too). That way you can easily clean up your memory just by freeing your structured vars, arrays, or lists. Using a lot of "floating" vars can become pretty hard to free, as you have to do it for each var you no longer need.
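The advice above — grouping allocations into one structure so they can be released in a single call instead of chasing many loose variables — can be sketched in C like this (the `app_state` type and its fields are invented for the example):

```c
#include <assert.h>
#include <stdlib.h>

/* All of the program's dynamic data hangs off one structure... */
typedef struct {
    double *samples;
    char   *log_buffer;
    int    *ids;
} app_state;

static app_state *state_create(size_t n)
{
    app_state *s = calloc(1, sizeof *s);
    if (!s) return NULL;
    s->samples    = calloc(n, sizeof *s->samples);
    s->log_buffer = calloc(4096, 1);
    s->ids        = calloc(n, sizeof *s->ids);
    return s;
}

/* ...so cleanup is one call instead of tracking scattered vars. */
static void state_destroy(app_state *s)
{
    if (!s) return;
    free(s->samples);
    free(s->log_buffer);
    free(s->ids);
    free(s);
}
```

One create call, one destroy call: nothing is left "floating", so a leak means exactly one function is wrong, which is much easier to verify than auditing every variable.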
Thalius
"In 3D there is never enough Time to do Things right,
but there's always enough Time to make them *look* right." "psssst! i steal signatures... don't tell anyone! "