
Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 6:13 am
by PureLeo
Hi...

I have this code,

Code:

Define.q

size=FileSize("teste")

ReadFile(0,"teste")
CreateFile(1,"teste2")

*a=AllocateMemory(size)

ReadData(0,*a,size)
WriteData(1,*a,size)

FlushFileBuffers(1)
CloseFile(1)
CloseFile(0)

Debug FileSize("teste")
Debug FileSize("teste2")
It says "Can't allocate a memory block with a negative size" when the file is too big, around 4 GB (the amount of RAM I have).

Of course this is just test code, but I intend to make something like a file splitter. Is there another way to read/write data without using a buffer, or some other way to do this?

Thanks in advance!

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 6:45 am
by netmaestro
*a=AllocateMemory(size)
ReadData(0,*a,size)
Better is:

Code:

*a=AllocateMemory(size)
If *a
  ; go to town
Else
  *a = AllocateMemory(size/2)
EndIf
If *a
  ...etc.
EndIf
Work with whatever chunksize you can handle. But test first to find out what size that is.
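The retry idea above can be generalised into a loop that keeps halving the request until an allocation succeeds. This is just a minimal sketch; the 512 MB starting size and the 4 KB lower bound are arbitrary placeholders:

```purebasic
; Sketch of the fallback idea: halve the requested size until
; AllocateMemory() succeeds. Starting size and minimum are placeholders.
size = 512 * 1024 * 1024
*a = AllocateMemory(size)
While *a = 0 And size > 4096
  size / 2                   ; PB shorthand for size = size / 2
  *a = AllocateMemory(size)
Wend
If *a
  Debug "Got a buffer of " + Str(size) + " bytes"
  FreeMemory(*a)
EndIf
```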

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 7:16 am
by infratec
Hi,

try this:

Code:

#FileCopyBufferSize = 10240

Procedure FileCopy(Source$, Destination$)
 
  Protected Result
  Protected ReadFile, WriteFile
  Protected *Buffer
  Protected StartTime, EndTime
  Protected Size
  
  Result = -1
  
  ReadFile = ReadFile(#PB_Any, Source$)
  If ReadFile
    WriteFile = CreateFile(#PB_Any, Destination$)
    If WriteFile
      
      *Buffer = AllocateMemory(#FileCopyBufferSize)
      If *Buffer
        
  ;      FileBuffersSize(ReadFile, #FileCopyBufferSize)
  ;      FileBuffersSize(WriteFile, #FileCopyBufferSize)
        
        StartTime = ElapsedMilliseconds()
        
        While Not Eof(ReadFile)
          Size = ReadData(ReadFile, *Buffer, #FileCopyBufferSize)
          WriteData(WriteFile, *Buffer, Size)
        Wend
        
        EndTime = ElapsedMilliseconds()
        Result = EndTime - StartTime
        
        FreeMemory(*Buffer)
      EndIf
      CloseFile(WriteFile)
    EndIf
    CloseFile(ReadFile)
  EndIf
  
  ProcedureReturn Result
  
EndProcedure

Debug FileCopy("test1", "test2")

Debug FileSize("test1")
Debug FileSize("test2")
This way other running programs will still have RAM available :D

See what happens if you uncomment the FileBuffersSize() calls. (Don't increase the size too much.)
You can also play with #FileCopyBufferSize itself.

Bernd

P.S.:
If you test it on a network drive, the bottleneck is your network speed and not the RAM :mrgreen:

P.P.S.:
Why don't you use CopyFile() :?:

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 6:15 pm
by PureLeo
Hi! Thanks for your replies!

Maybe you didn't understand me :oops:

Actually I don't want to copy files, I want to split a VERY big file (4 GB+) into smaller files...
But for that I need ReadData(), and for that I need a large buffer... When I try to allocate 4 GB of memory I get an error saying I can't allocate a memory block with a negative size if I compile in 32 bits, but it seems to work when I run it in 64 bits.

It just doesn't let me do this in 32 bits:

Code:

*a=AllocateMemory(4*1024*1024*1024)
It seems it is not possible to read blocks of data from a given offset in the file with ReadData() either, or else I could, for instance, read from 0 to 1000, then 1001 to 2000, then 2001 to 3000 and so on...

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 6:35 pm
by moogle
PureLeo wrote:It just doesn't let me do this in 32bits:

Code:

*a=AllocateMemory(4*1024*1024*1024)
It seems it is not possible to read blocks of data from a given offset in the file with ReadData() either, or else I could, for instance, read from 0 to 1000, then 1001 to 2000, then 2001 to 3000 and so on...
I'm guessing that's because a 32-bit program can't access that much memory (not saying you don't have enough); it's a limitation in Windows (I assume this is Windows) itself. I have a program called memtest that is 32-bit and could only access 2.5 GB. After I patched it with a memory patch I found on the internet, it can access 3.3 GB.

I think you should search for information about 32-bit programs accessing large amounts of memory.


You can read blocks by using an offset + *Memory pointer I think.

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 6:43 pm
by spikey
You don't need to read the whole file into memory before starting to write it out again. Work with it in chunks...

Work out how big a chunk you can afford to work with - or want to work with or whatever. Then use a quad as a file location pointer - incrementing this by the chunk size. Then use FileSeek to set the file position to read from. Read in the chunk, write it back out to the destination file before going round again...

See FileSeek in the online help for an example.
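The chunk-and-seek recipe described above could look roughly like this. This is only a sketch: "bigfile" and the 10 MB chunk size are placeholders, and the part that writes each chunk to a destination file is left out:

```purebasic
; Sketch of the chunked approach: a quad file position plus FileSeek().
; "bigfile" and the 10 MB chunk size are placeholders.
#ChunkSize = 10 * 1024 * 1024

Define pos.q = 0
Define total.q = FileSize("bigfile")
Define *buf = AllocateMemory(#ChunkSize)

If *buf And ReadFile(0, "bigfile")
  While pos < total
    FileSeek(0, pos)                       ; jump to the current chunk
    bytes = ReadData(0, *buf, #ChunkSize)  ; reads at most #ChunkSize bytes
    ; ... write the bytes just read to the current destination part ...
    pos + bytes                            ; advance the quad file position
  Wend
  CloseFile(0)
EndIf
If *buf
  FreeMemory(*buf)
EndIf
```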

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 7:11 pm
by netmaestro
Here's a test I knocked together for you. Before I ran the code the only files in the folder were armagedn.iso and testsplit.pb:

Code:

OpenFile(0, "armagedn.iso")
CreateFile(1, "armagednpart.001")
*mem = AllocateMemory(100*1024*1024)
written = 0
cc = 1
While Not Eof(0)
  While written < 500*1024*1024
    thisread = ReadData(0, *mem, MemorySize(*mem))
    WriteData(1, *mem, thisread)
    written + thisread
    If thisread < MemorySize(*mem)
      Break
    EndIf
  Wend
  CloseFile(1)
  If thisread = MemorySize(*mem) ; There is more data to read
    written = 0
    cc+1
    CreateFile(1, "armagednpart." + RSet(Str(cc),3, "0"))
  EndIf
Wend
CloseFile(0)
FreeMemory(*mem)
Now the folder looks like this:

[screenshot: the folder now also contains armagednpart.001, armagednpart.002, etc.]

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 7:26 pm
by infratec
Oh,

I'm too late :oops:

Code:

#FileCopyBufferSize = 10240

Procedure SplitFile(Source$, PartSize.q)

  Protected Result
  Protected PartCounter
  Protected ReadFile, WriteFile
  Protected *Buffer
  Protected StartTime, EndTime
  Protected Size, BufferSize, ChunkSize
  Protected WriteSize.q
  
  Result = -1
  
  ReadFile = ReadFile(#PB_Any, Source$)
  If ReadFile
    
    PartCounter = 0
    
    WriteFile = CreateFile(#PB_Any, Source$ + "." + RSet(Str(PartCounter), 3, "0"))
    If WriteFile
      
      If #FileCopyBufferSize > PartSize
        BufferSize = PartSize
      Else
        BufferSize = #FileCopyBufferSize
      EndIf
      
      *Buffer = AllocateMemory(BufferSize)
      If *Buffer
        
;        FileBuffersSize(ReadFile, BufferSize)
;        FileBuffersSize(WriteFile, BufferSize)
       
        StartTime = ElapsedMilliseconds()
        
        WriteSize.q = 0
        ChunkSize = BufferSize
        While Not Eof(ReadFile)
          If WriteSize + ChunkSize > PartSize
            ChunkSize = PartSize - WriteSize
          Else
            ChunkSize = BufferSize
          EndIf
          Size = ReadData(ReadFile, *Buffer, ChunkSize)
          WriteData(WriteFile, *Buffer, Size)
          WriteSize + Size
          If WriteSize = PartSize
            CloseFile(WriteFile)
            PartCounter + 1
            WriteFile = CreateFile(#PB_Any, Source$ + "." + RSet(Str(PartCounter), 3, "0"))
            WriteSize = 0
          EndIf
          
        Wend
       
        EndTime = ElapsedMilliseconds()
        Result = EndTime - StartTime
       
        FreeMemory(*Buffer)
      EndIf
      CloseFile(WriteFile)
    EndIf
    CloseFile(ReadFile)
  EndIf
 
  ProcedureReturn Result
 
EndProcedure

Debug SplitFile("test1", FileSize("test1") / 10)
But that's my version.
More complicated.

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 7:40 pm
by PureLeo
I thought I couldn't use an offset to read data from some point with ReadData()

Netmaestro, your example shows that ReadData() keeps reading from where it stopped, instead of reading everything again from the beginning, thank you!

Thank you all for your time :)

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 11:24 pm
by tinman
@PureLeo: AllocateMemory() takes an integer for its size, so on a 32-bit system anything more than 2 GB will be treated as a negative value because PB uses signed types.

@netmaestro: when trying to allocate a large buffer (in your earlier post), wouldn't you be better off checking the amount of free memory before trying to allocate the whole size? I can successfully allocate 2 GB of memory on this machine (Windows) for a copy buffer, but so many other things get moved into swap that the system still slows to a crawl.
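On Windows, one way to check available physical RAM before sizing the buffer is the GlobalMemoryStatusEx() API. A minimal sketch, assuming the MEMORYSTATUSEX structure is available in your PureBasic version on a Windows target; the "quarter of free RAM, capped at 100 MB" policy is an arbitrary placeholder:

```purebasic
; Windows-only sketch: size the copy buffer from available physical RAM.
; Assumes MEMORYSTATUSEX and GlobalMemoryStatusEx_() are available.
Define mem.MEMORYSTATUSEX
mem\dwLength = SizeOf(MEMORYSTATUSEX)
If GlobalMemoryStatusEx_(@mem)
  ; placeholder policy: use at most a quarter of the free physical RAM,
  ; capped at 100 MB
  Define BufferSize.q = mem\ullAvailPhys / 4
  If BufferSize > 100 * 1024 * 1024
    BufferSize = 100 * 1024 * 1024
  EndIf
  Debug "Buffer size: " + Str(BufferSize)
EndIf
```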

Re: Open File larger than the amount of ram you have

Posted: Fri Aug 05, 2011 11:28 pm
by netmaestro
wouldn't you be better checking the amount of free memory before trying to allocate the whole size?
Spot on, very good point. And that's the reason my coded solution takes a different form.

Re: Open File larger than the amount of ram you have

Posted: Sat Aug 06, 2011 5:33 am
by PureLeo
@tinman Yes, that's it...

At first I thought I was hitting the RAM limits, but you're right...
I also used "Define.q" hoping to solve the issue, but of course that didn't help :/

Re: Open File larger than the amount of ram you have

Posted: Sat Aug 06, 2011 5:41 am
by netmaestro
When I started programming, if you could successfully allocate 8 KB of RAM you were on cloud nine. How things have changed; I'm still trying to work out if it's for the better or not :?

Twitter, Facebook, porn.. beam me up, Scotty :shock:

Re: Open File larger than the amount of ram you have

Posted: Sat Aug 06, 2011 6:06 am
by PureLeo
My professor mentioned that when he was talking about the importance of pointers :D I wasn't even born back then...

Re: Open File larger than the amount of ram you have

Posted: Mon Aug 08, 2011 1:06 pm
by El_Choni
Hi,

I think maybe file mapping would be the way to go when handling very large files - in Windows, at least.

http://msdn.microsoft.com/en-us/library ... 85%29.aspx
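The file-mapping idea could be sketched in PureBasic via the Win32 API like this. Windows-only, and only a sketch: "bigfile" is a placeholder, and a real splitter would move the mapped view along the file (view offsets must be multiples of the system allocation granularity):

```purebasic
; Windows-only sketch of the file-mapping idea ("bigfile" is a placeholder).
; Maps a window of the file into memory instead of reading it into a buffer.
hFile = CreateFile_("bigfile", #GENERIC_READ, #FILE_SHARE_READ, 0, #OPEN_EXISTING, 0, 0)
If hFile <> #INVALID_HANDLE_VALUE
  hMap = CreateFileMapping_(hFile, 0, #PAGE_READONLY, 0, 0, 0)
  If hMap
    ; map only the first 64 KB; a splitter would slide this window along
    ; the file (offset must be a multiple of the allocation granularity)
    *view = MapViewOfFile_(hMap, #FILE_MAP_READ, 0, 0, 65536)
    If *view
      Debug PeekB(*view)        ; first byte of the file
      UnmapViewOfFile_(*view)
    EndIf
    CloseHandle_(hMap)
  EndIf
  CloseHandle_(hFile)
EndIf
```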