Open File larger than the amount of ram you have

Just starting out? Need help? Post your questions and find answers here.
PureLeo
Enthusiast
Posts: 221
Joined: Fri Jan 29, 2010 1:05 pm
Location: Brazil

Open File larger than the amount of ram you have

Post by PureLeo »

Hi...

I have this code,

Code: Select all

Define.q

size=FileSize("teste")

ReadFile(0,"teste")
CreateFile(1,"teste2")

*a=AllocateMemory(size)

ReadData(0,*a,size)
WriteData(1,*a,size)

FlushFileBuffers(1)
CloseFile(1)
CloseFile(0)

Debug FileSize("teste")
Debug FileSize("teste2")
It says "Can't allocate a memory block with a negative size" when the file is too big - around 4 GB (the amount of RAM I have).

Of course this is just test code, but I intend to make something like a file splitter. Is there another way to read/write data without using a buffer, or any other way to do that?

Thanks in advance!
netmaestro
PureBasic Bullfrog
Posts: 8451
Joined: Wed Jul 06, 2005 5:42 am
Location: Fort Nelson, BC, Canada

Re: Open File larger than the amount of ram you have

Post by netmaestro »

*a=AllocateMemory(size)
ReadData(0,*a,size)
Better is:

Code: Select all

*a = AllocateMemory(size)
If *a
  ; go to town
Else
  *a = AllocateMemory(size/2)
EndIf
If *a
  ...etc.
EndIf
Work with whatever chunksize you can handle. But test first to find out what size that is.
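That retry idea can be turned into a small loop - a rough sketch (the file name and the 1 MB floor are just placeholders):

Code: Select all

; Start from the file size (capped below the 32-bit signed limit)
; and halve until an allocation succeeds or we hit a 1 MB floor.
chunksize.q = FileSize("teste")
If chunksize > 1073741824
  chunksize = 1073741824 ; stay well under 2 GB on a 32-bit compile
EndIf

*a = 0
While *a = 0 And chunksize >= 1024*1024
  *a = AllocateMemory(chunksize)
  If *a = 0
    chunksize / 2
  EndIf
Wend

If *a
  Debug "Working buffer: " + Str(MemorySize(*a)) + " bytes"
  FreeMemory(*a)
EndIf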
BERESHEIT
infratec
Always Here
Posts: 7577
Joined: Sun Sep 07, 2008 12:45 pm
Location: Germany

Re: Open File larger than the amount of ram you have

Post by infratec »

Hi,

try this:

Code: Select all

#FileCopyBufferSize = 10240

Procedure FileCopy(Source$, Destination$)
 
  Protected Result
  Protected ReadFile, WriteFile
  Protected *Buffer
  Protected StartTime, EndTime
  Protected Size
  
  Result = -1
  
  ReadFile = ReadFile(#PB_Any, Source$)
  If ReadFile
    WriteFile = CreateFile(#PB_Any, Destination$)
    If WriteFile
      
      *Buffer = AllocateMemory(#FileCopyBufferSize)
      If *Buffer
        
  ;      FileBuffersSize(ReadFile, #FileCopyBufferSize)
  ;      FileBuffersSize(WriteFile, #FileCopyBufferSize)
        
        StartTime = ElapsedMilliseconds()
        
        While Not Eof(ReadFile)
          Size = ReadData(ReadFile, *Buffer, #FileCopyBufferSize)
          WriteData(WriteFile, *Buffer, Size)
        Wend
        
        EndTime = ElapsedMilliseconds()
        Result = EndTime - StartTime
        
        FreeMemory(*Buffer)
      EndIf
      CloseFile(WriteFile)
    EndIf
    CloseFile(ReadFile)
  EndIf
  
  ProcedureReturn Result
  
EndProcedure

Debug FileCopy("test1", "test2")

Debug FileSize("test1")
Debug FileSize("test2")
This way other running programs will still have RAM available :D

Try what happens if you uncomment the FileBuffersSize() calls. (Don't increase the buffer size too much.)
You can also play with #FileCopyBufferSize itself.

Bernd

P.S.:
If you test it on a network drive, the bottleneck is your network speed and not the RAM :mrgreen:

P.P.S.:
Why don't you use CopyFile()? :?:
PureLeo
Enthusiast
Posts: 221
Joined: Fri Jan 29, 2010 1:05 pm
Location: Brazil

Re: Open File larger than the amount of ram you have

Post by PureLeo »

Hi! Thanks for your replies!

Maybe you didn't understand me :oops:

Actually I don't want to copy files, I want to split a VERY big file (4 GB+) into small files...
But for that I need ReadData(), and for that I need a large buffer... When I try to allocate 4 GB of memory, I get an error saying I can't allocate a memory block with a negative size if I compile in 32 bits - it seems to work when I run it in 64 bits.

It just doesn't let me do this in 32bits:

Code: Select all

*a=AllocateMemory(4*1024*1024*1024)
It seems it is not possible to read blocks of data from the specified file with ReadData() either, or else I could, for instance, read from 0 to 1000, then 1001 to 2000, 2001 to 3000 and so on...
Last edited by PureLeo on Fri Aug 05, 2011 6:59 pm, edited 1 time in total.
moogle
Enthusiast
Posts: 372
Joined: Tue Feb 14, 2006 9:27 pm
Location: London, UK

Re: Open File larger than the amount of ram you have

Post by moogle »

PureLeo wrote:It just doesn't let me do this in 32bits:

Code: Select all

*a=AllocateMemory(4*1024*1024*1024)
It seems it is not possible to read blocks of data from the specified file with ReadData() either, or else I could, for instance, read from 0 to 1000, then 1001 to 2000, 2001 to 3000 and so on...
I'm guessing that's because a 32-bit program can't access that much memory (not saying you don't have enough); it's a limitation in Windows (I assume this is Windows) itself. I have a program called memtest that is 32-bit and can only access 2.5 GB. After I patched it with a memory patch I found on the internet it can access 3.3 GB.

I think you should have a search about 32-bit programs accessing large amounts of memory.


You can read blocks by using an offset + *Memory pointer I think.
spikey
Enthusiast
Posts: 750
Joined: Wed Sep 22, 2010 1:17 pm
Location: United Kingdom

Re: Open File larger than the amount of ram you have

Post by spikey »

You don't need to read the whole file into memory before starting to write it out again. Work with it in chunks...

Work out how big a chunk you can afford to work with - or want to work with, or whatever. Then use a quad as a file location pointer, incrementing it by the chunk size. Use FileSeek() to set the file position to read from, read in the chunk, and write it back out to the destination file before going round again...

See FileSeek in the online help for an example.
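A rough sketch of that loop (the file names and the 1 MB chunk size are placeholders, and error handling is kept minimal):

Code: Select all

#ChunkSize = 1024*1024 ; 1 MB per pass - pick whatever you can afford

Define.q pos, total

*buf = AllocateMemory(#ChunkSize)
total = FileSize("bigfile.dat")
If *buf And ReadFile(0, "bigfile.dat") And CreateFile(1, "bigfile.out")
  pos = 0
  While pos < total
    FileSeek(0, pos)                      ; quad file position
    bytes = ReadData(0, *buf, #ChunkSize) ; returns the bytes actually read
    WriteData(1, *buf, bytes)
    pos + bytes                           ; advance by what we really got
  Wend
  CloseFile(1)
  CloseFile(0)
EndIf
If *buf : FreeMemory(*buf) : EndIf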
netmaestro
PureBasic Bullfrog
Posts: 8451
Joined: Wed Jul 06, 2005 5:42 am
Location: Fort Nelson, BC, Canada

Re: Open File larger than the amount of ram you have

Post by netmaestro »

Here's a test I knocked together for you. Before I ran the code the only files in the folder were armagedn.iso and testsplit.pb:

Code: Select all

OpenFile(0, "armagedn.iso")
CreateFile(1, "armagednpart.001")
*mem = AllocateMemory(100*1024*1024)
written = 0
cc = 1
While Not Eof(0)
  While written < 500*1024*1024
    thisread = ReadData(0, *mem, MemorySize(*mem))
    WriteData(1, *mem, thisread)
    written + thisread
    If thisread < MemorySize(*mem)
      Break
    EndIf
  Wend
  CloseFile(1)
  If thisread = MemorySize(*mem) And Not Eof(0) ; more data to read - avoids a trailing empty part
    written = 0
    cc+1
    CreateFile(1, "armagednpart." + RSet(Str(cc),3, "0"))
  EndIf
Wend
CloseFile(0)
Now the folder looks like this:

[screenshot: the folder now also contains the armagednpart.001, .002, ... split files]
BERESHEIT
infratec
Always Here
Posts: 7577
Joined: Sun Sep 07, 2008 12:45 pm
Location: Germany

Re: Open File larger than the amount of ram you have

Post by infratec »

Oh,

I'm too late :oops:

Code: Select all

#FileCopyBufferSize = 10240

Procedure SplitFile(Source$, PartSize.q)

  Protected Result
  Protected PartCounter
  Protected ReadFile, WriteFile
  Protected *Buffer
  Protected StartTime, EndTime
  Protected Size
  
  Result = -1
  
  ReadFile = ReadFile(#PB_Any, Source$)
  If ReadFile
    
    PartCounter = 0
    
    WriteFile = CreateFile(#PB_Any, Source$ + "." + RSet(Str(PartCounter), 3, "0"))
    If WriteFile
      
      If #FileCopyBufferSize > PartSize
        BufferSize = PartSize
      Else
        BufferSize = #FileCopyBufferSize
      EndIf
      
      *Buffer = AllocateMemory(BufferSize)
      If *Buffer
        
;        FileBuffersSize(ReadFile, BufferSize)
;        FileBuffersSize(WriteFile, BufferSize)
       
        StartTime = ElapsedMilliseconds()
        
        WriteSize.q = 0
        ChunkSize = BufferSize
        While Not Eof(ReadFile)
          If WriteSize + ChunkSize > PartSize
            ChunkSize = PartSize - WriteSize
          Else
            ChunkSize = BufferSize
          EndIf
          Size = ReadData(ReadFile, *Buffer, ChunkSize)
          WriteData(WriteFile, *Buffer, Size)
          WriteSize + Size
          If WriteSize = PartSize And Not Eof(ReadFile) ; avoids creating an empty extra part at the end
            CloseFile(WriteFile)
            PartCounter + 1
            WriteFile = CreateFile(#PB_Any, Source$ + "." + RSet(Str(PartCounter), 3, "0"))
            WriteSize = 0
          EndIf
          
        Wend
       
        EndTime = ElapsedMilliseconds()
        Result = EndTime - StartTime
       
        FreeMemory(*Buffer)
      EndIf
      CloseFile(WriteFile)
    EndIf
    CloseFile(ReadFile)
  EndIf
 
  ProcedureReturn Result
 
EndProcedure

Debug SplitFile("test1", FileSize("test1") / 10)
But that's my version - more complicated.
PureLeo
Enthusiast
Posts: 221
Joined: Fri Jan 29, 2010 1:05 pm
Location: Brazil

Re: Open File larger than the amount of ram you have

Post by PureLeo »

I thought I couldn't use an offset to read data from some point with ReadData()

netmaestro, your example shows that ReadData() keeps reading from where it stopped, instead of reading all again from the beginning - thank you!

Thank you all for your time :)
tinman
PureBasic Expert
Posts: 1102
Joined: Sat Apr 26, 2003 4:56 pm
Location: Level 5 of Robot Hell
Contact:

Re: Open File larger than the amount of ram you have

Post by tinman »

@PureLeo: AllocateMemory() takes an integer for its size, so on a 32-bit system anything more than 2 GB will be treated as a negative value because PB uses signed types.
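That limit is easy to guard against up front - a sketch (2147483647 is the largest positive 32-bit signed value):

Code: Select all

size.q = FileSize("teste") ; a quad holds the real size fine

; On a 32-bit compile SizeOf(Integer) is 4, so any size above
; 2 GB - 1 wraps negative by the time AllocateMemory() sees it.
If SizeOf(Integer) = 4 And size > 2147483647
  Debug "Too big for a single 32-bit allocation - read it in chunks instead"
Else
  *a = AllocateMemory(size)
EndIf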

@netmaestro: when trying to allocate a large buffer (in your earlier post), wouldn't you be better off checking the amount of free memory before trying to allocate the whole size? I can successfully allocate 2 GB of memory on this machine (Windows) for a copy buffer, but so many other things get moved into swap that the system still slows to a crawl.
If you paint your butt blue and glue the hole shut you just themed your ass but lost the functionality.
(WinXPhSP3 PB5.20b14)
netmaestro
PureBasic Bullfrog
Posts: 8451
Joined: Wed Jul 06, 2005 5:42 am
Location: Fort Nelson, BC, Canada

Re: Open File larger than the amount of ram you have

Post by netmaestro »

wouldn't you be better checking the amount of free memory before trying to allocate the whole size?
Spot on, very good point. And the reason my coded solution takes a different form.
BERESHEIT
PureLeo
Enthusiast
Posts: 221
Joined: Fri Jan 29, 2010 1:05 pm
Location: Brazil

Re: Open File larger than the amount of ram you have

Post by PureLeo »

@tinman Yes, that's it...

At first I thought I was hitting the RAM limits, but you're right...
I also used "Define.q" hoping to solve the issue, but of course that didn't help :/
netmaestro
PureBasic Bullfrog
Posts: 8451
Joined: Wed Jul 06, 2005 5:42 am
Location: Fort Nelson, BC, Canada

Re: Open File larger than the amount of ram you have

Post by netmaestro »

When I started programming, if you could successfully allocate 8 KB of RAM you were on cloud nine. How things have changed - I'm still trying to work out if it's for the better or not :?

Twitter, Facebook, porn.. beam me up, Scotty :shock:
BERESHEIT
PureLeo
Enthusiast
Posts: 221
Joined: Fri Jan 29, 2010 1:05 pm
Location: Brazil

Re: Open File larger than the amount of ram you have

Post by PureLeo »

My Professor mentioned that when he was talking about the importance of pointers :D I wasn't born at that time...
El_Choni
TailBite Expert
Posts: 1007
Joined: Fri Apr 25, 2003 6:09 pm
Location: Spain

Re: Open File larger than the amount of ram you have

Post by El_Choni »

Hi,

I think maybe file mapping would be the way to go when handling very large files - in Windows, at least.

http://msdn.microsoft.com/en-us/library ... 85%29.aspx
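For completeness, a bare-bones sketch of what file mapping looks like from PureBasic via the WinAPI (Windows only; the file name is a placeholder and error handling is minimal - this assumes the usual Windows constants are available):

Code: Select all

; Map the first 16 MB of a big file read-only and peek at it.
hFile = CreateFile_("bigfile.dat", #GENERIC_READ, #FILE_SHARE_READ, 0, #OPEN_EXISTING, 0, 0)
If hFile <> #INVALID_HANDLE_VALUE
  hMap = CreateFileMapping_(hFile, 0, #PAGE_READONLY, 0, 0, 0)
  If hMap
    *view = MapViewOfFile_(hMap, #FILE_MAP_READ, 0, 0, 16*1024*1024)
    If *view
      Debug PeekB(*view)     ; the mapped view reads like ordinary memory
      UnmapViewOfFile_(*view)
    EndIf
    CloseHandle_(hMap)
  EndIf
  CloseHandle_(hFile)
EndIf

Note that the view offset (the two 0 arguments before the size) has to be a multiple of the system allocation granularity (64 KB on Windows) if you map somewhere other than the start of the file.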
El_Choni