How do I scp extremely large files



I need to copy extremely large files (30-100 GB) from a remote server to
the machine where my code will be running. I already have code in place
that uses an SFTP connection to scan a directory on the remote server,
watching for new files to copy. My question is: how do I read these
large files in chunks and write them out in chunks? I obviously cannot
hold all the bytes in memory. Any help or pseudo code is greatly
appreciated!
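
To make the question concrete, here is a rough sketch of the loop I'm
imagining, in Python with paramiko (the language and library are just
assumptions for illustration; the host, credentials, and paths are
placeholders):

    import paramiko

    CHUNK_SIZE = 1024 * 1024  # read/write 1 MiB at a time; tune as needed

    def copy_in_chunks(host, username, password, remote_path, local_path):
        # Open an SSH transport and an SFTP session on top of it
        # (host/credentials here are placeholders, not a real setup)
        transport = paramiko.Transport((host, 22))
        transport.connect(username=username, password=password)
        sftp = paramiko.SFTPClient.from_transport(transport)
        try:
            with sftp.open(remote_path, "rb") as src, \
                 open(local_path, "wb") as dst:
                while True:
                    chunk = src.read(CHUNK_SIZE)  # only one chunk in memory
                    if not chunk:                 # empty read means EOF
                        break
                    dst.write(chunk)
        finally:
            sftp.close()
            transport.close()

The idea is that only CHUNK_SIZE bytes are ever held in memory at once,
so the same loop works for a 30 GB file as for a 30 MB one. Is this the
right approach, or is there something better for files this size?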


