
Huge data transfer

Posted: Mon Apr 07, 2008 5:23 am
by thiscatis
Hi guys,


I was not sure where to put this... so mods, please move if this is not the right place to ask!


I have to move about 8 GB of data from one FTP server to another.
I've already tried downloading the lot, but the old FTP server disconnects me every 5 minutes, so this will take forever.
I've tried FxP'ing it from FTP to FTP using FlashFXP, but it didn't work.

Is there a way to do the transfer using SSH (I have SSH access to both servers)? Does anyone have experience with this,
or know of a good tutorial/website about these procedures?


Thanks in advance,


thiscatis

Re: Huge data transfer

Posted: Mon Apr 07, 2008 5:29 am
by JayBird
How about this.

SSH into the server with all your data on it. Gzip or tar it all. Make this file accessible from the web root.

Code: Select all

tar -cf file.tar your_dir
Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.

Code: Select all

fget http://server1/file.tar
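Putting the two steps together, here's a minimal end-to-end sketch (assuming the fetch tool on server 2 is wget or similar, that /var/www/html is the web root on server 1, and that your_dir is a placeholder for the real directory):

```shell
# On server 1 (old server): bundle the data into a single archive
# and place it somewhere the web server can serve it
tar -cf /var/www/html/file.tar your_dir

# On server 2 (new server): pull the archive in one HTTP request,
# then unpack it
wget http://server1/file.tar
tar -xf file.tar
```

Note this leaves file.tar world-readable under the web root on server 1, so remember to delete it after the transfer.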

Re: Huge data transfer

Posted: Mon Apr 07, 2008 5:36 am
by Chris Corbyn
Use the scp command, which is just a "cp" over an SSH connection.

Example:

Code: Select all

scp -r username@server.tld:/path/to/file/or/directory /path/to/local/copy
If you need to copy from the local machine to the remote server, just reverse the arguments.

It will be a lot faster if you tar the files first; copying many separate files is very slow.
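Combining the two suggestions, a minimal sketch (username@server.tld and the paths are placeholders):

```shell
# On the source server: pack everything into one compressed archive
tar -czf data.tar.gz your_dir

# On the destination server: copy the single archive over SSH,
# then unpack it
scp username@server.tld:/path/to/data.tar.gz .
tar -xzf data.tar.gz
```

One big archive means one SSH session instead of thousands of per-file round trips, and gzip shrinks anything compressible on top of that.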

Re: Huge data transfer

Posted: Mon Apr 07, 2008 5:40 am
by thiscatis
JayBird wrote:How about this.

SSH into the server with all your data on it. Gzip or tar it all. Make this file accessible from the web root.

Code: Select all

tar -cf file.tar your_dir
Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.

Code: Select all

fget http://server1/file.tar
I think wget works better ;).

Thanks to both of you!
I'm going to have a try!

Re: Huge data transfer

Posted: Mon Apr 07, 2008 9:34 am
by thiscatis
USING wget: worked for smaller files.

USING ssh user@host -C "path" | dd of=file > error, file too big

USING scp -r user@host:path/file newpath > permission denied on file (but permissions are 0777)

:s

// Is there a way to tar it and pipe it straight to the other host, without writing the file to the local disk first?
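That should be possible: tar can write the archive to stdout, and ssh can carry that stream to a tar reading from stdin on the far side, so no archive file ever touches either disk. A sketch, assuming SSH access from the source server to the destination (user@newhost, your_dir, and /destination/path are placeholders):

```shell
# On the source server: stream the archive to stdout ("-f -")
# and pipe it into an untar process running on the destination host
tar -cf - your_dir | ssh user@newhost 'tar -xf - -C /destination/path'
```

Adding -z to both tar invocations compresses the stream in transit, which usually helps on a slow link.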

Re: Huge data transfer

Posted: Mon Apr 07, 2008 12:47 pm
by JayBird
thiscatis wrote:
I think wget works better ;).
Whoops :banghead: