Huge data transfer

Ye' old general discussion board. Basically, for everything that isn't covered elsewhere. Come here to shoot the breeze, shoot your mouth off, or whatever suits your fancy.
This forum is not for asking programming related questions.

Moderator: General Moderators

thiscatis
Forum Contributor
Posts: 434
Joined: Thu Jul 20, 2006 11:00 am

Huge data transfer

Post by thiscatis »

Hi guys,


I was not sure where to put this... so mods, please move if this is not the right place to ask!


I have to move about 8 GB of data from one FTP server to another.
I've already tried downloading the lot, but the old FTP server disconnects me every 5 minutes, so this will take forever.
I've tried FXP'ing it from FTP to FTP using FlashFXP, but it didn't work.

Is there a way to do the transfer using SSH (I have SSH access to both servers)? Does anyone have experience with this,
or know of a good tutorial/website about these procedures?


Thanks in advance,


thiscatis
JayBird
Admin
Posts: 4524
Joined: Wed Aug 13, 2003 7:02 am
Location: York, UK

Re: Huge data transfer

Post by JayBird »

How about this:

SSH into the server that has all your data on it. Gzip or tar it all, then make the resulting file accessible from the web root.

Code:

tar -cf file.tar your_dir

Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.

Code:

fget http://server1/file.tar
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Re: Huge data transfer

Post by Chris Corbyn »

Use the scp command, which is just "cp" over an SSH connection.

Example:

Code:

scp -r username@server.tld:/path/to/file/or/directory /path/to/local/copy

If you need to copy from the local machine to the remote server, just reverse the arguments.

It will be a lot faster if you tar the files first. Copying separate files is very slow.
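(A note for readers: the advice above can be sketched as follows. The host name and paths are placeholders carried over from the thread, not real; this bundles the directory into one compressed archive first, then copies it in a single scp call.)

```shell
# Bundle everything into one compressed archive first;
# one large file transfers much faster than thousands of small ones.
tar -czf data.tar.gz your_dir

# Push the archive to the remote server (reverse the two
# arguments to copy in the other direction instead).
scp data.tar.gz username@server.tld:/path/to/destination/
```

You can sanity-check the archive before sending it with `tar -tzf data.tar.gz`.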
thiscatis
Forum Contributor
Posts: 434
Joined: Thu Jul 20, 2006 11:00 am

Re: Huge data transfer

Post by thiscatis »

JayBird wrote:How about this.

SSH into the server that has all your data on it. Gzip or tar it all, then make the resulting file accessible from the web root.

Code:

tar -cf file.tar your_dir

Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.

Code:

fget http://server1/file.tar

I think wget works better ;).

Thanks to both of you!
I'm going to have a try!
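(A note for readers: the tar-then-download route the thread settles on can be sketched like this. The server name, web-root path, and directory name are placeholders, not real; wget's `-c` flag resumes a partial download, which helps with the 5-minute disconnects mentioned at the start.)

```shell
# On server 1: pack the data and drop the archive under the web root
# (/var/www/html is an assumed web root; adjust to your setup).
tar -cf /var/www/html/file.tar your_dir

# On server 2: fetch it; -c continues an interrupted download
# instead of starting over after a disconnect.
wget -c http://server1/file.tar
```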
thiscatis
Forum Contributor
Posts: 434
Joined: Thu Jul 20, 2006 11:00 am

Re: Huge data transfer

Post by thiscatis »

Using wget: worked for smaller files.

Using ssh user@host -C "path" | dd of=file: error, file too big.

Using scp -r user@host:path/file newpath: permission denied on the file (but its permissions are 0777).

:s

// Is there a way to tar it and pipe it to the other host immediately, without writing the file to the local disk?
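(A note for readers: one common way to do what's asked above is to let tar write to stdout and pipe the stream over ssh, so nothing is written locally. The host and destination path here are placeholders.)

```shell
# "-" tells tar to write the archive to stdout; ssh pipes that
# stream to the remote side, where a second tar unpacks it from
# stdin into /destination. No intermediate file is created.
tar -cf - your_dir | ssh user@server2 "tar -xf - -C /destination"
```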
JayBird
Admin
Posts: 4524
Joined: Wed Aug 13, 2003 7:02 am
Location: York, UK

Re: Huge data transfer

Post by JayBird »

thiscatis wrote:
I think wget works better ;).
Whoops :banghead: