Hi guys,
I was not sure where to put this... so mods, please move if this is not the right place to ask!
I have to move about 8 GB of data from one FTP server to another.
I've already tried downloading the lot, but the old FTP server disconnects me every 5 minutes, so this will take forever.
I've tried FXP'ing it from FTP to FTP using FlashFXP, but it didn't work.
Is there a way to do the transfer using SSH (I have SSH access to both servers)? Does anyone have experience with this,
or know of a good tutorial/website about these procedures?
Thanks in advance,
thiscatis
Huge data transfer
Re: Huge data transfer
How about this.
SSH into the server with all your data on it. Gzip or tar it all. Make this file accessible from the web root.
Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.
Code: Select all
tar -cf file.tar your_dir
Code: Select all
fget http://server1/file.tar
- Chris Corbyn
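As a sketch, the suggestion above works out to something like the following two sessions. The directory names and the web-root path are placeholders, not from the thread, and since fget isn't installed everywhere, wget is used as the fetch tool here:

```shell
# On server 1 (the old host): bundle and compress the data,
# then place the archive somewhere the web server can see it.
tar -czf /var/www/html/file.tar.gz your_dir

# On server 2 (the new host): pull the archive over HTTP and unpack it.
wget http://server1/file.tar.gz
tar -xzf file.tar.gz
```

Gzipping (-z) helps here because the transfer then moves one compressed stream instead of 8 GB of raw files.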
Re: Huge data transfer
Use the scp command, which is just "cp" over an SSH connection.
Example:
Code: Select all
scp -r username@server.tld:/path/to/file/or/directory /path/to/local/copy
If you need to copy from the local machine to the remote server, just reverse the arguments.
It will be a lot faster if you tar the files first. Copying separate files is very slow.
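Following that advice (tar first, then copy), a session might look like this. The hostname and paths are placeholders; scp's -C flag additionally compresses the data in transit:

```shell
# Bundle the directory into one archive on the remote host first.
ssh username@server.tld "tar -czf /tmp/data.tar.gz /path/to/data"

# Copy the single archive down (-C compresses the SSH stream),
# then unpack it locally.
scp -C username@server.tld:/tmp/data.tar.gz .
tar -xzf data.tar.gz
```

Copying one archive avoids the per-file overhead that makes scp -r on thousands of small files so slow.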
Re: Huge data transfer
JayBird wrote:How about this.
SSH into the server with all your data on it. Gzip or tar it all. Make this file accessible from the web root.
Now SSH into your new server and do an fget. This should download the file from server 1 to server 2.
Code: Select all
tar -cf file.tar your_dir
Code: Select all
fget http://server1/file.tar
I think wget works better.
Thanks to both of you!
I'm going to have a try!
Re: Huge data transfer
USING wget: worked for smaller files.
USING ssh user@host -C "path" | dd of=file: error, file too big.
USING scp -r user@host:path/file newpath: permission denied on the file (but its permissions are 0777).
:s
// Is there a way to tar it and pipe it to the other host immediately, without writing the file to the local disk?
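One common way to do exactly that is to let tar write to stdout and pipe the stream through SSH, so nothing is ever written to the middle machine's disk. The hostnames and paths below are placeholders:

```shell
# Relay through the local machine: the first ssh streams a tar archive
# from the old host to stdout ("-" means write to stdout), and the
# second ssh feeds it straight into tar on the new host.
ssh user@oldhost "tar -cf - -C /path/to data" | ssh user@newhost "tar -xf - -C /destination"

# Or, if run directly on the old host, skip the relay entirely:
tar -cf - -C /path/to data | ssh user@newhost "tar -xf - -C /destination"
```

Because the archive only ever exists as a stream, this also sidesteps any "file too big" limits on intermediate storage.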
Re: Huge data transfer
thiscatis wrote:I think wget works better.
Whoops.