Transfer large files via ftp


miro_igov
Forum Contributor
Posts: 485
Joined: Fri Mar 31, 2006 5:06 am
Location: Bulgaria

Transfer large files via ftp

Post by miro_igov »

Hello,

I need to transfer a 12 GB file from one Linux machine to another via FTP.

I used the shell command "ftp", but it times out after 900 seconds and closes the connection, leaving the file only partially uploaded.

How can I set a higher timeout, or disable the timeout entirely?
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Post by Chris Corbyn »

Most likely the remote server you're sending the file to has a timeout set in its configuration. Check that server's ftpd configuration.
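For what it's worth, if the server happened to run vsftpd (an assumption on my part; other FTP daemons use different directive names), the timeouts in question live in /etc/vsftpd.conf, something like:

```
# /etc/vsftpd.conf (hypothetical excerpt)
# seconds an idle control connection may sit before being dropped
idle_session_timeout=7200
# seconds a stalled data transfer may sit before being aborted
data_connection_timeout=7200
```

These only apply server-side, so whoever administers the remote machine would have to raise them.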
miro_igov
Forum Contributor
Posts: 485
Joined: Fri Mar 31, 2006 5:06 am
Location: Bulgaria

Post by miro_igov »

I have no permissions on the remote server. Is there no way to set a mode on my side that prevents timeouts while a file transfer is in progress?
Jenk
DevNet Master
Posts: 3587
Joined: Mon Sep 19, 2005 6:24 am
Location: London

Post by Jenk »

Not if the server disallows it, no. Perhaps split the file into smaller chunks and upload those independently. (You would then need to instruct anyone who downloads it to combine them before using.)
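As a sketch of that approach, coreutils' split can cut the file into fixed-size pieces and cat can reassemble them on the far end (the filenames here are just illustrative):

```shell
# Cut the large file into 1 GB pieces: bigfile.iso.part.aa, .ab, ...
split -b 1G bigfile.iso bigfile.iso.part.

# ...upload each bigfile.iso.part.* via ftp...

# On the receiving machine, reassemble in lexical order:
cat bigfile.iso.part.* > bigfile.iso
```

split names the pieces in lexical order by default, so a plain wildcard with cat restores them in the right sequence; comparing checksums (e.g. md5sum) on both ends is a cheap sanity check.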
Chris Corbyn
Breakbeat Nuttzer
Posts: 13098
Joined: Wed Mar 24, 2004 7:57 am
Location: Melbourne, Australia

Post by Chris Corbyn »

Jenk wrote:Not if the server disallows it, no. Perhaps split the file into smaller chunks and upload those independently. (You would then need to instruct anyone who downloads it to combine them before using.)
Or if the server runs PHP, try this:

split.php

Code:

<?php

// Just set the filename in $input below and make the directory
// writable; chunk files named "$input.part.N", 1 MB each, will be
// created alongside it.

$input = "some-file.ext";

$chunk_size = 1024 * 1024; // 1 MB
$handle = fopen($input, "rb");
$in_this_chunk = 0;
$chunk_number = 0;
$out_fp = null;
while (!feof($handle))
{
  $bytes = fread($handle, 8192);
  if ($bytes === false || $bytes === "") // fread() returns "" at EOF, not false
  {
    break;
  }
  if ($out_fp === null)
  {
    $chunk_number++;
    $part_name = $input . ".part." . $chunk_number;
    echo "Creating part " . $chunk_number . " - " . $part_name . "...<br />";
    $out_fp = fopen($part_name, "wb");
  }
  $in_this_chunk += strlen($bytes);
  fwrite($out_fp, $bytes);
  if ($in_this_chunk >= $chunk_size)
  {
    fclose($out_fp);
    $out_fp = null;
    $in_this_chunk = 0; // start counting afresh for the next chunk
  }
}
if ($out_fp !== null)
{
  fclose($out_fp);
}
fclose($handle);
echo "Done!<br />";
join.php

Code:

<?php

// Just set $filename to the original filename and run this script;
// it finds the "$filename.part.N" chunks in the current directory
// and merges them back into one file.
$filename = "input-file.ext";

$parts = array();

$handle = opendir(".");
while (false !== $file = readdir($handle))
{
  if (!is_dir($file) && preg_match("~^" . preg_quote($filename, "~") . "\\.part\\.([0-9]+)\$~D", $file, $matches))
  {
    $parts[$matches[1]] = $file; // key by part number, value is the part's filename
  }
}
closedir($handle);

echo "Found " . count($parts) . " parts...<br />";

$out_fp = fopen($filename, "wb");
ksort($parts, SORT_NUMERIC);
foreach ($parts as $file)
{
  echo "Merging " . $file . "...<br />";
  $in_fp = fopen($file, "rb");
  while (!feof($in_fp))
  {
    $bytes = fread($in_fp, 8192);
    if ($bytes === false || $bytes === "") // fread() returns "" at EOF, not false
    {
      break;
    }
    fwrite($out_fp, $bytes);
  }
  fclose($in_fp);
}
fclose($out_fp);
echo "Done!<br />";
Now you can split the file into pieces on your computer, then join it all back up again on the server ;)
facets
Forum Contributor
Posts: 273
Joined: Wed Apr 13, 2005 1:53 am
Location: Detroit

Post by facets »

Also worth checking out lftp: it can "automatically" split the transfer and open 5 parallel connections, although that may flood your network. (Note that pget, as shown below, is for downloads; for an upload you would use put.)

lftp ftp.server.com -u name,password
pget -n5 source dest

hth.
Will