Transfer large files via ftp
Posted: Wed Jul 18, 2007 3:10 am
by miro_igov
Hello,
I need to transfer a 12 GB file from one Linux machine to another via FTP.
I used the shell command "ftp", but it times out after 900 seconds and closes the connection, which leaves the file only partially uploaded.
How can I set a higher timeout, or no timeout at all?
Posted: Wed Jul 18, 2007 7:56 am
by Chris Corbyn
It's most likely that the remote server you're sending the file to has a timeout set in its configuration. Check that server's ftpd configuration.
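For reference, if the server happens to run vsftpd, the relevant settings in vsftpd.conf look like this (option names are vsftpd-specific; other ftpd implementations such as ProFTPD use different directives, and the values below are just illustrative):

```
# /etc/vsftpd.conf -- timeouts are given in seconds
idle_session_timeout=7200        # how long a session may sit idle
data_connection_timeout=7200     # how long a data transfer may stall
```

Of course, this only helps whoever administers the server.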
Posted: Wed Jul 18, 2007 8:14 am
by miro_igov
I have no permissions on the remote server. Is it possible to set some mode that prevents timeouts while a file transfer is in progress?
Posted: Wed Jul 18, 2007 8:57 am
by Jenk
Not if the server disallows it, no. Perhaps split the file into smaller chunks and upload those independently. (You would then need to instruct anyone who downloads it to combine them before using.)
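On the command line, splitting and rejoining can be done with the standard `split` and `cat` tools. A small sketch (the filenames and the 4 MB chunk size are just examples; for a 12 GB file you'd pick something much larger, e.g. `-b 1G`):

```shell
# Create a sample file for demonstration (stand-in for the real large file)
head -c 10485760 /dev/urandom > bigfile

# Split into 4 MB pieces named bigfile.part.aa, bigfile.part.ab, ...
split -b 4M bigfile bigfile.part.

# ...transfer each bigfile.part.* piece separately via ftp...

# On the other end, reassemble the pieces in order and verify the result
cat bigfile.part.* > bigfile.joined
cmp bigfile bigfile.joined && echo "files match"
```

The shell expands `bigfile.part.*` in lexicographic order, which matches the order `split` created the pieces in, so a plain `cat` reassembles them correctly.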
Posted: Wed Jul 18, 2007 2:36 pm
by Chris Corbyn
Jenk wrote:Not if the server disallows it, no. Perhaps split the file into smaller chunks and upload those independently. (You would then need to instruct anyone who downloads it to combine them before using.)
Or if the server runs PHP, try this:
split.php
Code: Select all
<?php
// Set the filename in $input below and make the directory writable.
// Parts of $chunk_size bytes (1 MB here) will be created.
$input = "some-file.ext";
$chunk_size = 1024 * 1024; // 1 MB

$handle = fopen($input, "rb");
$in_this_chunk = 0;
$chunk_number = 0;
$out_fp = null;

while (!feof($handle))
{
    // fread() returns "" at EOF (and false on error), not false at EOF,
    // so testing the return value against false would loop forever
    $bytes = fread($handle, 8192);
    if ($bytes === false || $bytes === "")
    {
        break;
    }
    if ($out_fp === null)
    {
        $chunk_number++;
        $part_name = $input . ".part." . $chunk_number;
        echo "Creating part " . $chunk_number . " - " . $part_name . "...<br />";
        $out_fp = fopen($part_name, "wb");
        $in_this_chunk = 0; // reset the byte count for the new part
    }
    $in_this_chunk += strlen($bytes); // count actual bytes read, not the buffer size
    fwrite($out_fp, $bytes);
    if ($in_this_chunk >= $chunk_size)
    {
        fclose($out_fp);
        $out_fp = null;
    }
}

if ($out_fp !== null)
{
    fclose($out_fp);
}
fclose($handle);
echo "Done!<br />";
join.php
Code: Select all
<?php
// Set $filename to the original filename and run this script.
$filename = "input-file.ext";

$parts = array();
$handle = opendir(".");
while (false !== $file = readdir($handle))
{
    if (!is_dir($file)
        && preg_match("~^" . preg_quote($filename, "~") . "\\.part\\.([0-9]+)\$~D", $file, $matches))
    {
        // key by part number, value is the part's own filename
        $parts[$matches[1]] = $file;
    }
}
closedir($handle);

echo "Found " . count($parts) . " parts...<br />";
$out_fp = fopen($filename, "wb");
ksort($parts, SORT_NUMERIC);
foreach ($parts as $file)
{
    echo "Merging " . $file . "...<br />";
    $in_fp = fopen($file, "rb");
    while (!feof($in_fp))
    {
        $bytes = fread($in_fp, 8192);
        if ($bytes === false || $bytes === "")
        {
            break;
        }
        fwrite($out_fp, $bytes);
    }
    fclose($in_fp);
}
fclose($out_fp);
echo "Done!<br />";
Now you can split the file into pieces on your computer, then join them back together on the server.
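Whichever way the pieces are joined, it's worth checking the reassembled file against a checksum recorded before splitting (md5sum here, but any checksum tool will do; `bigfile` is a placeholder name):

```shell
# Sample file for demonstration (stand-in for the real file)
head -c 1048576 /dev/urandom > bigfile

# On the sending machine, record a checksum before splitting
md5sum bigfile > bigfile.md5

# On the receiving machine, after joining the parts back together,
# verify the result against the recorded checksum
md5sum -c bigfile.md5
```

If any part was truncated or merged out of order, `md5sum -c` will report a mismatch.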

Posted: Thu Jul 19, 2007 4:36 am
by facets
It's also worth checking lftp. It can 'automatically' split the file up and open 5 connections, although that may flood your network.
lftp ftp.server.com -u name,password
pget -n5 source dest
hth.
Will