Problem: Large Files, forcing download

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

filmixt
Forum Newbie
Posts: 2
Joined: Wed Jun 10, 2009 6:45 am

Problem: Large Files, forcing download

Post by filmixt »

Hi everyone,
This is the first time I've posted here; I hope you can help me with a hard one.

I'm working on a project where I have to force the download of a 700MB-1GB file and stream its content. Problems? A lot. Why? You'll see.. :D

If we do this:

Code: Select all

// Remote file URL
$f = $_GET["f"];

// Get the remote file's size from its Content-Length header
function urlfilesize($url) {
    $headers = array_change_key_case(get_headers($url, 1), CASE_LOWER);
    return $headers['content-length'];
}

// Getting the size ("peso" = size)
$peso = urlfilesize($f);

// Headers
header("Content-Type: application/download");
header("Transfer-Encoding: chunked");
header("Content-Disposition: attachment; filename=\"$f\"");
header("Content-Length: " . $peso);
readfile($f);
Once the download reaches 2MB, it stops. Why? PHP hangs.
I've spent four days looking for an answer or a fix, and I've failed. Imagine how I feel. :)
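For context, a frequent cause of a download stalling after a fixed number of megabytes is output buffering or on-the-fly compression sitting between PHP and the client. This is only a sketch of a pre-flight check one could run before sending the headers above; the zlib/Apache settings are assumptions about the hosting setup, not a confirmed diagnosis:

```php
<?php
// Pre-flight before streaming a large download: turn off on-the-fly
// compression and drop any active output buffers, so each chunk goes
// straight to the client instead of accumulating server-side.
@ini_set('zlib.output_compression', 'Off');
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1'); // only available under mod_php
}
while (ob_get_level() > 0) {
    ob_end_clean(); // discard buffers; use ob_end_flush() to keep their bytes
}
set_time_limit(0); // no script timeout for a 1GB transfer
```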
That script is very simple, so let's move on to a slightly harder one:

Code: Select all

// File URL ("archivo" = file)
$f = $_GET["f"];

// Get the remote file's size ("peso" = size)
function urlfilesize($url) {
    $headers = array_change_key_case(get_headers($url, 1), CASE_LOWER);
    return $headers['content-length'];
}

$peso = urlfilesize($f);

// Headers -- use only the last path segment as the download name
$name = explode("/", $f);
$nombre = $name[count($name) - 1];
header('Content-Description: File Transfer');
header("Content-Type: application/octet-stream");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Content-Disposition: attachment; filename=\"$nombre\"");
header("Content-Length: " . $peso);
set_time_limit(0);

function readfile_chunked($filename, $retbytes = true)
{
    $chunksize = 1 * (1024 * 1024); // bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return number of bytes delivered, like readfile() does
    }
    return $status;
}

readfile_chunked($f);
Buffered? Same result: it hangs at 2MB.
:D Now what?
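For what it's worth, the usual next step with a chunked reader like the one above is to flush after every chunk: without an explicit flush(), many SAPIs hold output in a buffer, which can look exactly like a hang partway through the transfer. A minimal sketch, assuming the input is a path (or a URL readable via fopen with allow_url_fopen on):

```php
<?php
// Chunked streaming with an explicit flush per chunk, so each block is
// pushed to the client as it is read instead of piling up in a buffer.
function stream_file_chunked($filename, $chunksize = 1048576)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $sent = 0;
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        if ($buffer === false) {
            break;
        }
        echo $buffer;
        $sent += strlen($buffer);
        flush(); // push this chunk out before reading the next one
    }
    fclose($handle);
    return $sent; // bytes delivered, like readfile()
}
```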

I tried:

Code: Select all

header("Content-Type: application/download");
header("Transfer-Encoding: chunked");
header("Content-Disposition: attachment; filename=\"$f\"");
header("Content-Length: " . $peso);
header("Location: $f");
Funny? Well.. desperate. It doesn't work; it just redirects. ;-)

I tried installing a lighttpd download module, but it only works with LOCAL files, haha, so it's no use here...
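Since $f here is a remote URL, one way around modules that only accept local paths is to proxy the remote file yourself with cURL, echoing each received block straight to the client so the 700MB+ body is never held in memory. This is a hypothetical sketch, not something tested against your host:

```php
<?php
// Relay a remote file to the client block by block via cURL.
function proxy_remote_file($url)
{
    // Use only the path's last segment as the suggested filename.
    $name = basename(parse_url($url, PHP_URL_PATH));
    header('Content-Type: application/octet-stream');
    header("Content-Disposition: attachment; filename=\"$name\"");

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    // Called for every block cURL receives; echo and flush it immediately.
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) {
        echo $data;
        flush();
        return strlen($data); // tell cURL the whole block was consumed
    });
    $ok = curl_exec($ch);
    curl_close($ch);
    return $ok;
}
```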

So, in conclusion, I'm f**ked.
I hope there's a master here who can show us the way, or at least someone with an idea, because I can't take much more...

Thanks everyone for the time.
Last edited by Benjamin on Wed Jun 10, 2009 11:45 am, edited 1 time in total.
Reason: Changed code type from text to php.
mikemike
Forum Contributor
Posts: 355
Joined: Sun May 24, 2009 5:37 pm
Location: Chester, UK

Re: Problem: Large Files, forcing download

Post by mikemike »

I don't have time to go through your code properly, but offhand it sounds like it might be a configuration issue. Do you have error_reporting set high and display_errors on? Make sure memory_limit is high enough, and that the other related config values are sufficient.
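To check that quickly, a small diagnostic like the following (just a sketch) prints the values that most often cap a long-running download, as PHP actually sees them for that SAPI:

```php
<?php
// Turn error output on, then dump the php.ini values that most often
// interfere with large, long-running downloads.
error_reporting(E_ALL);
ini_set('display_errors', '1');

$keys = array(
    'memory_limit',
    'max_execution_time',
    'output_buffering',
    'zlib.output_compression',
);
foreach ($keys as $key) {
    printf("%-24s => %s\n", $key, var_export(ini_get($key), true));
}
```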
filmixt
Forum Newbie
Posts: 2
Joined: Wed Jun 10, 2009 6:45 am

Re: Problem: Large Files, forcing download

Post by filmixt »

Well, I can tell you I've learned a lot about configuration from this problem, and I can assure you it's all set up with no limits.

I'll show you:
memory_limit = -1
max_execution_time = 5400
max_input_time = 5400

"The maximum memory a cPanel process can use before it is killed off (in megabytes). Values less than 128 megabytes cannot be specified. A value of "0" will disable the memory limits."
= SET TO 0

I'm using:
set_time_limit(0);
just in case.

It's not about config; it's a bug, and I need other ideas or another method, outside PHP or Apache, anything.