Problem with remote reading and loading...

Posted: Thu Oct 21, 2004 2:19 pm
by sturmovik
Hi all. This is my problem.

I'm trying to remotely read a file and write it to my server.
I've managed to do this, but, I have a slight problem:
When I have a link like this one:

http://www.bfpirates.com/links.php?action=go&link_id=34

I don't know the actual name of the file I want to read, so I can't read it remotely with fopen().
When I actually click the link, everything works just fine and I can download the file normally, but when I try to open it with fopen() I get an error saying:

CGI Error: The specified CGI application misbehaved by not returning a complete set of HTTP headers.

Is there a way to remotely read files placed at links like this one (they are probably redirected somehow)?

I think it should be possible, because the web browser can get to the direct link, so I should be able to get to it too.

Please help!

I have one more problem:

I use this code:

Code: Select all

function tool_copy($sourceFile, $destinationDirectory, $newFile, $cSpeed = 65536)
{
	$cpdata = ""; // buffer for the whole file

	// open the remote source once, for binary reading
	$fp = @fopen($sourceFile, "rb");
	if (!$fp) {
		print("<br />Fatal error: requested file ($sourceFile) is NOT reachable!<br />");
		print("Please check that the file path is correct and that you are granted full access to the file!<br />");
		print("<a href=\"remoteCopy.php\">Try again!</a>");
		return;
	}

	// read in chunks of $cSpeed bytes until EOF
	while (!feof($fp)) {
		$data = fread($fp, $cSpeed);
		if (strlen($data) == 0) {
			break;
		}
		$cpdata .= $data;
	}
	fclose($fp);

	// open the destination once, for binary writing
	$fp = @fopen($destinationDirectory . $newFile, "wb");
	if (!$fp) {
		print("<br />Fatal error: destination path ($destinationDirectory$newFile) is NOT accessible!<br />");
		print("Please check that the path is correct and that you are granted full access to it!<br />");
		print("<a href=\"remoteCopy.php\">Try again!</a>");
		return;
	}

	@fwrite($fp, $cpdata);
	fclose($fp);
	print("Requested file copied correctly!<br />");
	print("<a href=\"remoteCopy.php\">Try the next one?</a>");
}
But on one server I have a problem with the script execution time:

Fatal error: Maximum execution time of 30 seconds exceeded in .....\remoteCopy.php on line 42

and on the other I get an error that says:

Fatal error: Allowed memory size of 12582912 bytes exhausted (tried to allocate 65537 bytes) in /home/.../sas-ukclan.co.uk/user/htdocs/remoteCopy.php on line 42


When the 1st error occurred (the timeout), I tried to copy http://www.nuspojava.com/pvt/packet.zip

When the 2nd error occurred, I tried to copy the http://www.bfpirates.com/links.php?action=go&link_id=34 file.

Oddly, http://www.nuspojava.com/pvt/packet.zip works just fine on the second server, the one that reported the second error on the bfpirates... file.


Please help!
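For what it's worth, both errors point at the same line: the loop that accumulates the whole file into $cpdata. A chunked variant that writes each block straight to the destination never holds more than one chunk in memory, and calling set_time_limit() per chunk resets the timeout on hosts that allow it (it is ignored in safe mode). A rough sketch, not the original function; the function name and parameters are illustrative:

```php
<?php
// Chunked-copy sketch: stream from source to destination without buffering
// the whole file (avoids the memory_limit error). set_time_limit() resets
// the execution clock per chunk where the host permits it (no-op in safe mode).
function stream_copy($src, $dst, $chunk = 65536)
{
    $in  = @fopen($src, 'rb');
    $out = $in ? @fopen($dst, 'wb') : false;
    if (!$in || !$out) {
        if ($in) fclose($in);
        return false;
    }

    while (!feof($in)) {
        $data = fread($in, $chunk);
        if (strlen($data) == 0) break;
        fwrite($out, $data);
        @set_time_limit(30); // ignored in safe mode; harmless otherwise
    }

    fclose($in);
    fclose($out);
    return true;
}
?>
```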

Posted: Thu Oct 21, 2004 2:39 pm
by Weirdan
[php_man]curl[/php_man] might be a nice alternative to 'remote fopen'.

Posted: Thu Oct 21, 2004 3:05 pm
by sturmovik
Well, there is no separate 'remote fopen' function... it's just a plain fopen() for binary file reading, and that should work normally...

I would think about using curl, but that would require server modifications I don't have access to. I need a script that can do the file downloading all by itself, between two servers...

Posted: Thu Oct 21, 2004 4:40 pm
by rehfeld
I think your code could be simplified a bit. I don't expect this to solve your problem with that URL, but it might. There seem to be a lot of fopen() calls, many more than needed. If nothing else, this will be faster and use fewer resources. And if you're using PHP 5, scrap the writeit() function and use file_put_contents().

the file_*_contents() functions are more efficient than using fopen()

Code: Select all

<?php

function writeit($filename, $data)
{
    $success = false;
    $fp = @fopen($filename, 'w');

    if ($fp) {
        if (@fwrite($fp, $data)) $success = true;
        fclose($fp);
    }

    return $success;
}

// $url and $filename are assumed to have been set already
$data = @file_get_contents($url);

if (!$data) {
    echo 'url no work';
} else {
    if (writeit($filename, $data)) {
        echo 'saved';
    } else {
        echo 'could not save';
    }
}

?>

Posted: Thu Oct 21, 2004 4:51 pm
by rehfeld
I just actually checked the link :oops: and saw it's actually trying to feed the file from

ftp://82.129.40.118/figh7club/battlefie ... bfpirates/

so yeah, it definitely won't work.

Doesn't PEAR have some alternative to curl? I thought it did. You might want to look into it.
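If PEAR is an option, its HTTP_Request package runs on PHP 4 and can stand in for curl on plain HTTP fetches. A sketch, assuming the package is installed and on the include path (the URL is just an example); note it still can't follow a redirect out to an ftp:// address:

```php
<?php
// Sketch using PEAR's HTTP_Request package (PHP 4 compatible).
// Assumes the package is installed; example URL only.
require_once 'HTTP/Request.php';

$req = new HTTP_Request('http://www.example.com/file.zip',
                        array('allowRedirects' => true)); // follow HTTP redirects

$result = $req->sendRequest();

if (PEAR::isError($result)) {
    echo 'Request failed: ' . $result->getMessage();
} else {
    $data = $req->getResponseBody();
    // ... write $data to a local file with fopen()/fwrite() ...
}
?>
```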

Posted: Thu Oct 21, 2004 5:19 pm
by sturmovik
Well, this script must work on PHP4 too... :?
rehfeld wrote:I just actually checked the link :oops: and saw it's actually trying to feed the file from

ftp://82.129.40.118/figh7club/battlefie ... bfpirates/

so yeah, it definitely won't work.

Doesn't PEAR have some alternative to curl? I thought it did. You might want to look into it.
Why wouldn't that work? Please explain if you can... I'm a bit of a noob in PHP, although I've been programming for a long time...

Posted: Thu Oct 21, 2004 5:53 pm
by rehfeld
Well, the code I posted will work on PHP 4. Try getting an HTML page or an image, it will work, I promise. Try getting http://w3c.org

The problem is that the server at the first URL is sending a header informing the browser that the file it wants is at the ftp:// URL.

fopen(), file_get_contents(), etc. don't interpret those redirect headers, so they never follow the bounce over to the ftp:// location.

You could use fsockopen(), but be prepared to do some coding...
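Doing it with fsockopen() means speaking HTTP by hand: send the GET, then scan the response headers for the Location line the server sends back. A rough PHP 4 sketch, with the host and path taken from the URL in question:

```php
<?php
// Sketch: issue the HTTP request manually with fsockopen() and pull out
// the Location header to discover where the redirect points.
$host = 'www.bfpirates.com';
$path = '/links.php?action=go&link_id=34';

$fp = @fsockopen($host, 80, $errno, $errstr, 30);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}

fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n");

$location = '';
while (!feof($fp)) {
    $line = trim(fgets($fp, 1024));
    if ($line == '') break; // blank line ends the headers
    if (strtolower(substr($line, 0, 9)) == 'location:') {
        $location = trim(substr($line, 9));
    }
}
fclose($fp);

echo $location; // here it would be the ftp:// address the link redirects to
?>
```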

Try searching hotscripts.com or similar; I'm sure there are pre-written scripts that do what you want. At the very least you could get an idea of how it's done.