Remote files that are too big (?) do not load correctly

Posted: Tue Aug 17, 2004 4:54 am
by hstr
Problem: I want to include HTML generated by another web application inside a PHP page. I do it like this:
<?php
// Build the remote URL from this page's query string and
// stream the response straight to the browser.
$url = "http://foo.bar?";
$url .= $_SERVER["QUERY_STRING"];
@readfile($url);
?>

Depending on the size of the returned HTML (i.e. depending on the QUERY_STRING), I encounter three problems:

1. The $url is fetched many times (up to 20), as if the page were reloading automatically.
2. Sometimes the browser finally reports that the whole PHP page (!) cannot be found, but with another QUERY_STRING (e.g. an empty one) the PHP page is there again.
3. The returned HTML is intermingled with strange characters such as ?0M? which definitely do not come from the web application at foo.bar. They appear at different positions each time.

N.B.: With small results (up to 4K) everything works perfectly: no automatic reload, no "page not found", no strange characters.

Has anyone ever encountered this or a similar problem?

Thanks in advance

Posted: Tue Aug 17, 2004 10:22 am
by feyd
try using [php_man]curl[/php_man] or [php_man]file_get_contents[/php_man] instead.. [php_man]readfile[/php_man] sends the data directly to the browser..
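A minimal sketch of the cURL approach feyd suggests. The helper name fetch_remote, the timeout value, and the error handling are illustrative assumptions, not from this thread; unlike readfile, this buffers the whole response into a string before you print it:

```php
<?php
// Fetch a remote URL into a string with cURL.
// fetch_remote() is an illustrative helper name.
function fetch_remote($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // give up after 30 seconds
    $body = curl_exec($ch);
    if ($body === false) {
        curl_close($ch);
        return false; // caller decides how to report the failure
    }
    curl_close($ch);
    return $body;
}
?>
```

Usage would then be something like echo fetch_remote("http://foo.bar?" . $_SERVER["QUERY_STRING"]);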

file_get_contents doesn't work properly either

Posted: Wed Aug 18, 2004 1:42 am
by hstr
I have the same problems with file_get_contents.
Does it matter that there is no Content-Length in the HTTP headers sent by the remote web application?

Posted: Wed Aug 18, 2004 1:45 am
by feyd
I've never had a problem loading data from a remote server that didn't send Content-Length...
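One thing worth trying, purely as a guess not confirmed in this thread: a server that sends no Content-Length may be replying with HTTP/1.1 chunked transfer encoding, and raw chunk-size markers in the body could look like the stray ?0M? characters. Forcing the request down to HTTP/1.0 rules chunking out. The helper name fetch_http10 is illustrative; the 'protocol_version' stream-context option is what does the work:

```php
<?php
// Fetch over plain HTTP/1.0 so the server cannot reply with
// chunked transfer encoding. fetch_http10() is an illustrative name.
function fetch_http10($url) {
    $context = stream_context_create(array(
        'http' => array(
            'protocol_version' => 1.0, // request HTTP/1.0 instead of 1.1
            'timeout'          => 30,  // seconds
        ),
    ));
    return file_get_contents($url, false, $context);
}
?>
```

If the strange characters disappear with this context, that would point at chunked encoding being passed through unparsed somewhere along the way.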