Problem getting image via URL

Posted: Mon Aug 10, 2009 12:52 pm
by tk5ep
Hi all,

I need to periodically download an animated image from the URL http://www2.meteox.com/radareu.php, in order to display it as an overlay on Google Maps and to split it into frames.

While it takes only a few ms to display this image in a browser, my script needs 4-5 s to download it, and the file is sometimes corrupted.
I tried copy(), fopen() + fread(), etc., but none of them gave good results. I suppose this is because the URL points to a PHP script while the received data is a GIF file.

Does anyone have an idea how to speed up and secure the download from a PHP script on my provider's server? I could not read the original PHP code to see where to get the frames BEFORE it builds the animated image.

Any help is welcome,
Patrick

Re: Problem getting image via URL

Posted: Mon Aug 10, 2009 1:10 pm
by frao_0
Have you tried cURL?

Re: Problem getting image via URL

Posted: Mon Aug 10, 2009 2:47 pm
by tk5ep
Not yet...

I tried just now, but it's even worse!
I get a timeout error after 2 s...

Here are the few lines I wrote this evening,
with $src being my source URL
and $dest being my local file:

-------------------------------------
$out = fopen($dest, 'wb');              // open the local file for writing
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_FILE, $out);   // write the response straight to $out
curl_setopt($ch, CURLOPT_HEADER, 0);    // don't write HTTP headers into the file
curl_exec($ch);
curl_close($ch);
fclose($out);
--------------------------------------
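For comparison, a hardened variant of the snippet above, with explicit timeouts, redirect handling, and error reporting (the limit values here are assumptions to tune, and the redirect option is a guess about how the remote PHP script behaves; $src and $dest as above):

```php
<?php
// Sketch: download $src into $dest with cURL, with timeouts and error checks.
// The connect/total limits below are assumptions, not values from the thread.
function fetch_image(string $src, string $dest): bool
{
    $out = fopen($dest, 'wb');
    if ($out === false) {
        return false;                                // could not open local file
    }
    $ch = curl_init($src);
    curl_setopt($ch, CURLOPT_FILE, $out);            // stream body straight to file
    curl_setopt($ch, CURLOPT_HEADER, 0);             // no headers in the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // in case the PHP URL redirects
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);    // assumed connect limit (s)
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // assumed total limit (s)
    $ok = curl_exec($ch);
    if ($ok === false) {
        echo 'cURL error: ' . curl_error($ch) . "\n"; // shows WHY it times out
    }
    curl_close($ch);
    fclose($out);
    return $ok !== false;
}
```

Printing curl_error() after a failed curl_exec() is the quickest way to see whether the 2 s timeout comes from DNS, the connection, or the transfer itself.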

I'm stuck!!

Re: Problem getting image via URL

Posted: Tue Aug 11, 2009 11:05 am
by tk5ep
Does nobody have a (good) idea??
:cry:

Re: Problem getting image via URL

Posted: Tue Aug 11, 2009 2:46 pm
by izzy4505
If it works some times and not others, I'd take a look at what your ISP is doing. Perhaps that box your site is on is quite loaded down, or maybe there isn't enough bandwidth to do it in a timely manner.

At any rate, you should be caching this image anyway. If 100 people hit your site all at once, your script shouldn't have to download that image 100 times.

One thing you can do is set up a script that does nothing but download this image, and then schedule a cron job to run it. If your ISP doesn't let you play with cron, there are a few free services out there that will hit a given URL on a schedule that you set up. Alternatively, set your script to check the timestamp on the local file before trying to download it again.
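That timestamp check can be as simple as the sketch below (the 10-minute max age is from the use case; the cache path is an assumption, and the comparison is split into a small function so it is easy to reason about):

```php
<?php
// Sketch: re-download only when the cached copy is missing or older than $maxAge.
function cache_is_stale(?int $mtime, int $now, int $maxAge): bool
{
    return $mtime === null || ($now - $mtime) > $maxAge;
}

$cache = '/tmp/radareu.gif';               // assumed local cache path
$mtime = file_exists($cache) ? filemtime($cache) : null;
if (cache_is_stale($mtime, time(), 600)) { // 600 s = 10 min
    // ...download the image here; every later visitor gets the cached copy
}
```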

Changing methods of downloading this remote image won't change much. cURL is definitely the most robust way, but if allowed, file_get_contents() should work just as well, assuming you don't need authentication or anything.
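For the record, the file_get_contents() route with an explicit timeout would look roughly like this (the 10 s default is an assumption; it needs allow_url_fopen enabled for http URLs):

```php
<?php
// Sketch: fetch $src via file_get_contents() with a stream-context timeout,
// then write the whole payload to $dest under an exclusive lock.
function fetch_with_fgc(string $src, string $dest, int $timeout = 10): bool
{
    $ctx  = stream_context_create(['http' => ['timeout' => $timeout]]);
    $data = file_get_contents($src, false, $ctx);
    if ($data === false) {
        return false;
    }
    return file_put_contents($dest, $data, LOCK_EX) !== false;
}
```

Writing the file in one file_put_contents() call with LOCK_EX also reduces the chance of a visitor reading a half-written (i.e. "corrupted") image.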

Re: Problem getting image via URL

Posted: Wed Aug 12, 2009 1:24 am
by tk5ep
Hi izzy4505,

You're right on almost everything you wrote.
In fact, my current page, http://tk5ep.free.fr/rsmap/rsmap.php, is the first version of my script. It is written in JavaScript around the Google Maps API. Only the overlay image is reloaded every 10 min by the client, straight from the URL of the weather site where the image is.

The idea for my second version is (was) that the server loads the image and the client gets it from my ISP, not from the distant server.
I can't access cron, so I wrote code that checks the timestamp of the last version of the image, and the first client who detects that it is older than 10 min triggers the download. But my ISP apparently has several servers and they don't have the same time: there is a 3 min difference between filemtime() and time()! ...
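One way to live with that offset might be to pad the staleness threshold (a sketch: the 3 min pad is the difference measured above, so the check never flips early just because the file server's clock lags):

```php
<?php
// Sketch: tolerate clock skew between the web server (time()) and the
// file server (filemtime()) by padding the 10-min threshold with the
// observed ~3 min offset before triggering a re-download.
function needs_refresh(int $mtime, int $now, int $maxAge = 600, int $skew = 180): bool
{
    // treat the file as fresh until maxAge + skew has clearly elapsed
    return ($now - $mtime) > ($maxAge + $skew);
}
```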

All that introduces more problems than the first version had... I'm still not sure it is the right way to do it.

I tried almost everything around cURL, but can't get a good result. In my case the simplest way is the most robust!
I'm now convinced that the problem comes from my ISP, which has some restrictions or priorities on outgoing traffic...
But I don't understand why cURL can't do a better job than the classical copy() or fopen() way...

Thanks for your post,
EP