Problem getting image via URL

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

tk5ep
Forum Newbie
Posts: 4
Joined: Mon Aug 10, 2009 12:42 pm

Problem getting image via URL

Post by tk5ep »

Hi all,

I need to periodically download an animated image from the URL http://www2.meteox.com/radareu.php in order to display it as an overlay in Google Maps and to split it into frames.

While a browser displays this image in only a few ms, my script takes 4-5 s to download it, and sometimes the file is corrupted.
I tried the COPY and FOPEN + FREAD approaches, etc., but none gave me good results. I suppose it is because the URL is a PHP script while the received data is a GIF file.

Does anyone have an idea how to speed up and secure the download from a PHP script on my provider's server? I could not read the original PHP code to see where to get the frames BEFORE it builds the animated image.

Any help is welcome,
Patrick
frao_0
Forum Commoner
Posts: 27
Joined: Sat Aug 08, 2009 3:52 am
Location: Toulouse, France

Re: Problem getting image via URL

Post by frao_0 »

Have you tried cURL?
tk5ep

Re: Problem getting image via URL

Post by tk5ep »

Not yet...

I tried just now, but it's even worse!
I get a timeout error after 2 s...

Here are the few lines I wrote this evening,
with $src being my source URL
and $dest my local file:

-------------------------------------
$out = fopen($dest, 'wb');
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_FILE, $out);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($out);
--------------------------------------
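A timeout after 2 s usually points at connection limits rather than the code itself, but the snippet above also writes whatever it got even when curl_exec() fails, which can leave a corrupted file. A minimal sketch with explicit timeouts and error checks (the function name and the timeout values are assumptions to tune for your host):

```php
<?php
// Sketch: fetch a remote image to a local file with explicit timeouts
// and error checks. fetch_image(), and the 10 s / 30 s limits, are
// illustrative choices, not values from the original post.
function fetch_image(string $src, string $dest): bool
{
    $out = fopen($dest, 'wb');
    if ($out === false) {
        return false;
    }
    $ch = curl_init($src);
    curl_setopt($ch, CURLOPT_FILE, $out);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // the .php URL may redirect to the GIF
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);   // seconds allowed to connect
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // overall transfer limit in seconds
    $ok = curl_exec($ch);
    if ($ok === false) {
        // Surface the reason instead of silently keeping a truncated file.
        error_log('cURL error: ' . curl_error($ch));
    }
    curl_close($ch);
    fclose($out);
    return $ok !== false;
}
```

Checking the return value of curl_exec() is what tells you whether the slow downloads are timeouts, DNS failures, or something else.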

I'm stuck!!
tk5ep

Re: Problem getting image via URL

Post by tk5ep »

Does nobody have a (good) idea??
:cry:
izzy4505
Forum Newbie
Posts: 16
Joined: Tue Aug 11, 2009 1:38 pm

Re: Problem getting image via URL

Post by izzy4505 »

If it works sometimes and not others, I'd take a look at what your ISP is doing. Perhaps the box your site is on is quite loaded down, or maybe there isn't enough bandwidth to do it in a timely manner.

At any rate, you should be caching this image anyway. If 100 people hit your site all at once, your script shouldn't have to download that image 100 times.

One thing you can do is set up a script that does nothing but download this image, and then schedule a cron job to run it. If your ISP doesn't let you play with cron, there are a few free services out there that will hit a given URL on a schedule that you set up. Alternatively, set your script to check the timestamp on the local file before trying to download it again.

Changing methods of downloading this remote image won't change much. cURL is definitely the most robust way, but if allowed, file_get_contents() should work just as well, assuming you don't need authentication or anything.
tk5ep

Re: Problem getting image via URL

Post by tk5ep »

Hi izzy4505,

You're right about almost everything you wrote.
In fact, my current page, http://tk5ep.free.fr/rsmap/rsmap.php, is the first version of my script. It is written in JavaScript around the Google Maps API. Only the overlay image is reloaded every 10 min by the client, from the URL of the weather site where the image is hosted.

The idea for my second version is (was) that my server loads the image and the client gets it from my ISP, not from the distant server.
I can't access cron, so I wrote code that checks the timestamp of the last version of the image, and the first client who detects that it is older than 10 min triggers the download. But my ISP apparently has several servers and they don't share the same time: there is a 3 min difference between filemtime() and time()!
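One way to live with that ~3 min clock difference is to fold an explicit skew allowance into the staleness check, so mismatched server clocks can't trigger a premature re-download. A sketch (the 180 s allowance is an assumption based on the 3 min difference described above):

```php
<?php
// Sketch: treat the cached file as stale only once it is older than the
// TTL *plus* a clock-skew allowance. is_stale_with_skew() is an
// illustrative helper, not code from the thread.
function is_stale_with_skew(int $mtime, int $now, int $ttl, int $skew = 180): bool
{
    // If the file server's clock is ahead, $age can even be negative;
    // either way, only an age beyond TTL + skew counts as stale.
    $age = $now - $mtime;
    return $age > ($ttl + $skew);
}
```

The trade-off is that the image may be refreshed up to 3 min late, which is usually acceptable for a 10 min radar overlay.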

All that introduces more problems than the first version... I'm still not sure it is the right way to do it.

I tried almost everything around cURL, but can't get a good result. In my case the simplest way is the most robust!
I'm now convinced that the problem comes from my ISP, which has some restrictions or priorities on outgoing traffic...
But I don't understand why cURL can't do a better job than the classic COPY or fopen() way...

Thanks for your post,
EP