
Parse a page and Grab URL

Posted: Wed Jul 21, 2004 5:03 am
by Simon Angell
Hi All.
I am looking for suggestions on how to achieve this...
An external website (BoM) updates its weather warnings page constantly. I need a script that parses the warnings page every minute (I'll set up a cronjob for that part) and, on each run, checks the page for any links (warnings) added since the last run. Upon "seeing" a new warning, it should then download the page associated with that warning to my server.

Now, I already have a script that downloads the warnings manually and want to integrate that, so really I just need the parse/automation add-on...

Any ideas?
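For the "find any added links" part, a minimal sketch along these lines might work (the regex-based link extraction and the sample HTML are assumptions for illustration, not the actual BoM page structure):

```php
<?php
// Hypothetical sketch: pull every link out of a page's HTML so the
// results can be compared against the links seen on the previous run.
function extract_links($html)
{
    // Collect the href value of each anchor tag with a simple regex.
    preg_match_all('/<a\s[^>]*href=["\']([^"\']+)["\']/i', $html, $matches);
    return $matches[1];
}

// In the real cron script you would fetch the warnings page first, e.g.
// $html = file_get_contents($warningsPageUrl);
$html = '<html><a href="warning1.html">Warning 1</a>'
      . ' <a href="warning2.html">Warning 2</a></html>';
print_r(extract_links($html));
?>
```

Each link returned could then be checked against the log of already-seen warnings; anything not in the log is new and gets downloaded.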

Posted: Wed Jul 21, 2004 5:24 am
by qads
Hi,

You could log the warnings somewhere (a database is good for this), then when checking for new ones, compare them against the ones you already have, i.e.

Code:

<?php
// Escape the warning text before putting it in the query.
$current = mysql_real_escape_string($current);
$result = mysql_query("SELECT warning FROM warnings WHERE warning = '$current' LIMIT 1");
if (mysql_num_rows($result) == 0) {
    // It's a new one, log it...
    mysql_query("INSERT INTO warnings (warning) VALUES ('$current')");
}
?>
As for grabbing the URL, well... there's cURL, or maybe just a simple fopen? It depends on how the information is accessed.
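Going the cURL route, the download step might look something like this rough sketch (the URL and save path are placeholders, and this assumes PHP was built with the curl extension):

```php
<?php
// Hypothetical sketch: fetch one warning page with cURL and return its body.
function fetch_page($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow any redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // don't let a slow server hang the cron run
    $body = curl_exec($ch);                          // false on failure, page body on success
    curl_close($ch);
    return $body;
}

// Usage in the cron script (paths/URLs are placeholders):
// $html = fetch_page('http://example.com/new-warning.html');
// if ($html !== false) {
//     $fp = fopen('warnings/new-warning.html', 'w');
//     fwrite($fp, $html);
//     fclose($fp);
// }
?>
```

With fopen wrappers enabled, `file_get_contents($url)` would do the same job with less code, but cURL gives you timeouts and redirect handling.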