Hi all,
I am looking for suggestions on how to achieve this...
An external website (BoM) updates its weather warnings page constantly. I need a script to parse the warnings page every minute (I will use a cronjob for that part). On each run, the script parses the site to find any links (warnings) added since the last run. Upon "seeing" a new warning, it then downloads the page associated with that warning to my server.
Now I already have a script that downloads the warnings manually, and want to integrate that, so I really just need the parse/automation add-on...
Any ideas?
Parse a page and Grab URL
Simon Angell
- Forum Commoner
- Posts: 45
- Joined: Fri Jan 24, 2003 12:14 am
Hi,
You could log the warnings somewhere; a database is good for this. When checking for new ones, compare them against the ones you already have, e.g.:
As for grabbing the URL, well... there's cURL, or maybe just a simple fopen; it depends on how the information is accessed.
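A minimal sketch of the fetch step, trying cURL first and falling back to a plain file read; the function name and URL are placeholders, not the real BoM address:

```php
<?php
// Sketch: grab a page's HTML. fetch_page() is an illustrative name.
function fetch_page($url) {
    if (function_exists('curl_init')) {
        // cURL gives more control (timeouts, redirects).
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as a string
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't let the cron job hang
        $html = curl_exec($ch);
        curl_close($ch);
        return $html;
    }
    // Fallback: works for simple cases if allow_url_fopen is enabled.
    return file_get_contents($url);
}
?>
```

Call it as `fetch_page('http://www.example.com/warnings.html')` from the cron script, then hand the returned HTML to the parsing step.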
Code: Select all
<?php
// Has this warning been logged already? Escape the value before
// using it in the query.
$current = mysql_real_escape_string($current);
$check = mysql_num_rows(mysql_query("SELECT warning FROM warnings WHERE warning = '$current' LIMIT 1"));
if ($check == 0)
{
    // It's a new one: log it, then download the page.
}
?>
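The link-extraction step from the original question could be sketched like this. A proper HTML parser is safer, but for a quick cron script a regex often does; the markup and the `$seen` list below are illustrative, not the real BoM page:

```php
<?php
// Sketch: pull every href out of the warnings page and keep only
// the ones we have not logged before.
function new_warning_links($html, array $seen) {
    preg_match_all('/href="([^"]+)"/i', $html, $m);
    $links = array_unique($m[1]);
    // Anything not already logged is a new warning.
    return array_values(array_diff($links, $seen));
}

// Example: two links on the page, one already logged.
$html = '<a href="/warn/IDQ001.shtml">Warn 1</a> <a href="/warn/IDQ002.shtml">Warn 2</a>';
$seen = array('/warn/IDQ001.shtml');
$new  = new_warning_links($html, $seen);
// $new now holds only the unseen link; fetch and store each one,
// then add it to the log so the next cron run skips it.
?>
```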