PHP Logger
Posted: Wed Oct 20, 2010 3:27 pm
Greetings,
To practice the PHP I've learned in the last few weeks, a friend of mine has come up with an interesting challenge: constantly monitor a certain web page using a server. The challenge details:
1) A destination, say, examplesite.com/PHP_challenge.php, contains a number that changes over time
2) Find this number, and log the time as well as the value into a file, log.txt
3) Check every 30 minutes, and stop when a file named stop.txt appears in the same directory as the script
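For context, steps 1 and 2 could look something like the sketch below. The URL, the regex used to pull the number out of the page, and the "timestamp value" log format are all my own assumptions, not part of the challenge spec:

```php
<?php
// Sketch of steps 1 and 2: fetch the page, extract the number, append it to log.txt.

// Fetch the page with cURL and pull out the first number found in the body.
// The regex is a placeholder; adjust it to however the real page presents the value.
function fetch_number($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false) {
        return null; // network error or timeout
    }
    return preg_match('/-?\d+(\.\d+)?/', $html, $m) ? $m[0] : null;
}

// Append one "YYYY-MM-DD HH:MM:SS value" line per check, with a lock
// so two overlapping runs don't interleave their writes.
function log_value($file, $value)
{
    $line = date('Y-m-d H:i:s') . ' ' . $value . "\n";
    file_put_contents($file, $line, FILE_APPEND | LOCK_EX);
}

// Usage (inside the monitoring script):
// $value = fetch_number('http://examplesite.com/PHP_challenge.php');
// if ($value !== null) {
//     log_value('log.txt', $value);
// }
```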
I can do steps 1 and 2 without a problem; the main problem is checking every 30 minutes. (Please don't suggest that I write a C++ or JavaScript program instead. I could, but any client-side program relies on the client being online with uninterrupted internet access 24/7, and the whole reason for the challenge is that I can't leave my computer online 24/7.) At first, I thought I would put the script into an infinite loop that breaks when stop.txt exists, but then I found out about the restriction on the runtime of PHP scripts: on my hosting, the maximum runtime of a PHP script is 60 seconds.
I came up with an idea that involves a basic loop between two pages. The script (let's name it script1.php) first checks whether stop.txt exists; if it does, it stops. Otherwise it opens log.txt and finds out the last time it checked the site. If that was more than 30 minutes ago, it checks the site and writes the values to the file; if it was less than 30 minutes ago, it sleeps for 45 seconds and then calls script2.php, which sleeps for 45 seconds and then calls script1.php again, creating the loop. So, before I waste any more time writing code that doesn't work, may I ask if this idea would work? I can set the timeout of the cURL session to more than 45 seconds (since the called script will sleep for 45 seconds before responding, I have to set the cURL time limit higher to prevent timeouts). Any other ideas that would cost less bandwidth and CPU time? I'm all ears.
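Here's a rough sketch of what I mean by script1.php (script2.php would be a mirror image that calls back to script1.php). The script names, the log-line format, and the URL are assumptions; the hand-off is guarded so it only fires when the script is invoked through the web server:

```php
<?php
// script1.php -- one half of the two-script ping-pong loop described above.

// Decide whether the 30-minute (1800 s) window has elapsed since the last check.
function due_for_check($lastCheck, $now, $intervalSeconds = 1800)
{
    return ($now - $lastCheck) >= $intervalSeconds;
}

// Stop condition: a stop.txt file in the same directory ends the loop.
if (file_exists(__DIR__ . '/stop.txt')) {
    exit;
}

// Read the timestamp of the last log line (assumed "YYYY-MM-DD HH:MM:SS value").
$lastCheck = 0;
$logFile = __DIR__ . '/log.txt';
if (is_file($logFile)) {
    $lines = file($logFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if ($lines) {
        $t = strtotime(substr(end($lines), 0, 19));
        $lastCheck = ($t !== false) ? $t : 0;
    }
}

if (due_for_check($lastCheck, time())) {
    // ... fetch the number and append it to log.txt (steps 1 and 2) ...
}

if (PHP_SAPI !== 'cli') {
    // Sleep well under the 60-second limit, then hand off to script2.php.
    // The cURL timeout must exceed script2's own 45-second sleep.
    sleep(45);
    $ch = curl_init('http://examplesite.com/script2.php');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 100);
    curl_exec($ch);
    curl_close($ch);
}
```

One thing I'm aware of: each hop spends most of its 60-second budget sleeping, so a single slow page load could push a request over the limit and break the chain.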