Run PHP scripts in background for SSH or PHP 4 ... 12 hours?

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

Post Reply
Deseree
Forum Commoner
Posts: 84
Joined: Mon Feb 13, 2006 11:35 pm

Run PHP scripts in background for SSH or PHP 4 ... 12 hours?

Post by Deseree »

Hey guys,

how would I go about doing this? Say I have a PHP script that runs for 12-24 hours when run in a shell:


/usr/bin/php /path/to/php/script.php 'whatever=vars&etc=etc'
That is how I do it now. It had been going for 12 hours when my ISP started having problems; for an hour I got really bad latency, and the SSH session dropped... so I think that killed the PHP script, yes?

I had ignore_user_abort(1) and set_time_limit(0) in place; I just don't think it finished.


So my question is: how can I have it so my PHP script is called over HTTP (input the vars, click go) and it STARTS, but the browser doesn't have to keep going for 12 hours? You know the browser will usually crash after about 6 hours for me, or time out after 3-4 hours, even with flush() and echoing a "." every once in a while to keep the connection alive.

I want to be able to simply start the PHP script from PHP and have it keep going while I close the browser without any worries, or go to other pages on the site. (This is the import page I'm talking about; it imports data for 12 hours or more depending on how big the file is.)

Is it possible to call it from the browser and have it run, and it would be nice if I could see whether it's already running (monitor its progress every once in a while), and also prevent another copy of the import from starting if it's called again while one is already running...
josh
DevNet Master
Posts: 4872
Joined: Wed Feb 11, 2004 3:23 pm
Location: Palm beach, Florida

Post by josh »

Code: Select all

nohup <command> &
nohup makes the command immune to hangups, and the & puts the job in the background. If this is going to be running "in the background" you might want to nice it to give it a lower CPU priority:

Code: Select all

nice -n 19 <command>
Note the PID you get back when you background it with &, because to stop it you need to do

Code: Select all

kill <pid>
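Putting those pieces together, it could look like this (the log and pidfile names are just examples, and the script path is yours):

```shell
# Start the long-running import immune to hangups, at low CPU
# priority, with output logged so the terminal can be closed safely.
nohup nice -n 19 /usr/bin/php /path/to/script.php 'run=insert' > import.log 2>&1 &

# $! is the PID of the job we just backgrounded; keep it around.
echo $! > import.pid

# Later: peek at progress...
tail -n 5 import.log

# ...or stop the job.
kill "$(cat import.pid)"
```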
timvw
DevNet Master
Posts: 4897
Joined: Mon Jan 19, 2004 11:11 pm
Location: Leuven, Belgium

Post by timvw »

I usually start "screen" and then open virtual terminals in there, so I can attach and detach anytime I want... (nice for an irc client etc. too)

http://gentoo-wiki.com/TIP_Using_screen
Deseree
Forum Commoner
Posts: 84
Joined: Mon Feb 13, 2006 11:35 pm

Post by Deseree »

timvw wrote:I usually start "screen" and then open virtual terminals in there, so I can attach and detach anytime I want... (nice for an irc client etc. too)

http://gentoo-wiki.com/TIP_Using_screen
Not quite working...

So basically I do this:

screen
/usr/bin/php /path/to/script.php
screen -r


but I can't input anything after running the PHP script...

screen
/usr/bin/php /path/to/script.php && screen -r

Does that work, and if so, how can I switch back in to check its progress?

There really isn't ANY way I can start the PHP script from HTTP?
timvw
DevNet Master
Posts: 4897
Joined: Mon Jan 19, 2004 11:11 pm
Location: Leuven, Belgium

Post by timvw »

Deseree wrote: screen
/usr/bin/php /path/to/script.php
At this point you have one screen open and you're running the php script in it.

You would detach with ctrl+a d (if I'm not mistaken; I changed my key mapping).

If you want to attach to the session again, you type: screen -D -r

Anyway, you can only give (extra) input to your script if your script is actually accepting input.
If you simply want another screen/terminal you can press ctrl+a n and you get a new screen..
Deseree wrote:There really isn't ANY way I can start the php script from http ?
Of course you can; fetch it with wget, curl, ...
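For example, something along these lines (the URL is made up, and your script needs ignore_user_abort(1) set or dropping the connection like this will kill it):

```shell
# Fire the request, give up on the response after a few seconds,
# and background the whole thing so the shell returns immediately.
curl -s --max-time 5 'http://www.mysite.com/script/script.php?run=insert' > /dev/null 2>&1 &
echo "import triggered"
```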
Deseree
Forum Commoner
Posts: 84
Joined: Mon Feb 13, 2006 11:35 pm

Post by Deseree »

timvw wrote:
Deseree wrote: screen
/usr/bin/php /path/to/script.php
At this point you have one screen open and you're running the php script in it.

You would detach with ctrl+a d (if i'm not mistaken, i changed my key mapping).

If you want to attach to the session again, you type: screen -D -r

Anyway, you can only give (extra) input to your script if your script is actually accepting input.
If you simply want another screen/terminal you can press ctrl+a n and you get a new screen..
Deseree wrote:There really isn't ANY way I can start the php script from http ?
Of course you can; fetch it with wget, curl, ...
Interesting....

ctrl+a d worked for detaching.
If you simply want another screen/terminal you can press ctrl+a n and you get a new screen..
That didn't work. It'd be nice just to know how to do that without having to re-enter the password, since I use really long random keys.
Anyway, you can only give (extra) input to your script if your script is actually accepting input.
So you mean

/usr/bin/php /path/to/script.php 'run=insert'

is all I can do? I can't do:

/usr/bin/php /path/to/script.php 'run=insert&nextvar=2&etc=3'

? If I can't, that's still cool; I can hard-code one var and call only that var from SSH to set the others.

Of course you can; fetch it with wget, curl, ...
Would this call my script and end the CALL, but not the script, immediately?

See, importing a several-hundred-MB or few-GB db takes HOURS if not days to complete, and I'm trying to program this little extra function so that I can call the script.

Say

http://www.mysite.com/script/script.php?run=insert

is the importer; you set the vars in the script before uploading. Once that's called it runs for hours in the browser or over SSH. In the browser it times out after a few hours, or the browser crashes because of too much output; and if there's no output it times out after 30-60 seconds usually, even with set_time_limit(0), because the browser (IE/Firefox) gives up after 30-90 seconds of not receiving any data. So usually I keep it alive by flush()ing a period on every pass of the import loop.

Now I used:

/usr/bin/php /path/to/script/script.php 'run=insert'

to call it from SSH, but one more person will be using this script when I finish it, and he hates SSH. He wants to be able to start it from HTTP by clicking

http://www.mysite.com/script/script.php?run=insert

so it just starts, outputs that it was started, and keeps going after he closes his browser. I'm hoping that's possible with some combination of ignore_user_abort(1), set_time_limit(0) and whatever you tell me. Maybe curl can call the script and end its HTTP call after a few seconds while the script just runs in the background? Is that possible? Of course curl is more technical than I'd like to get with this script (it takes me hours to get a curl call configured properly), but would something like this do what I want?

Code: Select all

$ch = curl_init();

// the URL has to be a quoted string
curl_setopt($ch, CURLOPT_URL, 'http://script_url.com/script/script.php?run=insert');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0); // 0 = send output straight through rather than return it as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // have curl drop the call after 10 seconds
curl_setopt($ch, CURLOPT_VERBOSE, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);

// actually fire the request and clean up
curl_exec($ch);
curl_close($ch);
And that still uses HTTP to serve the connection, huh? I'm trying to get the connection away from HTTP as much as possible, because when HTTP crashes the import stops....

So.....

disabled_functions is empty. The host is WHM/cPanel; safe mode is off, I think (maybe on for his account); PHP suExec is on for his, I think, but off on mine....

Would I be able to do:
system("/usr/bin/php /path/to/script.php 'run=insert'");

in the HTTP script to have it kick off the insert and begin importing, then close my browser, with the import no longer using HTTP at all and still running in the background?

I don't want to try things without asking, lol... I'm on a live server, no test servers atm...
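Something like this is what I'm picturing for the command the HTTP script would shell out to. It's only a sketch (the pidfile and log paths are made up), and it would also stop a second copy from starting:

```shell
PIDFILE=/tmp/import.pid

# If a previous import is still alive, refuse to start another one.
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "import already running (pid $(cat "$PIDFILE"))"
else
    # Redirect output and background the job so the caller (e.g. PHP's
    # system()) returns right away instead of waiting 12+ hours.
    /usr/bin/php /path/to/script.php 'run=insert' > /tmp/import.log 2>&1 &
    echo $! > "$PIDFILE"
    echo "import started (pid $(cat "$PIDFILE"))"
fi
```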
pickle
Briney Mod
Posts: 6445
Joined: Mon Jan 19, 2004 6:11 pm
Location: 53.01N x 112.48W

Post by pickle »

You could put it in a cron job. Cron will run it on a schedule without any interaction needed on your part.
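For example, an entry like this (the schedule and paths are just examples), added with crontab -e under whichever user should own the job:

```
# m h dom mon dow   command
# Run the importer at 2am every day, appending output to a log:
0 2 * * * /usr/bin/php /path/to/script.php 'run=insert' >> /path/to/import.log 2>&1
```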
Real programmers don't comment their code. If it was hard to write, it should be hard to understand.
Post Reply