Hi,
I'm working on an app where I need to download data from multiple hosts. I'm currently doing this sequentially, which makes it take a long time. Instead, I'd like to run every download in its own thread, then gather whatever data has arrived within a certain time and present that. I can think of some ways of doing this, but it would be great to get some advice on what a best-practice solution is:
1. Do some ugly stuff using the shell. E.g. have a separate script, run from the command line, that feeds the downloaded data into a database. Have the main thread launch these scripts in the background (e.g. via system()), sleep the main thread for X milliseconds, then retrieve whatever data is in the database. All other downloads are considered to have timed out.
2. Run every data retrieval asynchronously. Is there any support for this in PHP? Is http://us3.php.net/manual/en/book.pcntl.php the solution, or are there other libraries that are better?
3. TBD
Threading would be by far the preferred solution, but I'm not familiar enough with PHP to figure out whether it's the way to go.
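For what it's worth, option 1 can be sketched in a few lines. This is a hedged sketch, not a recommendation: the worker script name (fetch_worker.php), the result directory, and the JSON result format are all hypothetical, and it assumes a Unix shell so that the trailing "&" plus output redirection makes system() return immediately.

```php
<?php
// Option 1 sketch: launch one background worker per host via the shell,
// wait a fixed time budget, then read whatever results have landed.
// fetch_worker.php and the result layout are hypothetical.

$hosts = ['host1.example.com', 'host2.example.com'];
$resultDir = sys_get_temp_dir() . '/downloads';
@mkdir($resultDir);

foreach ($hosts as $host) {
    // Redirecting output and appending "&" makes system() return at once
    // instead of waiting for the worker to finish.
    system('php fetch_worker.php ' . escapeshellarg($host)
         . ' > /dev/null 2>&1 &');
}

usleep(500 * 1000); // the X-millisecond budget (here 500 ms)

// Collect whatever the workers managed to write in time;
// anything not on disk yet is treated as timed out.
$results = [];
foreach (glob($resultDir . '/*.json') as $file) {
    $results[basename($file, '.json')] =
        json_decode(file_get_contents($file), true);
}
```

The obvious downsides are the ones you'd expect: no clean error reporting from the workers, and the fixed sleep wastes time when all hosts answer quickly.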
Thanks!
Best,
beetree
Threading advice
Re: Threading advice
No one? I take that to mean threading isn't used much in PHP, and that I should probably look at a non-PHP threading solution...?
- Jonah Bron
- DevNet Master
- Posts: 2764
- Joined: Thu Mar 15, 2007 6:28 pm
- Location: Redding, California
Re: Threading advice
PCNTL isn't really "threading". It forks the currently running program into another process, so there's a lot of overhead. Looking at the documentation, though, it appears that you can fork the download processes and then wait for them with pcntl_waitpid(). When it returns, you can continue.
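A minimal sketch of that fork-and-wait pattern, assuming the pcntl extension is available (CLI on Unix only); the URLs and temp-file naming are illustrative, not part of any API:

```php
<?php
// Fork one child per host; each child downloads and writes its result
// to a temp file, then exits. The parent waits with pcntl_waitpid().

$urls = [
    'a' => 'http://host-a.example.com/data',
    'b' => 'http://host-b.example.com/data',
];

$pids = [];
foreach ($urls as $key => $url) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // Child: a full copy of the parent process (hence the overhead).
        // Do the download, persist it, and exit.
        $data = file_get_contents($url);
        file_put_contents(sys_get_temp_dir() . "/dl_$key", (string) $data);
        exit(0);
    }
    $pids[] = $pid; // parent records the child's PID
}

// Parent: block until every child has finished. A real version could
// pass WNOHANG and poll in a loop to enforce the overall time budget.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}

// Now read the dl_* temp files and present whatever arrived.
```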
- John Cartwright
- Site Admin
- Posts: 11470
- Joined: Tue Dec 23, 2003 2:10 am
- Location: Toronto
Re: Threading advice
cURL supports concurrent requests via the curl_multi family of functions. See http://www.php.net/manual/en/function.c ... i-init.php
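To make that concrete, here is a sketch of the curl_multi approach: all requests run concurrently inside a single PHP process, so no forking or threading is needed. The URLs are placeholders, and CURLOPT_TIMEOUT plays the role of the per-host time budget.

```php
<?php
// Run several HTTP downloads concurrently with curl_multi.

$urls = [
    'http://host-a.example.com/data',
    'http://host-b.example.com/data',
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as a string
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // drop slow hosts after 2 s
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until none are still running.
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // sleep until there is socket activity
    }
} while ($running > 0);

// Gather whatever completed; handles that timed out yield no body.
$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

This matches the original goal nicely: the timeout bounds the total wait, and $results holds exactly the data that arrived in time.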