Anyone know how to search faster?

Posted: Wed May 03, 2006 2:01 pm
by Nashtrump
Hi everyone...

I'm fairly new to PHP and am developing a program that looks for certain things on other websites.

Currently I'm using file_get_contents($link) to look at each URL.

This is really slow, though... does anyone have any better suggestions for doing this?
I'm aiming to look at about 5-6 websites, each containing around 30-40 URLs...

Any help here would be really appreciated!

Thank you in advance!!

Nash

Posted: Wed May 03, 2006 2:37 pm
by Christopher
Have you Googled "php web spider" to see how others have done it? There are also several very good packages already written to do this.

Posted: Wed May 03, 2006 4:39 pm
by Nashtrump
Hi there,

I am looking for something very specific, so unfortunately for me it's a spider that must be built manually. I can't use any of the existing packages.

I can only find one or two examples of spiders, and they both use file_get_contents().

I just wondered if anyone knew of any other way that was quicker?

thanks

nash

Posted: Wed May 03, 2006 6:29 pm
by Ollie Saunders
Well, it's always going to be limited by the speed of your internet connection and the speed at which the websites can respond.

Have a look at declare(ticks), which may help address that second bottleneck I mentioned by letting you request data from multiple sites at once. Although the comments about declare(ticks) don't sound very inspiring.
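Another way to request data from multiple sites at once (not mentioned above, and only a hedged sketch) is the cURL multi interface, which has been part of PHP's cURL extension since PHP 5. The URLs below are placeholders; the idea is that all the transfers run concurrently, so the total wait is roughly that of the slowest site rather than the sum of all of them:

```php
<?php
// Sketch: fetch several URLs in parallel with curl_multi.
// Assumes the cURL extension is loaded; $urls are placeholder examples.
$urls = array('http://example.com/', 'http://example.org/');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // collect body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let one slow site hang the crawl
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until every one has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // block until there is activity, instead of busy-looping
} while ($running > 0);

foreach ($handles as $url => $ch) {
    $html = curl_multi_getcontent($ch);
    // ... search $html for whatever you are spidering for ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>
```

Compared with calling file_get_contents() in a loop, this overlaps the network waits, which is usually where most of the time goes.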

Posted: Tue May 16, 2006 3:20 am
by sava
file_get_contents($link) retrieves all of $link's content. Maybe it's better to use a stream and read chunks until you find what you are looking for...

Code:

$fp = fopen($link, 'r');
$buf = '';
$found = false;
while ($fp && !feof($fp) && !$found) {
	// keep the tail of the previous chunk so a match that straddles
	// a 1024-byte boundary is not missed
	$buf = substr($buf, -strlen($youSearch)) . fread($fp, 1024);
	$found = substr_count($buf, $youSearch) > 0;
}
if ($fp) fclose($fp);