If you already have permission to do this, just ignore everything above...
You could possibly use a regex (ereg in old PHP, preg_match_all nowadays) to find all links and check whether they are on the same domain... using this method you could build a list of all pages linked from a given page... then load that list into an array and go through it with a foreach loop... inside the loop you would request each page and find all links on that page... have the function call itself up to x times, where x is how many pages deep you want to go into the site... this is called recursion.
Code:
<?php
function loop($a) {
    foreach ($a as $key => $value) {
        // Do some code
        if ($needtoloop) {
            loop($somedata); // the function calls itself here
        } else {
            break;
        }
    }
}
?>
See how the function calls itself while $needtoloop is true? Use the same code structure, except requesting pages... that would "spider" the site. Then just find all the image tags, grab each one with file_get_contents($imageurl); and fwrite() the contents to a local file.
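Putting it together, here is a minimal sketch of that spider (my own rough code, not tested against any real site; resolve_url() is a hypothetical helper you would have to write to turn relative links into absolute ones, and it assumes allow_url_fopen is enabled and that you have permission to crawl):

<?php
// Recursively spider a site up to $depth levels, saving every image found.
function spider($url, $depth, &$visited = array()) {
    if ($depth <= 0 || isset($visited[$url])) {
        return; // stop: too deep, or page already seen
    }
    $visited[$url] = true;

    $html = @file_get_contents($url);
    if ($html === false) {
        return; // page could not be fetched
    }

    // Find all image tags and save each image to a local file
    preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $imgs);
    foreach ($imgs[1] as $src) {
        $imgurl = resolve_url($url, $src); // hypothetical: make the URL absolute
        $data = @file_get_contents($imgurl);
        if ($data !== false) {
            $fp = fopen(basename(parse_url($imgurl, PHP_URL_PATH)), 'w');
            fwrite($fp, $data);
            fclose($fp);
        }
    }

    // Find all links and follow the ones on the same domain, one level deeper
    preg_match_all('/<a[^>]+href=["\']([^"\']+)["\']/i', $html, $links);
    $host = parse_url($url, PHP_URL_HOST);
    foreach ($links[1] as $href) {
        $next = resolve_url($url, $href);
        if (parse_url($next, PHP_URL_HOST) === $host) {
            spider($next, $depth - 1, $visited); // recursion, same as above
        }
    }
}
?>

The $visited array matters: without it a pair of pages that link to each other would loop forever, depth limit or not.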