Parsing a directory and its files
Posted: Fri May 20, 2005 3:55 pm
by mtoneil
I'm having a hard time finding any resources on the net for writing a script that parses a directory and its subdirectories and the files within them for display.
In other words, I want to be able to show a tree of folders and files from a certain "root". Is this possible a) for directories and files residing on the webserver and/or b) for directories and files residing on a remote machine (file server)?
Thanks in advance.
Posted: Fri May 20, 2005 4:04 pm
by Burrito
You cannot access folder structures on remote machines directly. For your local machine, however, you can use a function like this. (I found this function sometime last year, probably on this forum, but I can't remember where or who wrote it... if it's yours, sorry there's no credit given.)
Code: Select all
// Recursive copy function: copy all old course contents to new course contents
function copyr($source, $dest) {
    // Simple copy for a file
    if (is_file($source)) {
        return copy($source, $dest);
    }
    // Make the destination directory if it doesn't exist yet
    if (!is_dir($dest)) {
        mkdir($dest);
    }
    // Loop through the folder
    $dir = dir($source);
    while (false !== ($entry = $dir->read())) {
        // Skip the . and .. pointers
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        // Deep copy directories; the check stops it recursing into
        // the destination if that happens to sit inside the source
        if ($dest !== "$source/$entry") {
            copyr("$source/$entry", "$dest/$entry");
        }
    }
    // Clean up
    $dir->close();
    return true;
}
copyr("w:/inetpub/wwwroot/content/".$_POST['course_id'], "w:/inetpub/wwwroot/content/".$ncourseid);
Obviously my use of the function is to copy everything within a folder (and its subfolders), but you could modify it to display a listing or whatever you want. (You'd also want to validate something like $_POST['course_id'] before using it in a path.)
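For the original question (displaying a tree rather than copying), a minimal sketch along the same lines might look like this; listTree is a hypothetical name and the path at the bottom is just an example:

```php
<?php
// Recursively print a directory tree, indenting one level per subdirectory.
// listTree() is a hypothetical name; pass any readable directory as $path.
function listTree($path, $depth = 0)
{
    if (!is_dir($path)) {
        return; // nothing to list
    }
    $dir = dir($path);
    while (false !== ($entry = $dir->read())) {
        // Skip the . and .. pointers
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        // Two spaces of indent per nesting level
        echo str_repeat('  ', $depth) . $entry . "\n";
        // Recurse into subdirectories
        if (is_dir("$path/$entry")) {
            listTree("$path/$entry", $depth + 1);
        }
    }
    $dir->close();
}

// Illustrative path; the guard at the top keeps this harmless elsewhere
listTree('w:/inetpub/wwwroot/content');
?>
```

For a web page you'd echo nested HTML list tags instead of plain indented text, but the recursion is the same.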
Edit: I just reread your post and thought that you could probably get a file/folder list from a remote server using FTP, but I don't know if you could get something exhaustive that recurses all of the subdirectories (unless you saved the results to an array or something)?
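Recursing over FTP is possible: call ftp_nlist on each directory and descend whenever an entry turns out to be a directory (one way to test that is trying ftp_chdir into it). A rough sketch, where the host, credentials, and the ftpTree name are all placeholders:

```php
<?php
// Recursively collect file paths from an FTP server into an array.
// ftpTree() is a hypothetical name; host and credentials are placeholders.
function ftpTree($conn, $dir, array &$results)
{
    $entries = ftp_nlist($conn, $dir);
    if ($entries === false) {
        return;
    }
    foreach ($entries as $entry) {
        // Some servers return full paths, others bare names
        $path = (strpos($entry, '/') === false)
            ? rtrim($dir, '/') . '/' . $entry
            : $entry;
        // If we can chdir into it, it's a directory: recurse
        if (@ftp_chdir($conn, $path)) {
            ftp_chdir($conn, '/'); // go back before recursing
            ftpTree($conn, $path, $results);
        } else {
            $results[] = $path;
        }
    }
}

// Set FTP_HOST (plus real credentials below) to try this against a server
$host = getenv('FTP_HOST');
if ($host && ($conn = ftp_connect($host, 21, 5))) {
    if (ftp_login($conn, 'user', 'password')) {
        $files = array();
        ftpTree($conn, '/', $files);
        print_r($files);
    }
    ftp_close($conn);
}
?>
```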
Posted: Fri May 20, 2005 5:22 pm
by timvw
Getting a list through FTP is a valid option.
Using the SSH extension and running some ls or find commands is an option too.
Because all the hosts I wanted to list files from supported PHP, I decided to add a little lister.php on each of them. That way the script could easily validate the credentials/originating IP and then return the contents of the directory as an XML list.
The main website then only needed to know the URL of each host's list, and I could easily generate a file list based on the returned XML.
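A minimal sketch of what such a lister.php could look like, using DOMDocument to build the XML. The element names (list, dir, file), the function names, and the caller check are all assumptions, not timvw's actual script:

```php
<?php
// Return the contents of a directory tree as XML.
// Element names and function names here are illustrative assumptions.
function listDirAsXml($root)
{
    $doc = new DOMDocument('1.0', 'UTF-8');
    $doc->formatOutput = true;
    $list = $doc->createElement('list');
    $doc->appendChild($list);
    appendEntries($doc, $list, $root);
    return $doc->saveXML();
}

function appendEntries(DOMDocument $doc, DOMElement $parent, $path)
{
    $dir = dir($path);
    while (false !== ($entry = $dir->read())) {
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        $full = "$path/$entry";
        // <dir> elements nest their contents; <file> elements are leaves
        $node = $doc->createElement(is_dir($full) ? 'dir' : 'file');
        $node->setAttribute('name', $entry);
        $parent->appendChild($node);
        if (is_dir($full)) {
            appendEntries($doc, $node, $full);
        }
    }
    $dir->close();
}

// A real lister.php would validate the caller first, e.g.:
//   if ($_SERVER['REMOTE_ADDR'] !== $trustedIp) { exit('denied'); }
echo listDirAsXml('.');
?>
```

The main site can then fetch that URL from each host and walk the returned XML to build the combined file list.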