Hi guys,
My site will consist of several pages, all of which need to be updated every (say) four hours. Each page will consist of (say) 100 randomly selected links.
It would really be easier for me to use a flat file for the link database; that file may contain 5,000 URLs. I would have a file for each category (each page).
The script will simply take (say) 100 random links from the flat file and create the HTML file for each page.
The only problems I see are:
1) the link database files can get rather large
2) writing the new HTML file while a lot of people are hitting the page
What do you guys think? Did I miss anything? Will it work?
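A minimal sketch of the idea, assuming the flat file holds one URL per line (all file names here are hypothetical). Writing the page to a temporary file and then renaming it over the live file also addresses problem 2: the rename is atomic, so visitors never see a half-written page.

```python
import os
import random
import tempfile

def rebuild_page(links_file, html_file, count=100):
    # Read the flat-file link database (one URL per line).
    with open(links_file) as f:
        urls = [line.strip() for line in f if line.strip()]

    # Pick up to `count` distinct random links.
    chosen = random.sample(urls, min(count, len(urls)))

    # Build a simple page body from the chosen links.
    items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chosen)
    html = f"<html><body><ul>\n{items}\n</ul></body></html>\n"

    # Write to a temp file in the same directory, then atomically
    # replace the live page so readers never see a partial file.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(html_file)))
    with os.fdopen(fd, "w") as f:
        f.write(html)
    os.replace(tmp, html_file)
```

Run once per category file from cron every four hours; nothing happens at request time, so the size of the link database only affects the rebuild, not your visitors.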
Thank you,
Robert
Site Architecture/Maintenance Ideas?
I think it may work. 5,000 addresses means 5,000 lines, which could take some time to loop over. The maximum execution time is normally 30 seconds, and that is no problem. The problem is the max_patience_time of the average person browsing your site; that time is approximately 5 microseconds, as you all must have noticed.
You could split your category files into smaller ones of, say, 1,000 lines each, randomly pick one, and print 100 random lines from it. Then again, I'm not sure whether this is any faster.
I'm thinking of something like this:
Code:
PICK category
SWITCH (category)
CASE 1: folder=categoryfolder
// for all categories
END SWITCH
READ files in folder into filearray
SHUFFLE filearray
PICK file from filearray
READ lines in file into linearray
SHUFFLE linearray
PRINT first 100 items in linearray
- John Cartwright
- Site Admin
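The pseudocode above could be sketched in Python roughly as follows (the file layout is an assumption: one subdirectory per category, each holding several chunk files of about 1,000 URLs, one per line):

```python
import os
import random

def pick_links(category_dir, count=100):
    """Pick `count` random links from one randomly chosen chunk
    file in the given category folder."""
    # READ files in folder into filearray, then SHUFFLE + PICK:
    # random.choice does both steps in one call.
    files = [os.path.join(category_dir, f) for f in os.listdir(category_dir)]
    chunk = random.choice(files)

    # READ lines in file into linearray.
    with open(chunk) as f:
        lines = [line.strip() for line in f if line.strip()]

    # SHUFFLE linearray and take the first `count` items:
    # random.sample gives the same result without mutating the list.
    return random.sample(lines, min(count, len(lines)))
```

This way each run only reads one ~1,000-line chunk instead of the full 5,000-line file, which is the speedup being proposed.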