What are the main ways to speed up a website?
I'm on a shared server and I notice it slowing to a crawl tonight. I don't have that much traffic as the site is not public, but each page is roughly 100k in size.
Would moving to a dedicated box be the #1 speed improvement option?
Any other techniques I should try?
Website speed optimization techniques
- Buddha443556
- Forum Regular
- Posts: 873
- Joined: Fri Mar 19, 2004 1:51 pm
Going dedicated will help if you're on a typical shared host, and on a dedicated server there are a number of things you can do to improve performance too. Most shared hosts are still running MySQL 3.x; if you install MySQL 4.0.1 or better you can use the query caching feature. You will also be able to install a PHP bytecode cache like APC. These should help.
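For reference, the query cache on MySQL 4.0.1+ does nothing until you give it memory. A minimal my.cnf fragment might look like this (the 16M size is just illustrative; tune it to your box):

```ini
# my.cnf — add to the [mysqld] section
[mysqld]
query_cache_type = 1      # cache results of cacheable SELECT statements
query_cache_size = 16M    # memory pool for cached result sets; 0 disables the cache
```

Identical SELECTs then return straight from memory until a write touches one of the underlying tables, which is a big win for read-heavy pages like this.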
Check out some of the slide shows here (only visit with Mozilla; they don't like IE): http://talks.php.net/index.php/PHP .
However, if you follow Feyd's advice you may not even need a dedicated server.
As a first step, I eliminated superfluous comments and whitespace in the header, home, and footer HTML, and that reduced load times by 20%!
The downside is that the script is now illegible, so I'll have to remember to keep a copy for editing.
I'm sure most of you have used this before, but the web page analyzer at http://www.websiteoptimization.com is quite useful.
- feyd
- Neighborhood Spidermoddy
- Posts: 31559
- Joined: Mon Mar 29, 2004 3:24 pm
- Location: Bothell, Washington, USA
The size of the script shouldn't matter as much as the HTML it's generating, so I'm hoping that's what you meant. Unless your script includes hundreds of files, the script itself shouldn't take long to run; it'd more likely be the server having a coffee-stirring straw for bandwidth. Of course, if the script is running lots of repeated operations, that could use some optimization as well.
IC... perhaps the figures indicating improvement are a mirage
This seems to be one of the slowest parts of my script. It's the great circle calculation, which I use to find zipcodes within a given radius. Has anyone used a faster method?
My query is as follows:
[code]
" $earthmiles * ATAN(SQRT(1-POW((sin($radlat1)*sin(latitude/$radfactor) + (cos($radlat1)*cos(latitude/$radfactor)*cos((longitude/$radfactor)-$radlong1))) , 2))/
(sin($radlat1)*sin(latitude/$radfactor) + (cos($radlat1)*cos(latitude/$radfactor)*cos((longitude/$radfactor)-$radlong1)))) ";
[/code]
where
[code]
$radfactor = 180 / pi();   // note: pi() — a bare "pi" is not valid PHP
$earthmiles = 3963;
$radlat1 = $latitude / $radfactor;
$radlong1 = $longitude / $radfactor;
[/code]
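One simplification worth trying: for 0 < x ≤ 1, ATAN(SQRT(1 - x²) / x) is just ACOS(x), so the big inner expression only needs to appear once inside an ACOS() call instead of twice. A quick numerical check of that identity (Python here purely to illustrate; the real query stays in SQL, and the coordinates are made-up test points):

```python
import math

EARTH_MILES = 3963.0  # matches $earthmiles in the query above


def sum_term(lat1, lon1, lat2, lon2):
    """The repeated inner expression: cosine of the central angle."""
    rlat1, rlon1 = math.radians(lat1), math.radians(lon1)
    rlat2, rlon2 = math.radians(lat2), math.radians(lon2)
    return (math.sin(rlat1) * math.sin(rlat2)
            + math.cos(rlat1) * math.cos(rlat2) * math.cos(rlon2 - rlon1))


def dist_atan(lat1, lon1, lat2, lon2):
    """Distance the way the original query computes it (ATAN/SQRT/POW)."""
    x = sum_term(lat1, lon1, lat2, lon2)
    return EARTH_MILES * math.atan(math.sqrt(1 - x ** 2) / x)


def dist_acos(lat1, lon1, lat2, lon2):
    """Equivalent but simpler: great-circle distance via ACOS."""
    x = sum_term(lat1, lon1, lat2, lon2)
    # Clamp against floating-point drift before taking the arccosine.
    return EARTH_MILES * math.acos(max(-1.0, min(1.0, x)))


# Two points roughly 9 miles apart; both forms should agree closely.
d1 = dist_atan(40.7, -74.0, 40.8, -73.9)
d2 = dist_acos(40.7, -74.0, 40.8, -73.9)
print(d1, d2)
```

The bigger win, though, is usually not the trig but how many rows it runs against: if you add a cheap bounding-box prefilter in the WHERE clause (latitude/longitude BETWEEN limits derived from the radius), the distance formula only executes for the handful of candidate rows instead of every zipcode in the table.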