Reality of PHP projects...
Posted: Tue Oct 14, 2003 10:52 am
Hi all,
I am fairly new to PHP and to web programming on a practical scale. Let me explain where I am coming from. At school about three years ago I learnt ASP, and I am finally coming over to PHP now because I find it, well, more C-like.
One thing that has been bothering me is the scope of real-world web projects. At school, projects are graded on functionality - rarely do the tutors care about execution speed or data storage optimisation (unless it's so bad it stands out like a sore thumb). So for those projects the code is required to work on only one platform, and the database contains only a few test records.
I am wondering how much data a SQL server can hold, and at what point queries become too slow. It's a bit...disturbing to think about, as I have no real-world experience to compare to, now that I am working freelance (!) on my first commercial PHP project.
For example, the project I am involved in now is an online quiz hub where users can select from a number of quizzes. Say each quiz consists of twenty questions, and there are 20 quizzes. So far, the data load doesn't seem to be a problem: there are about 20 x 20, or 400, questions to store. Now, a user comes along and takes a quiz. I would have to set up another table to capture the results, which means I will need room for 20 answers if that user takes just one quiz, or 400 answers if the user takes all 20 quizzes. Let's say I have 100 users who take all the quizzes and answer all the questions... there would be something like 40,000 entries in the answers table (yikes!)
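To make those numbers concrete, here is a quick sketch of the single answers table I have in mind (SQLite is used purely for illustration, and the table and column names are made up by me, not taken from the actual project):

```python
import sqlite3

# In-memory database just to sanity-check the row counts described above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One row per (user, quiz, question) answer. All names are placeholders.
cur.execute("""
    CREATE TABLE answers (
        user_id     INTEGER NOT NULL,
        quiz_id     INTEGER NOT NULL,
        question_id INTEGER NOT NULL,
        answer      TEXT,
        PRIMARY KEY (user_id, quiz_id, question_id)
    )
""")

USERS, QUIZZES, QUESTIONS = 100, 20, 20
rows = [
    (u, q, n, "x")
    for u in range(USERS)
    for q in range(QUIZZES)
    for n in range(QUESTIONS)
]
cur.executemany("INSERT INTO answers VALUES (?, ?, ?, ?)", rows)

count = cur.execute("SELECT COUNT(*) FROM answers").fetchone()[0]
print(count)  # 40000
```

So the worst case really is 100 x 20 x 20 = 40,000 rows, which is what worries me.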
Is this how big a SQL table in a commercial project can get? Would it be faster (or more practical) to have one table per user, plus a master table that just maps each table name to a user id? Are SQL queries already the most 'optimised' way of retrieving data? And is the speed of a query influenced more by the SQL, the programming, or the server?
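To show what I mean by the single-table alternative, here is a sketch of one shared answers table with an index, queried per user (again SQLite for illustration only; the index name and column names are my own invention):

```python
import sqlite3

# One shared answers table for all users, instead of one table per user.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE answers (
        user_id     INTEGER NOT NULL,
        quiz_id     INTEGER NOT NULL,
        question_id INTEGER NOT NULL,
        answer      TEXT
    )
""")
# The index is what keeps per-user lookups fast as the table grows:
# the database walks the index instead of scanning every row.
cur.execute("CREATE INDEX idx_answers_user ON answers (user_id, quiz_id)")

cur.executemany(
    "INSERT INTO answers VALUES (?, ?, ?, ?)",
    [(u, q, n, "x") for u in range(100) for q in range(20) for n in range(20)],
)

# Fetch one user's answers for one quiz: an indexed lookup, not a full scan.
rows = cur.execute(
    "SELECT question_id, answer FROM answers WHERE user_id = ? AND quiz_id = ?",
    (7, 3),
).fetchall()
print(len(rows))  # 20
```

My assumption is that the index makes the per-user-table scheme unnecessary, but I would like to hear whether that holds up in practice.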
Any advice would be appreciated. Many thanks in advance!