
Pagination and DB question

Posted: Mon Oct 11, 2004 2:02 pm
by Calimero
I use a class of sorts for paginating my results, which is "called" this way:

$mysql = mysql_query("SOME QUERY", 10);
The number 10 is the number of records that I want to display per page.


My question is:
With pagination of the results, is the DB (MySQL) accessed every time you hit "next page" or page 3, 4 and so on, or are the results somehow cached on the client side, so that my DB is queried only once, at the initial query?

Anyone?

Posted: Mon Oct 11, 2004 2:35 pm
by potsed
You could use a session/cookie to hold an array of the results.
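A minimal sketch of that idea, with a plain Python dict standing in for PHP's $_SESSION and fake rows standing in for a real query result (all names here are hypothetical): the expensive query runs once, the full result set is cached in the session, and later pages are sliced from the stored copy without touching the DB again.

```python
# Hypothetical session store standing in for PHP's $_SESSION.
session = {}

def run_query_once():
    # Imagine this is the single expensive DB call; fake rows here.
    return ["record %d" % i for i in range(1, 26)]

def get_page(page, per_page=10):
    # First request: run the query and cache all rows in the session.
    if "results" not in session:
        session["results"] = run_query_once()
    rows = session["results"]
    start = (page - 1) * per_page
    return rows[start:start + per_page]

print(get_page(1)[0])   # record 1 (the one DB hit happens here)
print(get_page(2)[0])   # record 11 (served from the session, no DB hit)
```

Note that with PHP sessions the cached rows live on the server; only a session ID travels in the cookie.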

Posted: Mon Oct 11, 2004 3:13 pm
by John Cartwright
The common way to do it is with a LIMIT $offset, $count clause; the offset changes when the page changes, therefore a new call to the database is required for each page.
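A runnable sketch of this approach, using Python with an in-memory SQLite database so it is self-contained (the thread's own code is PHP/MySQL, and the table and column names below are made up). SQLite is queried here with LIMIT ? OFFSET ?; MySQL also accepts the equivalent two-argument form LIMIT offset, count.

```python
import sqlite3

# Throwaway table with 25 sample records (purely illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)",
                 [("item %d" % i,) for i in range(1, 26)])

def fetch_page(page, per_page=10):
    """Run a fresh query per page: only per_page rows cross the wire."""
    offset = (page - 1) * per_page   # page 1 -> offset 0, page 2 -> 10, ...
    cur = conn.execute(
        "SELECT id, name FROM items ORDER BY id LIMIT ? OFFSET ?",
        (per_page, offset))
    return cur.fetchall()

print(len(fetch_page(1)))   # 10 rows on page 1
print(len(fetch_page(3)))   # only 5 rows left on page 3
```

The ORDER BY matters: without a stable ordering, LIMIT can hand back overlapping or missing rows across pages.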

...

Posted: Tue Oct 12, 2004 5:07 am
by Calimero
Session/cookie - this will be lots of data. Won't this have some undesired effects on the client's computer?

Posted: Tue Oct 12, 2004 5:18 am
by phpScott
You could create your own JavaScript objects that store the data, then have your next/prev page links fetch the right slice and do lots of innerHTML settings to redisplay the new data. Otherwise you will have to keep round-tripping to the DB for the new results and, as Phenom said, use the LIMIT clause in MySQL: LIMIT $startAtResult, $returnHowMany.

Posted: Tue Oct 12, 2004 5:31 am
by timvw
The problem with retrieving all the data is that in many cases you retrieve 10k+ records where the user will only view, let's say, 10 * 10.

And it would result in a painfully big session/JavaScript payload.

I prefer the "get what you need" principle, and for MySQL that means using the LIMIT clause.

Posted: Tue Oct 12, 2004 6:05 am
by phpScott
I agree, timvw, but it is an option you could use if you didn't want to keep round-tripping to the server. I wouldn't recommend it for medium to large sites, as the initial download could be very large and time-consuming.

phpScott

Posted: Tue Oct 12, 2004 6:06 am
by malcolmboston
phpScott wrote:I agree, timvw, but it is an option you could use if you didn't want to keep round-tripping to the server. I wouldn't recommend it for medium to large sites, as the initial download could be very large and time-consuming.

phpScott
Very true. I used this method and was getting parse times of circa 40 seconds on a 100k-item database; it is not an option for large sites.

...

Posted: Tue Oct 12, 2004 6:21 am
by Calimero
OK, I need to clarify this to myself.


If I decide to use sessions, cookies, JS or any client-side "technique", can this be done in my example, where I limit my query results to 100 inside the MySQL statement?

So my question is: can any of these 3 techniques be used in this example to store the data on the client side and paginate it there?

And I didn't quite get the point about "very big sites - time consuming...".

What point am I missing!?!?

Posted: Sun Oct 17, 2004 6:30 pm
by feyd
So my question is: can any of these 3 techniques be used in this example to store the data on the client side and paginate it there?

And I didn't quite get the point about "very big sites - time consuming...".

What point am I missing!?!?
I'd generally suggest not storing the information client-side:
  1. It can change while they are looking at it.
  2. It will take the browser a while to load a large page, since you'll have to send a LOT of data to the browser to create all the client-side scripting needed.
  3. The client-side scripting could easily be turned off, resulting in a breakdown of functionality.