[nycphp-talk] Paging through large result sets
ophir prusak
prutwo at onebox.com
Thu Aug 29 10:47:30 EDT 2002
Hi Mike,
You're on the right track, but it really depends on a number of variables.
Can you tell us the specifics of the application you have in mind?
If the site doesn't get too much traffic, and the original query is fast
enough for your needs, then who cares if it's not efficient :)
I'd only go with a different solution if the query takes a "long time".
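For reference, the simple version (just re-running the query with a LIMIT
clause on every page view) is only a few lines of PHP. Something like this,
with the connection details and table/column names invented for illustration:

<?php
// Simple paging: re-run the query with LIMIT on every page view.
mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb');

$perPage = 25;
$page    = isset($_GET['page']) ? (int) $_GET['page'] : 0;
$offset  = $page * $perPage;

// LIMIT offset, count -- MySQL runs the full query each time, which
// is fine as long as the query itself is fast enough.
$result = mysql_query("SELECT id, title FROM articles
                       ORDER BY created DESC
                       LIMIT $offset, $perPage");

while ($row = mysql_fetch_assoc($result)) {
    echo $row['title'] . "<br>\n";
}
?>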
The solution you propose would work, but is it worth the complexity?
I have an idea that would work if:
1. Using MySQL 4.x is an option (yes, I know it's beta, but it's really
quite stable)
2. You can pump the MySQL and PHP machines with memory
3. You can limit the max number of results (to like a few hundred)
What you could do is:
1. Do the complicated query with a limit of the max number of results
2. Read the whole thing into an array
3. Display array entries x to y (see the sketch just below)
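Here's a rough sketch of those three steps; the multi-join query and table
names are invented, and you'd tune $maxResults and $perPage to taste:

<?php
// 1. Run the complicated multi-join query once, capped at a few
//    hundred rows. The tables (users, posts, categories) are
//    invented for this example.
mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb');

$maxResults = 500;
$result = mysql_query("SELECT u.name, p.title
                       FROM users u
                       JOIN posts p ON p.user_id = u.id
                       JOIN categories c ON c.id = p.category_id
                       ORDER BY p.created DESC
                       LIMIT $maxResults");

// 2. Read the whole result set into a PHP array.
$rows = array();
while ($row = mysql_fetch_assoc($result)) {
    $rows[] = $row;
}

// 3. Display only entries x to y for the current page.
$perPage  = 25;
$page     = isset($_GET['page']) ? (int) $_GET['page'] : 0;
$pageRows = array_slice($rows, $page * $perPage, $perPage);
foreach ($pageRows as $row) {
    echo $row['title'] . "<br>\n";
}
?>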
MySQL 4.x has a built-in query cache, so re-running the same query will be FAST.
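Enabling the cache takes just a couple of lines in my.cnf (the 32M size
below is an arbitrary example):

[mysqld]
# Turn on the MySQL 4.x query cache. Identical SELECTs are served
# from memory until a write to an underlying table invalidates them.
query_cache_type = 1
query_cache_size = 32M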
It's basically the same idea you had, but instead of trying to cache the
results yourself, you're letting MySQL 4.x do it for you. Of course there
are plenty of other MySQL result-cache solutions out there if MySQL 4.x is
not an option or you need finer control over the cache.
Hope that helped.
Ophir
p.s.
We've actually done exactly what you describe on one of the projects at
the company where I work (Community Connect Inc).
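It was roughly along these lines; this is a simplified sketch with invented
table and column names, not our actual code:

<?php
// Cache the expensive query into a per-user table named after the
// session ID, then page against that small table with LIMIT. The
// cache_registry table (tbl is its primary key) records the last
// access time for the cleanup job.
session_start();
mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb');

$cache = "results_" . session_id();  // session IDs are alphanumeric

// First hit: materialize the expensive query into the cache table.
mysql_query("CREATE TABLE IF NOT EXISTS $cache
             SELECT u.name, p.title
             FROM users u JOIN posts p ON p.user_id = u.id");
mysql_query("REPLACE INTO cache_registry (tbl, last_used)
             VALUES ('$cache', NOW())");

// Every page view is then a cheap LIMIT against the small table.
$page   = isset($_GET['page']) ? (int) $_GET['page'] : 0;
$result = mysql_query("SELECT * FROM $cache LIMIT " . ($page * 25) . ", 25");

// A cron job can then DROP any table whose last_used stamp in
// cache_registry is older than, say, 30 minutes.
?>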
----
Ophir Prusak
Internet developer
prutwo at onebox.com | http://www.prusak.com/
---- Mike Myers <myersm at optonline.net> wrote:
>
> First: I am a newcomer to MySQL and PHP. So I will be asking many newbie
> questions, which I hope is not a drag!
>
> Environ: Macintosh G4 running OS X; MySQL and PHP installations provided
> by the inimitable Marc Liyanage. Webserver is Apache.
>
> Experience: Proficient with UserLand Frontier scripting and webserving;
> moderate experience with Perl and JavaScript. PHP looks pretty
> straightforward.
>
> I have already built a few MySQL databases from scratch, so I have some
> experience with the mysql client, mysqladmin, and mysqlimport. Lately I
> have also been using the web-based frontend phpMyAdmin.
>
> ----
>
> What are the implementation strategies for allowing a user to page
> through a result set that is too large to view in its entirety?
>
> I see that for simple queries (e.g. 1 table or 1 join), the LIMIT clause
> is the easy solution, which entails re-running the SQL query as the user
> browses.
>
> But what if the query has multiple joins? It seems inefficient to re-run
> the query each time. If I want to cache the original result in a new
> MySQL table, then I have to manage that on a per-user basis. This implies
> using cookies or a session ID to track the user.
>
> This also suggests I need to code a separate process that periodically
> drops these temporary result tables after a defined time has passed.
> Thus, I need to continually track the time of last access to each temp
> table. That data could go in another MySQL table, or a file of my own
> convention.
>
> Whew! There are probably other aspects of this arrangement that require
> management code.
>
> Am I on the right track here, or am I making it harder than it needs to be?
>
> -- mike myers