Best way to store 14,000 user tree?

Questions about MySQL, PostgreSQL, and most other databases, as well as using them with PHP, can be asked here.

Moderator: General Moderators

Ward
Forum Commoner
Posts: 74
Joined: Thu Jul 13, 2006 10:01 am

Best way to store 14,000 user tree?

Post by Ward »

I am working on a massive back office project which has a huge users table. Currently it has about 14,000 users, all stored recursively: each user has a userID and a parentID, both unsigned 32-bit ints. userID is the primary key and parentID is indexed. The tree needs to support unlimited parent/child levels, although currently it only goes about 20 levels deep.

As you can imagine, it is painfully slow to retrieve even part of the tree, much less the entire thing. Using a recursive function, it took about 10 minutes to parse and display the entire tree. I have used the parentID method successfully in the past, but never with this many records.

I have been experimenting with the 'nested set' model, where each record has a leftID and rightID. That method is faster for retrieving a small to medium sized piece of the tree, but is not really any faster for parsing the entire thing. I also tried storing userID and parentID in their own table, with a separate users_details table for all of the other fields (username, password, etc.). Even then, it wasn't much faster.

Has anyone ever had a problem like this before? The only other idea I had was to store a pre-built, cached copy of the user tree that is updated every 10 or 15 minutes.
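For context, the nested-set retrieval Ward mentions can be sketched as below. The schema and names (leftID, rightID, a tiny four-row tree) are hypothetical stand-ins for his table, and sqlite3 stands in for MySQL/PostgreSQL; the point is that any subtree comes back with a single range query on the indexed leftID column, with no recursion at all.

```python
import sqlite3

# Hypothetical nested-set schema: each row carries leftID/rightID bounds.
# A node's subtree is exactly the rows whose leftID falls inside its bounds.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE users (userID INTEGER PRIMARY KEY, name TEXT,"
    " leftID INTEGER, rightID INTEGER)"
)
conn.executemany(
    "INSERT INTO users VALUES (?, ?, ?, ?)",
    [
        (1, "root",  1, 8),
        (2, "alice", 2, 5),
        (3, "bob",   3, 4),   # child of alice
        (4, "carol", 6, 7),
    ],
)

def subtree(conn, user_id):
    """Fetch a user's entire subtree with one range query -- no recursion."""
    left, right = conn.execute(
        "SELECT leftID, rightID FROM users WHERE userID = ?", (user_id,)
    ).fetchone()
    return conn.execute(
        "SELECT name FROM users WHERE leftID BETWEEN ? AND ? ORDER BY leftID",
        (left, right),
    ).fetchall()

print([n for (n,) in subtree(conn, 2)])  # -> ['alice', 'bob']
```

The trade-off Ward is running into is real: nested sets make subtree reads one query, but reading the *whole* tree still returns every row either way, and updates are costlier because inserting or moving a node renumbers many left/right values.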
Weirdan
Moderator
Posts: 5978
Joined: Mon Nov 03, 2003 6:13 pm
Location: Odessa, Ukraine

Post by Weirdan »

Ward wrote:pre-built, cached copy of the user tree that is updated every 10 or 15 minutes.
Or better yet, updated on every tree modification.
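Weirdan's suggestion can be sketched like this: invalidate the cached tree inside the same code path that modifies it, rather than rebuilding on a timer. The cache dict and helper names here are hypothetical, and sqlite3 stands in for the real DBMS.

```python
import sqlite3

# Minimal sketch, assuming an adjacency-list users table as Ward describes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (userID INTEGER PRIMARY KEY, parentID INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, None), (2, 1), (3, 2)])

cache = {"tree": None}  # whatever pre-built form the app actually serves

def get_tree(conn):
    """Serve the cached copy; rebuild only after it has been invalidated."""
    if cache["tree"] is None:
        cache["tree"] = conn.execute(
            "SELECT userID, parentID FROM users ORDER BY userID"
        ).fetchall()
    return cache["tree"]

def move_user(conn, user_id, new_parent_id):
    """Every write that reshapes the tree also drops the stale cache."""
    conn.execute(
        "UPDATE users SET parentID = ? WHERE userID = ?",
        (new_parent_id, user_id),
    )
    cache["tree"] = None  # next get_tree() rebuilds

get_tree(conn)            # populate the cache
move_user(conn, 3, 1)     # reparent user 3 under user 1
print(get_tree(conn))     # -> [(1, None), (2, 1), (3, 1)]
```

Funneling all tree writes through functions like `move_user` is also what makes "capture every modification" tractable: if no code issues raw UPDATEs against the table, no modification can slip past the cache.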
Ward
Forum Commoner
Posts: 74
Joined: Thu Jul 13, 2006 10:01 am

Post by Ward »

That's a good point, although it might be tough to capture every modification. I wonder how places like banks, eBay, etc. handle such large databases efficiently.
Benjamin
Site Administrator
Posts: 6935
Joined: Sun May 19, 2002 10:24 pm

Post by Benjamin »

What is your current database structure? It shouldn't be taking 10 minutes to process 14,000 records. Either something isn't configured properly or the code needs to be optimized.
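Benjamin's point can be illustrated concretely. A 10-minute parse of 14,000 rows usually means one query per node (the N+1 pattern a naive recursive function produces). Fetching all rows in a single SELECT and assembling the tree in memory takes milliseconds. A minimal sketch, with hypothetical table/column names matching Ward's description and sqlite3 standing in for the real DBMS:

```python
import sqlite3
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (userID INTEGER PRIMARY KEY, parentID INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, None), (2, 1), (3, 1), (4, 2), (5, 4)])

def build_tree(conn):
    """ONE query for all rows, then an in-memory children map --
    no per-node round trips to the database."""
    children = defaultdict(list)
    roots = []
    for user_id, parent_id in conn.execute("SELECT userID, parentID FROM users"):
        if parent_id is None:
            roots.append(user_id)
        else:
            children[parent_id].append(user_id)
    return roots, children

def depth_first(roots, children):
    """Yield (userID, depth) in display order, iteratively (no recursion
    limit, no query inside the loop)."""
    stack = [(uid, 0) for uid in reversed(roots)]
    while stack:
        uid, depth = stack.pop()
        yield uid, depth
        stack.extend((child, depth + 1) for child in reversed(children[uid]))

roots, children = build_tree(conn)
print(list(depth_first(roots, children)))
# -> [(1, 0), (2, 1), (4, 2), (5, 3), (3, 1)]
```

Recursion over the in-memory map works too; the crucial part is that the database is hit exactly once, so the cost is one table scan plus O(n) assembly regardless of tree depth.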
ody
Forum Contributor
Posts: 147
Joined: Sat Mar 27, 2004 4:42 am
Location: Manchester, UK

Post by ody »

Ward wrote:That's a good point, although it might be tough to capture every modification. I wonder how places like banks, ebay, etc handle such large databases efficiently.
They employ people that know what they're doing when it comes to data handling. Anyway, can you provide more details? Knowing which DBMS you're running, plus a sample table dump and the query you're using, would help.