concurrent issue
Posted: Mon Feb 19, 2007 9:29 pm
by johnhelen
Hello all
In my PHP application, users enter URLs, and I have to record information such as the hit time and the URL.
I don't want to store that information directly in my database, since there may be a high number of hits per second. Instead, I first write it to a temporary file on the server; then, every 10 minutes, a cron job inserts the whole batch into the database. The problem is concurrency: while I am reading the file to move its data into the database, someone may be updating the file, and some information may be lost. What is the best way to avoid losing data in this case?
Many thanks for any help
sho
Posted: Mon Feb 19, 2007 9:32 pm
by Kieran Huggins
Just send it to the database right away... that's what it's designed for.
Use a persistent connection like mysql_pconnect() and the per-hit cost will be minimal.
Posted: Mon Feb 19, 2007 9:33 pm
by feyd
It really shouldn't hurt, or even noticeably slow, your database to have many inserts happening every second.
Alternatively, you could rename the cache file so the old data is moved out of the way before you read it.
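The rename approach could be sketched like this (file paths and line format are placeholders, not from the thread): each request appends one line under an exclusive lock, and the cron job renames the file before reading it. Since rename() is atomic on the same filesystem, new hits go to a fresh file while the old one is read safely.

```php
<?php
// Writer side (runs on every page hit): append one line under an
// exclusive advisory lock so concurrent writers don't interleave.
$fh = fopen('/tmp/hits.log', 'a');
if ($fh && flock($fh, LOCK_EX)) {
    fwrite($fh, time() . "\t" . $_SERVER['REQUEST_URI'] . "\n");
    flock($fh, LOCK_UN);
}
if ($fh) {
    fclose($fh);
}

// Cron side (every 10 minutes): rotate first, then read at leisure.
// Writers that open the path after the rename get a brand-new file,
// so no hits are silently dropped mid-read.
if (file_exists('/tmp/hits.log')) {
    rename('/tmp/hits.log', '/tmp/hits.log.processing');
    $lines = file('/tmp/hits.log.processing');
    // ... bulk-insert $lines into the database, then unlink() the file ...
}
?>
```

One caveat: a writer that opened the old path just before the rename may still be appending to the renamed file, so the cron job should take a shared lock (or briefly wait) before reading it.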
Thanks, but
Posted: Mon Feb 19, 2007 9:43 pm
by johnhelen
Thanks
But there may be many hits per second (up to 5000), and sometimes no hits at all.
Posted: Mon Feb 19, 2007 9:59 pm
by feyd
There will be 5000 per second the moment this goes live? If you are expecting that kind of volume, you should consider a dedicated database server. At that rate, you will have more concurrency problems with the file than with the database.
database is postgres
Posted: Mon Feb 19, 2007 10:07 pm
by johnhelen
We use a PostgreSQL database for a mobile system. 5000 is an upper bound, but it sometimes reaches 500-1000 hits/second (when there is a promotion). That's why we don't want to insert directly into the database.
Posted: Mon Feb 19, 2007 11:16 pm
by feyd
The likelihood is that you'll encounter concurrency problems with the file system far sooner than with the database at those levels. PHP won't be able to open, append to, and close the file fast enough.
Any idea
Posted: Tue Feb 20, 2007 2:16 am
by johnhelen
What is your suggestion in this case? I know that memcache can be used to store temporary information; however, I cannot iterate over all the items in memcache to save them into the database.
Thanks
Posted: Tue Feb 20, 2007 3:14 am
by Kieran Huggins
I still think your DB is the way to go. Maybe create a temporary table in memory (for faster writes) and dump it into the regular table every once in a while?
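A minimal sketch of that idea, assuming MySQL's MEMORY storage engine (table names, columns, and credentials are placeholders; on PostgreSQL, which johnhelen mentions, the closest equivalent would be a plain unindexed staging table, since a TEMP table is only visible to one session):

```php
<?php
// One-time schema setup (run once, not per request):
//   CREATE TABLE hits_buffer (url VARCHAR(255), hit_time DATETIME) ENGINE=MEMORY;
//   CREATE TABLE hits        (url VARCHAR(255), hit_time DATETIME);

$conn = mysql_pconnect('localhost', 'user', 'pass');
mysql_select_db('logs', $conn);

// Per request: a cheap insert into the in-memory buffer table.
$url = mysql_real_escape_string($_SERVER['REQUEST_URI'], $conn);
mysql_query("INSERT INTO hits_buffer (url, hit_time) VALUES ('$url', NOW())", $conn);

// Cron job (every few minutes): drain the buffer into the permanent table.
mysql_query("INSERT INTO hits SELECT * FROM hits_buffer", $conn);
mysql_query("DELETE FROM hits_buffer", $conn);
?>
```

The database then handles all the concurrency, which is the point both Kieran and feyd are making.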
Posted: Tue Feb 20, 2007 11:07 am
by pickle
Yeah, writing straight to a database is what you should do; concurrency is then handled by the database. feyd made a good point that the file system will have more problems with 1000 accesses/second than the database will.
If you need to store data quickly, use a database. If your database can't handle it, use a different database or a dedicated database box. There are pretty much no other options.
Posted: Tue Feb 20, 2007 7:45 pm
by johnhelen
Thanks all
I will try my current database first and see how it performs.
Posted: Wed Feb 21, 2007 2:26 am
by onion2k
Use delayed inserts.
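For context: delayed inserts are a MySQL-specific feature, so this wouldn't apply directly to johnhelen's PostgreSQL setup. With MySQL (MyISAM or MEMORY tables), INSERT DELAYED returns immediately and lets the server queue and batch the writes. A sketch, with table and credential names as placeholders:

```php
<?php
// MySQL only: INSERT DELAYED queues the row server-side and returns
// at once, instead of waiting for the write to complete.
// Not supported by PostgreSQL.
$conn = mysql_pconnect('localhost', 'user', 'pass');
mysql_select_db('logs', $conn);

$url = mysql_real_escape_string($_SERVER['REQUEST_URI'], $conn);
mysql_query("INSERT DELAYED INTO hits (url, hit_time) VALUES ('$url', NOW())", $conn);
?>
```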