Looking for best primary key
Posted: Tue Aug 03, 2004 9:49 am
by AGISB
I want to log login attempts: after 3 failed logins with the same username, the account is disabled for 5 minutes.
I now have the following table to log the data in MySQL:
TStamp as timestamp
User as char(15)
IP_Address as char(16)
Session as char(32)
I was thinking of using the timestamp as the primary key, since this logfile is not very important. If 2 users fail a login in the same second, I would simply catch the MySQL error and not write the 2nd login attempt.
Somehow I am not satisfied with this, as it is a kind of faulty programming.
An auto_increment key would have to be reset somehow, as it might grow big fast in an environment with many short-lived log entries.
Any ideas how to solve this best?
Posted: Tue Aug 03, 2004 10:15 am
by kevin7
why don't you create another column to uniquely identify each row...
such as an "id" column...
Posted: Tue Aug 03, 2004 12:06 pm
by AGISB
This is what the auto_increment primary key comment in my initial post was about. If I use an id column, I would need to reset the auto_increment value at some point, since the id will grow very fast: this is a logfile whose entries are deleted within minutes.
Posted: Wed Aug 04, 2004 3:06 am
by AGISB
I think I will just add the microsecond part of microtime() to the primary key.
This should make it nearly impossible for two inserts to collide on the same value. In case someone has the same problem, here is what I did:
Code: Select all
<?php
// microtime() returns a string "msec sec"; take the fractional
// part and scale it to whole microseconds.
$field = explode(" ", microtime());
$micro_sec = $field[0] * 1000000;
?>
This comes out as a MEDIUMINT-sized value.
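For readers without PHP at hand, the same parsing can be sketched in Python (the input string mimics the "msec sec" format that PHP's argument-less microtime() returns; note that two requests landing in the same microsecond could still collide):

Code: Select all

```python
# PHP's microtime() (no arguments) returns "msec sec", e.g.
# "0.25000000 1091592000". The PHP snippet above scales the
# fractional part to whole microseconds; the same idea here:
def micro_seconds(microtime_output):
    msec, _sec = microtime_output.split(" ")
    return int(round(float(msec) * 1000000))

print(micro_seconds("0.25000000 1091592000"))  # → 250000
```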
Posted: Wed Aug 04, 2004 3:29 am
by JAM
Would this actually need a primary key field? You could do a custom version of the "users online" script that is very common out there.
Store the user (id or name) along with a timestamp in the database, without any primary key.
Upon each failed login, you "insert into table values('user', 'timestamp')".
Then you "delete from table where timestamp < now()-300" to erase all traces of users who failed a login 5 minutes (or more) ago.
Then "select count(user) from table where user = 'user'" to get the number of tries he/she has made...
If this is more than 3, well...
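A runnable sketch of those three steps, using SQLite and plain epoch-second timestamps instead of MySQL (the table and column names here are made up for illustration):

Code: Select all

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE failed_logins (user TEXT, ts INTEGER)")

def record_failure(user, now):
    # 1. log the failed attempt
    conn.execute("INSERT INTO failed_logins VALUES (?, ?)", (user, now))
    # 2. erase attempts older than 5 minutes; ts holds plain seconds,
    #    so "- 300" really means 300 seconds
    conn.execute("DELETE FROM failed_logins WHERE ts < ?", (now - 300,))
    # 3. count what is left for this user
    (n,) = conn.execute(
        "SELECT COUNT(user) FROM failed_logins WHERE user = ?", (user,)
    ).fetchone()
    return n

t = int(time.time())
for _ in range(3):
    tries = record_failure("alice", t)
print(tries)   # → 3: third failure inside the window, lock the account

later = record_failure("alice", t + 400)
print(later)   # → 1: the three old rows were purged first
```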
Hope I made myself understandable.

Users-online scripts can be found using Google, or browse the EvilWalrus site for more ideas.
Posted: Wed Aug 04, 2004 5:08 am
by AGISB
I was actually under the impression that a MySQL table needs a primary key.
If it doesn't, this is surely the best way then. I need to look into that.
Posted: Wed Aug 04, 2004 5:16 am
by AGISB
JAM wrote:
Upon each failed login, you "insert into table values('user', 'timestamp')".
Then you "delete from table where timestamp < now()-300" to erase all traces of users that failed login 5 minutes (or more) ago.
Then "select count(user) from table where user = 'user'" to get the number of tries he/she has made...
I am wondering if the
Code: Select all
delete from table where timestamp < now()-300
would give the needed results. Wouldn't now() give you a MySQL timestamp? If so, you cannot just subtract 300, as that would come out to only 3 minutes instead of 5, because the last 2 digits of that timestamp never go higher than 59. It gets even worse if you want to do something like 120 secs. If I use 500 it would somehow work, but that would seriously hurt the readability of my code.
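The worry is justified for the MySQL versions of that era: NOW() used in a numeric context collapses to a number shaped YYYYMMDDHHMMSS, so subtracting 300 does not subtract seconds. A small Python illustration of the failure mode (the times are hypothetical):

Code: Select all

```python
from datetime import datetime, timedelta

now = datetime(2004, 8, 4, 12, 1, 0)
now_num = int(now.strftime("%Y%m%d%H%M%S"))        # 20040804120100

attempt = now - timedelta(minutes=2, seconds=30)   # 11:58:30, only 2.5 min old
attempt_num = int(attempt.strftime("%Y%m%d%H%M%S"))

# Naive subtraction: the threshold becomes 20040804119800, which encodes
# the impossible time 11:98:00, and the 2.5-minute-old row falls below it.
print(attempt_num < now_num - 300)             # → True: row wrongly deleted

# Proper datetime arithmetic gives the intended answer.
print(attempt < now - timedelta(seconds=300))  # → False: row should stay
```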
Posted: Wed Aug 04, 2004 10:23 am
by JAM
Forgive my ignorance; I thought of using another function before deleting/selecting the data:
Code: Select all
delete from test where unix_timestamp(timest) < unix_timestamp(now()) - 300
By using the MySQL function unix_timestamp() (thus converting timestamps into plain seconds), the comparison becomes straightforward.
To comment on the first post after mine: you actually don't need a primary key. It has advantages in a lot of cases, but as you are deleting rows each time a user is verified, the table shouldn't grow to such proportions that you'd need one.
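For completeness, here is that corrected comparison in runnable form, using SQLite in place of MySQL (SQLite's strftime('%s', ...) plays the role of unix_timestamp(), and the "current" time is pinned to a fixed value so the example is reproducible):

Code: Select all

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (user TEXT, timest TEXT)")
conn.execute("INSERT INTO test VALUES ('old', '2004-08-04 11:50:00')")
conn.execute("INSERT INTO test VALUES ('new', '2004-08-04 11:59:00')")

# Both sides converted to plain seconds, so "- 300" really means
# five minutes ("+ 0" coerces strftime's text result to a number).
conn.execute(
    "DELETE FROM test WHERE strftime('%s', timest) + 0 < "
    "strftime('%s', '2004-08-04 12:01:00') - 300"
)
remaining = [row[0] for row in conn.execute("SELECT user FROM test")]
print(remaining)  # → ['new']: the 11-minute-old row is gone
```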