IRC Bot
Posted: Thu Jun 10, 2004 4:31 am
I have the following specifications for an IRC bot I would like... I have some ideas, but I would love some help getting this programmed.
Here's how the bot moderator works:
1) You register with the bot using your registered nick. No password is needed since you have already identified with nickserv. You have to be in the channel under the nick you registered with the bot.
2) You then can rate other members in the channel like this:
/msg bot trust nick
or
/msg bot distrust nick
Depending on your own trust rating in the channel, your trust/distrust will get stronger/weaker.
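Here's a rough sketch in Python of how the registration and trust/distrust commands could be handled. All the names here (TrustBot, handle_msg, the owner nick) are placeholders I made up, not a real IRC library API -- it just models the bookkeeping described above.

```python
# Sketch of the register/trust/distrust bookkeeping. Assumes the IRC layer
# has already confirmed the sender's nick with nickserv before we see it.

class TrustBot:
    STARTING_RATING = 10  # new users start at 10, per the spec

    def __init__(self, owner):
        self.ratings = {owner: 100}  # the bot owner is trusted at 100
        self.votes = {}              # (rater, target) -> +1 trust / -1 distrust

    def register(self, nick):
        # nickserv identification is assumed to have happened already
        self.ratings.setdefault(nick, self.STARTING_RATING)

    def handle_msg(self, sender, text):
        # Handles the payload of "/msg bot trust <nick>" or "distrust <nick>".
        parts = text.split()
        if len(parts) == 2 and parts[0] in ("trust", "distrust"):
            verb, target = parts
            if sender in self.ratings and target in self.ratings:
                self.votes[(sender, target)] = 1 if verb == "trust" else -1
                return True
        return False  # unknown command or unregistered nick
```

Repeating a trust/distrust command just overwrites your previous vote on that nick, so nobody can stack votes on the same target.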
I will have a trust rating of 100. New users will have a trust rating of 10. Passing on trust must take several factors into consideration:
a) Number of people who trust/distrust you
b) Strength of that trust
c) Length of trust chain
Obviously, if many people with a high trust rating link to you, you will have a higher level of trust. Trust is not cumulative, so if ten people with ratings of 90 trust you, you won't get 900 points -- all you get is a 90 rating, with enough protection to withstand ten other 90-rated people who distrust you.
Active participants in the channel constitute the working set for voting. If someone is not active, they cannot be affected by a vote. A majority of the people voting will enact a devoice or a wildcard ban; the majority is measured by total trust rating, not by head count.
So if someone comes in and is extremely problematic, requiring a ban, then the bot will be asked to announce a poll, and people msg the bot as follows:
/msg bot ban nick
/msg bot dont ban nick
People who become active during voting can vote. Those who are newly joined cannot vote.
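The trust-weighted tally could look something like this (again just a sketch; the function and parameter names are mine):

```python
def tally_ban_vote(votes, ratings, active):
    """Weighted-majority tally: 'votes' maps nick -> True (ban) or
    False (don't ban). Only nicks in the 'active' working set count,
    and each vote is weighted by the voter's trust rating."""
    yes = sum(ratings.get(n, 0) for n, v in votes.items() if v and n in active)
    no = sum(ratings.get(n, 0) for n, v in votes.items() if not v and n in active)
    return yes > no  # True -> the ban/devoice passes
```

This is why a single 90-rated regular outweighs a handful of freshly registered 10-rated accounts, which is the point of weighting by trust rather than counting heads.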
In addition to the voting mechanism, the bot will also be able to detect flooding behavior, join floods, and other repetitive behavior. The limit will be set very high, so the channel will have to endure some abuse, but we do not want to accidentally ban someone who is simply a fast typist.
We'll prefer devoicing to banning unless a repetitive offense occurs (e.g. join floods, nick floods). Hopefully, chanserv/nickserv can detect this sort of thing and help us out.
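For the flood detection, a sliding-window counter per nick would do; the numbers below are deliberately generous placeholders so a fast typist never trips it, as described above.

```python
import time
from collections import defaultdict, deque

class FloodGuard:
    """Sliding-window rate check per nick. Thresholds here are
    placeholder values, to be tuned high per the spec."""

    def __init__(self, max_events=20, window=10.0):
        self.max_events = max_events   # events allowed inside the window
        self.window = window           # window length in seconds
        self.history = defaultdict(deque)

    def record(self, nick, now=None):
        """Record one event (message, join, nick change) for this nick.
        Returns True if the nick has exceeded the limit (is flooding)."""
        now = time.monotonic() if now is None else now
        q = self.history[nick]
        q.append(now)
        while q and now - q[0] > self.window:
            q.popleft()  # drop events that fell out of the window
        return len(q) > self.max_events
```

The same counter can be reused for join floods and nick floods by feeding it joins or nick changes instead of messages.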