Every so often I see a continual string of page requests from the same IP which look like:
//phpMyAdmin-2.5.5-rc1/scripts/setup.php
//phpMyAdmin-2.6.0-rc3/scripts/setup.php
//phpMyAdmin-2.6.1-rc3/scripts/setup.php
...
I'm not sure how phpMyAdmin is actually configured on my host, so I was wondering if there is something here I should be concerned about. Whatever these tools are looking for, I'll get about 200 variations of version and RC numbers, so it seems like it could be something vulnerable; otherwise, why bother? (FYI, in my cPanel setup I assume there are no phpMyAdmin files actually in the hosted file area, because I don't see anything in public_html. I also assume cPanel serves phpMyAdmin over some other port, because that is how I access it.)
Why the extra leading slash? Normal page requests are usually preceded by a single one. Is this a hacker with a typo?
I also see things like
//db/scripts/setup.php
//myadmin/scripts/setup.php
//mysql/scripts/setup.php
//pma/scripts/setup.php
I assume these are all packages with known exploits or something, but I'm not familiar with them because everything on my site is hand coded.
Other than sucking my bandwidth is there anything I should be worried about?
phpMyadmin and other automated page requests
- mecha_godzilla
- Forum Contributor
- Posts: 375
- Joined: Wed Apr 14, 2010 4:45 pm
- Location: UK
Re: phpMyadmin and other automated page requests
I'm not sure why the double slashes (//) are used, but it might be worthwhile checking what the user-agent for these requests is and blocking access based on that. This obviously isn't fool-proof, but it might deter the ubiquitous 'script kiddies' who don't understand what's going on.
I have the following at the top of my .htaccess file saved in the web root:
SetEnvIfNoCase User-Agent "libwww-perl" possible_site_compromise
SetEnvIfNoCase User-Agent "Indy Library" possible_site_compromise
Deny from env=possible_site_compromise
It might also be possible to use wildcards but I haven't tried this (a user-agent that's blank would also be suspicious).
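A missing User-Agent header won't be caught by the SetEnvIfNoCase rules above, since there is nothing for the pattern to match. One possible sketch using mod_rewrite (assuming the host has it enabled; this is an illustration, not a tested rule set):

```apache
# Sketch only: reject requests whose User-Agent header is empty or missing
# (mod_rewrite sees a missing header as an empty string). Some clients send
# a literal "-", which the optional hyphen also catches.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]
```

As with the SetEnvIfNoCase approach, this only deters lazy scanners; anything competent can send a plausible User-Agent.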
You could also deny requests by IP, of course (from experience, most attempts I've seen have come from open proxy servers, though there are blacklists available that you could use). How phpMyAdmin is configured depends on whether you're using a shared hosting service or not; the shared one that I use has a completely different address for PMA than the domain name used for my site. On a dedicated server, however, it will probably be accessible at http://www.mydomain.com/phpmyadmin/ or something very similar.
I don't think there's anything you can do to stop people trying to query what files you've got on your server though. The reason why the different version numbers of phpMyAdmin are being tested is because script kiddies will often find details about an exploit on one of the many 'security' sites out there and then plug these values into Google (e.g. searching for "Made with UnSafeApp v1.2"), with the alternative being to run a script that requests the URLs directly.
HTH,
Mecha Godzilla
Re: phpMyadmin and other automated page requests
Well, I have a script that tracks all the 404 errors, and I've set up some code to block IPs that request certain known bad links like "//mysql-admin/scripts/setup.php" for 24 hours. However, I haven't turned the blocking on yet; I've just been watching to see who/where/what is going on.
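As a sketch of that kind of 404 trap, the path-matching part might look like this in PHP (the path list, function name, and logging idea are illustrative, not taken from the poster's actual script):

```php
<?php
// Sketch: decide whether a requested URI matches a known scanner probe.
// The $badPaths entries below are examples, not an authoritative list.
function is_probe($uri, array $badPaths)
{
    foreach ($badPaths as $needle) {
        // Case-insensitive substring match is enough for these probe URLs.
        if (stripos($uri, $needle) !== false) {
            return true;
        }
    }
    return false;
}

$badPaths = array('/scripts/setup.php', '/myadmin/', '/pma/');
// In a real custom 404 handler you would log $_SERVER['REMOTE_ADDR'] and
// the URI when is_probe() returns true, then return the usual 404 page.
```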
The user agent is usually blank and the IPs that run the scans usually don't come back again. Their requests are rapid, probably about as fast as my host can serve them. I've been thinking of just throttling back any IP that requests multiple bad links within a short time span.
My main concern, of course, is that I'm missing something that they are finding... Do you recognize the tool they might be running? I would like to try it out myself. Some of the versions of phpMyAdmin they are searching for are ancient, so it leads me to believe it's just some scripting tool. I've been reading exploits on these things and not really finding much relevant to my setup.
I could do a user agent filter too. That might be a good idea, but if they know what they are doing it is pretty easy to spoof.
Re: phpMyadmin and other automated page requests
As you mention, denying access to your server based on information such as the user-agent has limited use, but it might be a good way to "weed out the bozos".
Speaking as someone who works with PHP and Perl fairly regularly, it's relatively easy to write a script that automates web requests, but it's equally easy to download these scripts, either from a 'security' site or from the hacking forums. Bear in mind that even though the versions of phpMyAdmin may look old, a particular version might still be in use because it's compatible with the PHP/MySQL versions somebody has installed on their system; some people and companies prefer to deploy stable (read: older) versions of some OSs, Debian and CentOS being two particular cases. As a case in point, the dedicated server that I'm working on, which has only just been set up, is running PHP v5.1.6...
What's more of a problem are the open proxies, which can easily be exploited to mask the identity of the culprit, but there are blacklists available, so I suggest you look into this (try Googling "open proxy blacklist" or similar).
I think the way to minimise attacks on your site is firstly to make sure you know exactly what services are running on your server (you can use chkconfig for this on Linux, as well as ps and netstat), and to use non-standard port numbers for any services you need to keep private, e.g. SSH (obviously update your firewall settings if you do this). If you use software to confuse and restrict port scanners, you can minimise the likelihood that someone will start trying to hack your Apache install or MySQL database. You can also alter the Apache and PHP config settings to make sure they don't report their version numbers when queried. Finally, if you do need to run something like phpMyAdmin, you could make it available from a non-standard directory or put a .htaccess file in place as an extra deterrent to stop someone from getting to the login screen.
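The version-hiding settings mentioned above would look something like this (one fragment for the main Apache config, one for php.ini; exact file locations vary by distribution, and ServerTokens isn't valid in .htaccess):

```apache
# httpd.conf: send just "Server: Apache" with no version details, and
# omit the version footer from server-generated error pages.
ServerTokens Prod
ServerSignature Off

# php.ini equivalent (drops the X-Powered-By header revealing PHP's version):
# expose_php = Off
```

None of this stops an attack on its own; it just avoids advertising exact version numbers to casual scanners.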
HTH,
M_G
Re: phpMyadmin and other automated page requests
Unless you have a multi-server, load-managed cluster (considerable hardware infrastructure), you do not want to start going down the road of blacklists. It may very well become more resource-consuming than the requests themselves.
You end up blocking many valid requests, because the majority of the requests will be coming from dial-up and some DSL internet service providers with floating IP systems (and proxy services), whereby the subscriber's IP changes every time they connect. This means that when they "hang up", or log back in after turning their computer off and on, their IP will change and the previous IP they used will be "recycled" to another quite valid user.
Additionally, some major ISPs use proxies in a very legitimate fashion, so this will also give false positives, blocking legitimate users. I had a customer a few years back who was complaining that their hits decreased dramatically after they started banning some abusers of their product forums. They had blocked AOL's IP, which was a proxy, thereby cutting off AOL subscribers.
Lastly, the sheer size of these blacklist files can grow to enormous proportions and can noticeably decrease response and display times; overall server performance can really slow down just parsing them for every request, especially if you import cached abuse lists from 3rd-party sources.
I view security as anti-intrusion, data protection, and privacy protection... ignore the logs unless they are causing some type of service interruption or insecurity, else you will spend way too much wasted time and lose valid visitors in the process.
Banning IPs as a defense is, for the most part, an antiquated, unreliable relic.
It takes your server almost no resources to return 404 headers unless your server is being targeted by a DoS or DDoS with thousands or tens of thousands of simultaneous requests.
Re: phpMyadmin and other automated page requests
Yeah, I agree, and I haven't really bothered to turn it on. It was just a brainstormed defense. Disappointing to hear you call it a relic when I only just thought it up.
The user-agent filter wouldn't work either, for similar reasons, not to mention the fact that many of them appear with valid user agents. I was thinking of just throttling them back if I get, say, 60 requests from the same IP in a minute, but I haven't found a good way to do this in PHP yet. The best place would be on the server itself, but I just have a hosted site.
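One way to sketch that kind of throttle in plain PHP on shared hosting is a file-based per-IP counter (the 60-per-minute limit, function name, and state-file path are assumptions for illustration):

```php
<?php
// Sketch: count requests per IP within a sliding window using a small
// state file. Threshold, window, and file path are illustrative only.
function over_limit($ip, $limit, $window, $stateFile, $now)
{
    $hits = array();
    if (is_readable($stateFile)) {
        $hits = unserialize(file_get_contents($stateFile)) ?: array();
    }
    // Keep only this IP's timestamps that are still inside the window.
    $recent = array();
    foreach (isset($hits[$ip]) ? $hits[$ip] : array() as $t) {
        if ($now - $t < $window) {
            $recent[] = $t;
        }
    }
    $recent[] = $now;
    $hits[$ip] = $recent;
    file_put_contents($stateFile, serialize($hits), LOCK_EX);
    return count($recent) > $limit;
}

// Possible usage at the top of a page: refuse service (or sleep) when an
// IP exceeds 60 hits in 60 seconds.
// if (over_limit($_SERVER['REMOTE_ADDR'], 60, 60, '/tmp/ratelimit.dat', time())) {
//     http_response_code(503);
//     exit;
// }
```

The state file has to live somewhere the web server can write, and serializing one flat array won't scale to heavy traffic; this only shows the windowed-counter idea, not a production rate limiter.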
I'd really just like to know what tool they are using. I guess I'll have to try to get BackTrack running and look for something there to test against my site. It seems like each person doing this is pulling links from a standard database.
I don't think phpMyAdmin can be accessed directly via the httpd service on my host site; I think cPanel serves it through a separate port. But I've never really looked closely at it. Perhaps when I get to a real internet connection I'll sit down and research this some more.