Discussion of testing theory and practice, including methodologies (such as TDD, BDD, DDD, Agile, XP) and software - anything to do with testing goes here. (Formerly "The Testing Side of Development")
I ran a few tests on the demo and it seems to work. It even worked when I put a dodgy image through a couple of filters. I'm actually very impressed. The only 'problem' with it is that it's PHP 4 code. That's not ideal, but it's not a big issue.
How the hell would something like this work? What is considered an adult image?
My only guess is that it works by having some predefined palette of skin tones and then seeing if a certain percentage of the image is made up of those colours.
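If that guess is right, the idea can be sketched in a few lines. This is purely illustrative (the demo itself is PHP 4, and the RGB ranges below are one common rule-of-thumb skin test, not the demo's actual palette; the 40% threshold is an assumption too):

```python
def looks_like_skin(r, g, b):
    """Rough RGB skin-tone test (a common rule of thumb; purely illustrative)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (max(r, g, b) - min(r, g, b)) > 15
            and abs(r - g) > 15)

def skin_fraction(pixels):
    """Fraction of (r, g, b) pixel tuples that match the skin palette."""
    if not pixels:
        return 0.0
    hits = sum(1 for r, g, b in pixels if looks_like_skin(r, g, b))
    return hits / len(pixels)

def flag_image(pixels, threshold=0.4):
    """Flag the image if more than `threshold` of its pixels look like skin."""
    return skin_fraction(pixels) > threshold

# Mostly skin-toned pixels get flagged; a mostly blue image does not.
skin_heavy = [(210, 150, 120)] * 80 + [(30, 60, 200)] * 20
sky_heavy = [(30, 60, 200)] * 90 + [(210, 150, 120)] * 10
print(flag_image(skin_heavy), flag_image(sky_heavy))  # True False
```

Note that a test like this depends entirely on colour, which would explain odd behaviour on greyscale or heavily filtered images.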
jayshields wrote:How the hell would something like this work? What is considered an adult image?
My only guess is that it works by having some predefined palette of skin tones and then seeing if a certain percentage of the image is made up of those colours.
It correctly flagged an image I'd converted to black and white...
I'm not so confident in this product. I'll happily admit it's not an easy task (I'm not sure where I'd even begin), but of the handful of quite clearly "not clean" pics I've just tried, it didn't correctly identify any of them.
Using this at production scale could offend more people by denying legitimate images. It could work in practice if it were more "trainable", with adjustable thresholds, and you only had it notify an admin; used in conjunction with pre-moderation it could cut time out of the moderators' schedule. I don't see it eliminating the need for moderation any time soon, especially because it only takes a sarcastic kid like me sitting there trying different images until I bypass the filter.
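That notify-an-admin idea could look something like this: instead of a single hard accept/reject cutoff, borderline scores go to a human review queue, and both thresholds are site-tunable. A minimal sketch (the threshold values and route names here are hypothetical, not anything the demo provides):

```python
def route_image(skin_score, reject_above=0.8, review_above=0.4):
    """Route an upload based on a classifier score in [0, 1].

    Instead of auto-rejecting on one hard cutoff, only very high scores
    are held outright; borderline images go to a moderator's review
    queue, and everything else is published immediately.
    """
    if skin_score > reject_above:
        return "hold"     # hidden until a moderator confirms
    if skin_score > review_above:
        return "review"   # published workflow pauses; admin gets notified
    return "publish"

print(route_image(0.9), route_image(0.5), route_image(0.1))
# hold review publish
```

Tuning `review_above` down and `reject_above` up trades moderator time for fewer false rejections, which is the "only notify an admin" compromise described above.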