I have a logfile that I need to search, so I whipped out a PHP program to do it. The logfile is just a tad over 1GB (ouch!) with over 10 million lines. The problem I have is that for some reason fgets() stops reading at line 2,578,845. Also, after we chopped the logfile up into smaller files, the exact same code works perfectly on each of the smaller files if I search them one at a time.
Thoughts, comments?
Is there some limitation on file pointers in PHP?
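The search loop is basically this (simplified; the function name, path, and search string here are just placeholders, not my actual code):

```php
<?php
// Minimal sketch: stream a large log file line by line with fgets()
// and print lines containing a plain substring.
function search_log(string $path, string $needle): int
{
    $matches = 0;
    $fp = fopen($path, 'r');
    if ($fp === false) {
        fwrite(STDERR, "cannot open $path\n");
        return -1;
    }
    while (($line = fgets($fp)) !== false) {
        if (strpos($line, $needle) !== false) {
            echo $line;
            $matches++;
        }
    }
    // If the loop ends early, feof() distinguishes a genuine
    // end-of-file from a read that failed partway through.
    if (!feof($fp)) {
        fwrite(STDERR, "warning: stopped before end of file\n");
    }
    fclose($fp);
    return $matches;
}
```

The feof() check at the end is how I noticed it was stopping early rather than actually reaching the end of the file.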
I just can't imagine how it can be all that efficient to have a 1GB log file... My only suggestion would be to look into logrotate. I don't know the exact limitations of fgets(), but logrotate would seem to make things more manageable.
(For some reason I can't find logrotate on freshmeat, but it shouldn't be hard to find on Google.)
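For what it's worth, a typical logrotate stanza looks something like this (the path and retention settings are just examples, not a recommendation for your setup):

```
/var/log/myapp.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```

That would keep eight compressed weekly files instead of one giant one, so no single file ever gets big enough to trip whatever limit you're hitting.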