Last in, First out, csv

PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Here is where to ask. Remember to do your homework!

Moderator: General Moderators

shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Last in, First out, csv

Post by shaitand »

I am writing a simple download tracker script.

This script keeps a log of downloads with a date stamp. But I don't want to log repeat downloads in the same day.

My problem is that I simply can't find any combination of PHP file operations that allows last in, first out reading of a text file. If I use fputs() and fgets() or similar, I have to parse the entire log to get to today's entries, which are at the end of the file. I can't seem to find a way to start parsing the file line by line from the end rather than the beginning. As an alternative, appending new entries at the beginning of the file would work, but there doesn't seem to be a way to do that either.

There has to be a simple way to do this that I am missing. The only approaches I can find would require reading the entire file into memory or into another file, and that would be the slowest possible method. The whole idea is that newer log entries are accessed more frequently and should be parsed first, hopefully avoiding parsing the majority of the file.
Last edited by shaitand on Sun Jul 29, 2007 11:25 am, edited 1 time in total.
superdezign
DevNet Master
Posts: 4135
Joined: Sat Jan 20, 2007 11:06 pm

Post by superdezign »

Well, a database would make more sense, as you could just limit the amount you take in at a time...

Why don't you just append to the beginning of the file instead of the end? That way you could read a certain number of lines and they'd already be in order.
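For reference, a minimal sketch of the database route superdezign suggests, assuming SQLite via PDO; the table, columns, and filename are all made up for illustration:

```php
<?php
// Hypothetical SQLite download log via PDO. Table and column names,
// the filename, and the IP address are all assumptions for the sketch.
$db = new PDO('sqlite:' . __DIR__ . '/downloads.db');
$db->exec('CREATE TABLE IF NOT EXISTS downloads (
    day TEXT NOT NULL,
    ip  TEXT NOT NULL,
    UNIQUE (day, ip)
)');

// INSERT OR IGNORE skips repeat downloads from the same IP on the same day.
$stmt = $db->prepare('INSERT OR IGNORE INTO downloads (day, ip) VALUES (?, ?)');
$stmt->execute(array(date('Y-m-d'), '203.0.113.5'));

// Newest entries first, and only as many as needed -- no file parsing.
$recent = $db->query(
    'SELECT day, ip FROM downloads ORDER BY rowid DESC LIMIT 10'
)->fetchAll(PDO::FETCH_ASSOC);
```

The UNIQUE constraint plus INSERT OR IGNORE handles the "no repeat downloads in the same day" requirement directly, and LIMIT keeps reads bounded no matter how large the log grows.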
shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Post by shaitand »

superdezign wrote:Why don't you just append to the beginning of the file instead of the end? That way you could read a certain number of lines and they'd already be in order.
I'd love to. That's why I asked how :wink:
superdezign
DevNet Master
Posts: 4135
Joined: Sat Jan 20, 2007 11:06 pm

Post by superdezign »

You could always do a file_get_contents() (or fread(), I think) to get the whole file, prepend the new line, and then fwrite() the whole thing back.

I still recommend using a database. Otherwise you'll run into times when the same file is being accessed by more than one request at once.
shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Post by shaitand »

superdezign wrote:You could always do a file_get_contents() (or fread(), I think) to get the whole file, prepend the new line, and then fwrite() the whole thing back.
That would defeat the whole point, since parsing more of the file than necessary is what I'm trying to avoid.

I appreciate you taking the time to help, I really do. But I have already considered these options and said as much in my initial post.
superdezign wrote:I still recommend using a database. Otherwise, you'll run into times when the same file is trying to be accessed from more than one request.
I'm using file locking to take care of that; it isn't perfect, but it will do for this. Speed is my primary concern, and I'd like to avoid the overhead of a more complex database (technically this is already a CSV flat-file database).
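The file locking shaitand mentions usually looks like this with flock(); the filename and IP address here are placeholders, and note that flock() is advisory, so every writer has to cooperate:

```php
<?php
// Sketch of advisory locking around a log append. The filename and the
// IP address are placeholders, not from the original script.
$log = fopen('locked.csv', 'a');
if ($log !== FALSE) {
    if (flock($log, LOCK_EX)) {   // block until we hold an exclusive lock
        // Fixed-width 42-byte record, matching the sprintf("%42s", ...) scheme.
        fputs($log, sprintf("%42s", date("d,m,y,") . '203.0.113.5' . "\n"));
        fflush($log);             // push the write out before releasing the lock
        flock($log, LOCK_UN);
    }
    fclose($log);
}
```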
s.dot
Tranquility In Moderation
Posts: 5001
Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana

Post by s.dot »

Get it into an array, and array_unshift() it, to add an element onto the beginning. You can check for duplicates by using array_unique(), and you can sort the array any way you want using any of the array sorting functions like asort(), sort(), arsort(), etc.
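Sketched out, that approach looks like this (filename and entry are placeholders); note the whole file still gets read into memory:

```php
<?php
// Sketch of the array approach: read every line into an array, prepend
// the new entry, drop exact duplicates, write it all back. The filename
// and IP address are placeholders.
$file  = 'array_log.csv';
$lines = is_file($file) ? file($file, FILE_IGNORE_NEW_LINES) : array();

array_unshift($lines, date('d,m,y,') . '203.0.113.5'); // newest on top
$lines = array_unique($lines);                         // remove exact repeats

file_put_contents($file, implode("\n", $lines) . "\n");
```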
shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Post by shaitand »

scottayy wrote:Get it into an array, and array_unshift() it, to add an element onto the beginning. You can check for duplicates by using array_unique(), and you can sort the array any way you want using any of the array sorting functions like asort(), sort(), arsort(), etc.
You would have to read the entire file to put it into an array, though, and that is what I am trying to avoid. :wink:

BUT, I found the answer I was looking for. It was so obvious it smacked me upside the head. Using sprintf() I was able to pad my entries with spaces so that they are all a fixed length. At that point I can calculate the location of any entry in the file and fseek() right to it. So I just wrote a function that jumps to the end of the file and steps back one record at a time.

If anyone else is drawing a blank on this, here is the function I wrote. I used fgets() to actually read the records because I terminate them all with newlines, but fread() should work too (I'm no PHP guru, so I'm sure there is room for improvement).

Code:

#Get last record in file.
function getLastRec($handle,$recsize){
	if(ftell($handle) === FALSE){
		 return FALSE;
	}
	$pos = ftell($handle);
	if($pos == 0){
		fseek($handle,1,SEEK_END);
		fgets($handle);
		return FALSE;
	}
	fseek($handle,$recsize,SEEK_CUR);
	#Save position
	$pos = ftell($handle);
	
	#Read data
	$buff=fgets($handle);

	#Move cursor back to the start of the record.
	fseek($handle,$pos,SEEK_SET);

	return $buff;
}
The records are made with:

Code:

$logentry = date("d,m,y,") . $ipaddress . "\n";
$logentry = sprintf("%42s", $logentry);
fputs($log, $logentry);
shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Oops

Post by shaitand »

Sorry, I just realized I copied and pasted from the wrong file. That function has a showstopper bug in it.

Code:

#Get the previous fixed-width record, stepping backward through the file.
function getLastRec($handle,$recsize){
	if(ftell($handle) === FALSE){
		return FALSE;
	}
	$pos = ftell($handle);
	if($pos == 0){
		#Already at the first record: reset the cursor to the end and stop.
		fseek($handle,0,SEEK_END);
		return FALSE;
	}
	#Step back one full record.
	fseek($handle,-$recsize,SEEK_CUR);
	#Save position
	$pos = ftell($handle);

	#Read the record.
	$buff = fgets($handle);

	#Move the cursor back to the start of the record just read,
	#so the next call steps back another record.
	fseek($handle,$pos,SEEK_SET);

	return $buff;
}
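To illustrate the fixed-width idea on its own: once every record is padded to the same size, record N starts at byte N times the record size, so fseek() can jump straight to it or walk backward from the end. A self-contained sketch (filename and IP addresses are made up):

```php
<?php
// Fixed-width records: each entry is padded to exactly $recsize bytes,
// so record N starts at byte N * $recsize. The filename and IP
// addresses are made up for the demo.
$recsize = 42;
$log = fopen('demo.log', 'w+');
foreach (array('198.51.100.1', '198.51.100.2', '203.0.113.9') as $ip) {
    fputs($log, sprintf("%{$recsize}s", date('d,m,y,') . $ip . "\n"));
}

// Walk backward from the end: the newest record comes out first, and
// the older part of the file is never touched unless we keep stepping.
$records = array();
for ($pos = ftell($log) - $recsize; $pos >= 0; $pos -= $recsize) {
    fseek($log, $pos, SEEK_SET);
    $records[] = trim(fgets($log));
}
fclose($log);
// $records[0] now holds the newest entry.
```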
superdezign
DevNet Master
Posts: 4135
Joined: Sat Jan 20, 2007 11:06 pm

Post by superdezign »

And out of curiosity, is that faster than getting the file contents, appending to it, and rewriting the file?
shaitand
Forum Newbie
Posts: 10
Joined: Sun Jul 29, 2007 9:54 am

Post by shaitand »

superdezign wrote:And out of curiosity, is that faster than getting the file contents, appending to it, and rewriting the file?
Drastically. The bigger the file, the more drastic the difference.
nathanr
Forum Contributor
Posts: 200
Joined: Wed Jun 07, 2006 5:46 pm

Post by nathanr »

Doesn't seem too hard.. let's say you wanted to get the last 24 KB of the file..

filesize() the file
subtract 24 KB from the file size to get the $startpoint
fopen() the file
fseek() to the $startpoint
fgets() or fread() the rest of the file
explode() or preg_split() the result on \n
trim off the first element, as it's probably only half a line
there you go.. a tail :)
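That recipe, sketched end to end; a throwaway log is generated first so the snippet runs on its own, and the chunk is shrunk from the suggested 24 KB so the demo file stays small:

```php
<?php
// Sketch of the "tail" recipe: seek near the end and read only the last
// chunk. The demo log and the 1 KB chunk size are stand-ins; 24 KB was
// the suggestion for a real log.
$file = 'tail_demo.log';
file_put_contents($file, implode("\n", range(1, 2000)) . "\n");

$chunk = 1024;
$start = max(0, filesize($file) - $chunk); // back up one chunk from the end

$fp = fopen($file, 'r');
fseek($fp, $start, SEEK_SET);
$tail = fread($fp, $chunk);
fclose($fp);

$lines = explode("\n", $tail);
if ($start > 0) {
    array_shift($lines);                   // first piece is probably half a line
}
// $lines now holds the newest complete lines in the file.
```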
nathanr
Forum Contributor
Posts: 200
Joined: Wed Jun 07, 2006 5:46 pm

Post by nathanr »

Apologies, I've had this window open for hours while I was at the shop. I meant to post earlier and forgot to refresh to see the conversation so far before posting.. looks like you got to the same decision as I did!