Setting Limits on Download Bandwidth
Posted: Mon May 17, 2004 1:50 pm
by harleyh
I have an area on my website that contains over 1 GB of medical research data. I want to allow anyone to view the material, but stop anyone who tries to grab too much in a certain time period.
Question is, for any given IP how can I tell how much data they are downloading?
Performance-wise, does it make any difference if I put a session cookie on them or simply update a database whenever they download data?
I can handle everything here except knowing how much data I am sending them.
Any help on what functions I should be looking at?
THX
Posted: Mon May 17, 2004 2:18 pm
by leenoble_uk
A session would be easier and sufficient I feel.
You would only really need to store the total amount downloaded so far.
$_SESSION['bytes'] = 0;
Then when they click a download link (which would have to be a PHP script), it would get the size of the requested file in bytes and add it to the session variable.
It would take the filename or an ID number as an argument:
mydownloadscript.php?file=kidneys.pdf
Then it would update the session variable and output a header to redirect to the file
header ("Location: files/kidneys.pdf");
This of course wouldn't take account of whether or not they'd completed each download.
This could easily be circumvented by quitting and relaunching the browser, though most people won't realise that, so you'd probably get away with it.
You could set up a db to store the IP addresses and record the time they last downloaded something in the database if you really wanted to prevent this though.
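To make the above concrete, here's a minimal sketch of what such a mydownloadscript.php might look like. The 100 MB quota, the files/ directory and the over_quota() helper are all made up for illustration — adjust to taste:

```php
<?php
// Hypothetical mydownloadscript.php — quota value and files/ directory are assumptions.

// Pure helper so the quota check is easy to test in isolation.
function over_quota($soFar, $size, $quota) {
    return ($soFar + $size) > $quota;
}

function serve_download($raw) {
    $quota = 100 * 1024 * 1024; // assumed limit: 100 MB per session

    // basename() stops directory-traversal tricks like ?file=../../etc/passwd
    $file = basename($raw);
    $path = 'files/' . $file;

    if ($file === '' || !is_file($path)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    if (!isset($_SESSION['bytes'])) {
        $_SESSION['bytes'] = 0;
    }

    $size = filesize($path);
    if (over_quota($_SESSION['bytes'], $size, $quota)) {
        header('HTTP/1.0 403 Forbidden');
        exit('Download limit reached for this session.');
    }

    $_SESSION['bytes'] += $size;
    header('Location: ' . $path);
}

if (isset($_GET['file'])) {
    session_start();
    serve_download($_GET['file']);
}
```

As noted, the redirect still exposes the real file path, so this only deters casual grabbing.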
Posted: Mon May 17, 2004 2:23 pm
by harleyh
WOW, that seems easy. I spent hours trying to figure out how to use fsockopen() or fpassthru() to accomplish this.
Thanks, I'll post results.
Posted: Mon May 17, 2004 2:45 pm
by feyd
Instead of using a redirect for the file's output, I'd use readfile() in combination with plenty of header() calls. That way it's much harder to circumvent the download script and go straight to the files.
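Something along these lines, perhaps — the exact header set and the files/ path are assumptions; the point is that the browser never sees the real file location:

```php
<?php
// Build the headers for a forced download — split out so it can be tested.
function download_headers($name, $size) {
    return array(
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . $name . '"',
        'Content-Length: ' . $size,
    );
}

if (isset($_GET['file'])) {
    session_start();
    $file = basename($_GET['file']);
    $path = 'files/' . $file; // assumed storage directory

    if ($file === '' || !is_file($path)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }

    foreach (download_headers($file, filesize($path)) as $h) {
        header($h);
    }
    readfile($path); // dump the file straight to the client
}
```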
Posted: Mon May 17, 2004 2:55 pm
by leenoble_uk
I started writing that but stopped because I wasn't sure whether the file output would be affected by the fact that a session was already started.
I defer to you on that one.
Posted: Mon May 17, 2004 2:58 pm
by feyd
I haven't had any problems with sessions and binary dumping of files...
Posted: Mon May 17, 2004 4:55 pm
by McGruff
leenoble_uk wrote:You could set up a db to store the IP addresses and record the time they last downloaded something in the database if you really wanted to prevent this though.
Unfortunately not: many people have dynamic IPs. AOL users' IPs can even change within a session.
You could require a log in however, in order to identify users.
Posted: Tue May 18, 2004 9:53 am
by dave420
I'd not use fpassthru(), but fopen(), fread() and fclose(), as then you can actually count how much data the user has taken.
When they start to download the file, set a session var/cookie with a unique ID, which you can read later. Take that ID, and store it somewhere with the total amount of data downloaded. Each time they download a file, increase the total by how much they download. If you rely on sessions to store the "amount downloaded" variable, you won't be able to update it accurately if a download fails, for example.
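A sketch of that counting loop — the 8 KB chunk size and the record_bytes() storage hook are placeholders for whatever you actually use:

```php
<?php
// Stream a file in chunks, echoing each one and returning how many bytes
// were actually read — so an aborted download only counts what was sent.
function stream_and_count($fp, $chunk = 8192) {
    $sent = 0;
    while (!feof($fp)) {
        $data = fread($fp, $chunk);
        if ($data === false || $data === '') {
            break;
        }
        echo $data;
        flush();
        $sent += strlen($data);
    }
    return $sent;
}

// Placeholder: persist the byte count against the user's unique ID,
// e.g. UPDATE downloads SET total = total + bytes WHERE id = ...
function record_bytes($id, $bytes) {
}

if (isset($_GET['file'])) {
    session_start();
    $path = 'files/' . basename($_GET['file']); // assumed directory
    if (is_file($path)) {
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($path));
        $fp = fopen($path, 'rb');
        record_bytes(session_id(), stream_and_count($fp));
        fclose($fp);
    }
}
```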
If you want to actually limit the speed they download at, that's easy to do as well (although not too accurately on Windows servers). Decide on the max speed you want (say, 20KB/s). Each time the user downloads a chunk of 20KB, see how long it took. If under a second, add the remainder to a running total. Each time the total is over a second, decrease the total by one and sleep a second. It's not true throttling, but it will quickly average out to the max speed.